Science.gov

Sample records for accident analysis methodology

  1. Linguistic methodology for the analysis of aviation accidents

    NASA Technical Reports Server (NTRS)

    Goguen, J. A.; Linde, C.

    1983-01-01

    A linguistic method for the analysis of small group discourse was developed, and its use on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or other linguistic concepts such as speech act and topic; it tests hypotheses that support the significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.

  2. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Gregg L. Sharp; R. T. McCracken

    2003-06-01

    The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants,” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  3. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Sharp, G.L.; McCracken, R.T.

    2003-05-13

    The regulatory requirement to develop an upgraded safety basis for a DOE Nuclear Facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, ''Safety Basis Requirements,'' requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, ''Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants'' as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  4. Accident characterization methodology

    SciTech Connect

    Camp, A.L.; Harper, F.T.

    1986-01-01

    The Nuclear Regulatory Commission (NRC) is preparing NUREG-1150 to examine the risk from a selected group of nuclear power plants. NUREG-1150 will provide technical bases for comparison of NRC research to industry results and resolution of numerous severe accident issues. In support of NUREG-1150, Sandia National Laboratories has directed the production of Level 3 Probabilistic Risk Assessments (PRAs) for the Surry, Sequoyah, Peach Bottom, and Grand Gulf nuclear power plants. The Accident Sequence Evaluation Program (ASEP) at Sandia has been responsible for the Level 1 portion of the analyses, which includes estimation of core damage frequency and characterization of the dominant sequences. The ASEP analyses are being documented in NUREG/CR-4550. The purpose of this paper is to briefly describe and evaluate the methodology utilized in these analyses. The methodology will soon be published in more detail as Reference 5. The results produced for NUREG/CR-4550 using this methodology are summarized in another paper to be presented at this conference.
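
    For orientation, the core damage frequency produced by a Level 1 analysis of this kind is conventionally the sum of minimal-cut-set contributions under the rare-event approximation. The Python sketch below illustrates only that arithmetic; the initiators, basic events, and numbers are invented and are not NUREG/CR-4550 results.

      # Minimal cut-set quantification of core damage frequency (CDF) under
      # the rare-event approximation: CDF ~= sum over cut sets of
      # (initiator frequency x product of basic-event probabilities).
      # All identifiers and values are illustrative placeholders.
      CUT_SETS = [
          ("loss of offsite power", 0.1, ["DG-A fails", "DG-B fails"]),
          ("small LOCA", 1e-3, ["HPI pump fails"]),
      ]
      P_EVENT = {"DG-A fails": 0.02, "DG-B fails": 0.02, "HPI pump fails": 5e-3}

      cdf = 0.0
      for initiator, freq, events in CUT_SETS:
          term = freq
          for event in events:
              term *= P_EVENT[event]  # independence assumed for the sketch
          print(f"{initiator}: {term:.1e}/yr")
          cdf += term
      print(f"CDF (rare-event approximation): {cdf:.1e}/yr")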

  5. Risk Estimation Methodology for Launch Accidents.

    SciTech Connect

    Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

    2014-02-01

    As compact, lightweight power sources with long, reliable lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Because modeling the full RPS response deterministically over the dynamic accident variables is impractical, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and in human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.
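
    The weighted-sum risk estimate described above can be sketched as a small Monte Carlo loop in Python. Every number below (scenario probabilities, release distributions, dose and risk factors) is a hypothetical placeholder, not a value from the report.

      import random

      # Hypothetical accident scenarios: probability per launch, plus a sampler
      # for the release fraction given that the accident occurs (placeholders).
      SCENARIOS = {
          "on-pad explosion":    (1e-3, lambda: random.betavariate(2, 50)),
          "early-flight impact": (5e-4, lambda: random.betavariate(2, 20)),
          "reentry breakup":     (1e-4, lambda: random.betavariate(1, 10)),
      }
      DOSE_PER_FRACTION = 2.0e4    # person-rem per unit release fraction (placeholder)
      RISK_PER_PERSON_REM = 6e-4   # health effects per person-rem (placeholder)

      def expected_health_risk(n_samples: int = 100_000) -> float:
          """Sum over scenarios of P(scenario) * E[consequence | scenario]."""
          total = 0.0
          for _name, (prob, sample_release) in SCENARIOS.items():
              mean_consequence = sum(
                  sample_release() * DOSE_PER_FRACTION * RISK_PER_PERSON_REM
                  for _ in range(n_samples)
              ) / n_samples
              total += prob * mean_consequence
          return total

      print(f"expected health risk: {expected_health_risk():.2e} effects/launch")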

  6. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  7. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1931-01-01

    The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types of accidents and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.

  8. Accident tolerant fuel analysis

    SciTech Connect

    Smith, Curtis; Chichester, Heather; Johns, Jesse; Teague, Melissa; Tonks, Michael; Youngblood, Robert (Idaho National Laboratory)

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced ''RISMC toolkit'' that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional ''accident-tolerant'' (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and evaluate margin recovery strategies.

  9. Accident Tolerant Fuel Analysis

    SciTech Connect

    Curtis Smith; Heather Chichester; Jesse Johns; Melissa Teague; Michael Tonks; Robert Youngblood

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and evaluate margin recovery strategies.

  10. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  11. Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group

    SciTech Connect

    Brereton, S.; Shinn, J.; Hesse, D.; Kaninich, D.; Lazaro, M.; Mubayi, V.

    1997-08-01

    The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.
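
    For the first source-term type, liquid spills and evaporation, screening analyses commonly use a mass-transfer relation of the form E = k_m A M P_v / (R T). The Python sketch below shows that generic form only; it is not the specific correlation set adopted by the working group, and the example inputs are placeholders.

      import math

      R = 8.314  # gas constant, J/(mol*K)

      def pool_evaporation_rate(k_m: float, area: float, mol_weight: float,
                                vapor_pressure: float, temp_k: float) -> float:
          """Screening estimate of evaporation rate (kg/s) from a liquid pool.

          E = k_m * A * M * P_v / (R * T), a standard mass-transfer form.
          k_m:            mass-transfer coefficient (m/s)
          area:           pool surface area (m^2)
          mol_weight:     molecular weight (kg/mol)
          vapor_pressure: vapor pressure at the pool temperature (Pa)
          temp_k:         pool temperature (K)
          """
          return k_m * area * mol_weight * vapor_pressure / (R * temp_k)

      # Illustrative numbers for a 10 m^2 acetone pool near 25 C (placeholders).
      print(pool_evaporation_rate(k_m=0.005, area=10.0, mol_weight=0.058,
                                  vapor_pressure=30_800, temp_k=298.15))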

  12. Reactor Safety Gap Evaluation of Accident Tolerant Components and Severe Accident Analysis

    SciTech Connect

    Farmer, Mitchell T.; Bunt, R.; Corradini, M.; Ellison, Paul B.; Francis, M.; Gabor, John D.; Gauntt, R.; Henry, C.; Linthicum, R.; Luangdilok, W.; Lutz, R.; Paik, C.; Plys, M.; Rabiti, Cristian; Rempe, J.; Robb, K.; Wachowiak, R.

    2015-01-31

    The overall objective of this study was to conduct a technology gap evaluation on accident tolerant components and severe accident analysis methodologies with the goal of identifying any data and/or knowledge gaps that may exist, given the current state of light water reactor (LWR) severe accident research, and additionally augmented by insights obtained from the Fukushima accident. The ultimate benefit of this activity is that the results can be used to refine the Department of Energy’s (DOE) Reactor Safety Technology (RST) research and development (R&D) program plan to address key knowledge gaps in severe accident phenomena and analyses that affect reactor safety and that are not currently being addressed by the industry or the Nuclear Regulatory Commission (NRC).

  13. Accident progression event tree analysis for postulated severe accidents at N Reactor

    SciTech Connect

    Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M.; Medford, G.T.

    1990-06-01

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
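
    Latin Hypercube sampling, used above for the uncertainty assessment, stratifies each uncertain input into equal-probability bins so that even a small sample spans each distribution. A minimal generic Python implementation (not the NUREG-1150 tooling), with a toy response function standing in for the accident progression model:

      import random

      def latin_hypercube(n_samples: int, n_vars: int) -> list[list[float]]:
          """n_samples points in [0,1]^n_vars, one sample per equal-probability
          stratum along every dimension."""
          columns = []
          for _ in range(n_vars):
              # One random point inside each of n_samples equal-width strata...
              col = [(i + random.random()) / n_samples for i in range(n_samples)]
              random.shuffle(col)  # ...paired randomly across dimensions.
              columns.append(col)
          return [list(point) for point in zip(*columns)]

      # Propagate through a toy response: peak confinement pressure as a
      # function of two uncertain phenomenological inputs (illustrative only).
      samples = latin_hypercube(100, 2)
      responses = [50 + 40 * u + 25 * v**2 for u, v in samples]
      print(min(responses), max(responses))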

  14. ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.

    PubMed

    Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

    2006-03-31

    In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper aims at presenting the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called a "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility. PMID:16126337
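
    The risk matrix used in MIRAS ranks scenarios by crossing a frequency class with a consequence class. The Python sketch below shows only the lookup mechanics; the class boundaries and risk levels are invented, not the ARAMIS calibration.

      # Schematic risk matrix: rows are frequency classes (per year, high to
      # low), columns are consequence classes (low to high). Cell values are
      # risk levels used to select reference accident scenarios. All
      # boundaries and levels here are illustrative placeholders.
      FREQ_CLASSES = [1e-2, 1e-4, 1e-6, 1e-8]  # lower bounds, descending
      CONSEQ_CLASSES = ["minor", "serious", "major", "catastrophic"]
      MATRIX = [
          ["medium", "high",   "high",   "high"],
          ["low",    "medium", "high",   "high"],
          ["low",    "low",    "medium", "high"],
          ["low",    "low",    "low",    "medium"],
      ]

      def risk_level(frequency: float, consequence: str) -> str:
          row = next((i for i, f in enumerate(FREQ_CLASSES) if frequency >= f),
                     len(FREQ_CLASSES) - 1)
          return MATRIX[row][CONSEQ_CLASSES.index(consequence)]

      print(risk_level(3e-4, "major"))  # -> "high": candidate reference scenario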

  15. A methodology for the transfer of probabilities between accident severity categories

    SciTech Connect

    Whitlow, J. D.; Neuhauser, K. S.

    1991-01-01

    A methodology has been developed which allows the accident probabilities associated with one accident-severity category scheme to be transferred to another severity category scheme. The methodology requires that the schemes use a common set of parameters to define the categories. The transfer of accident probabilities is based on the relationships between probability of occurrence and each of the parameters used to define the categories. Because of the lack of historical data describing accident environments in engineering terms, these relationships may be difficult to obtain directly for some parameters. Numerical models or expert judgment are often needed to obtain the relationships. These relationships, even if they are not exact, allow the accident probability associated with any severity category to be distributed within that category in a manner consistent with accident experience, which in turn will allow the accident probability to be appropriately transferred to a different category scheme.
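
    The transfer can be pictured in two steps: spread each source category's probability over a fine grid of the common parameter according to an assumed probability-versus-severity relationship, then re-aggregate the grid into the target scheme's bins. In the Python sketch below the common parameter (impact speed), the exponential relationship, and the bin edges are all invented for illustration.

      import math

      def rel_weight(speed: float) -> float:
          """Assumed relative accident likelihood vs. speed (illustrative)."""
          return math.exp(-speed / 40.0)

      def transfer(p_source: dict, target_edges: list) -> list:
          """Redistribute per-category probabilities onto target bins."""
          grid = 1.0  # km/h resolution of the common-parameter grid
          target = [0.0] * (len(target_edges) - 1)
          for (lo, hi), prob in p_source.items():
              speeds = [lo + grid * i for i in range(int((hi - lo) / grid))]
              norm = sum(rel_weight(s) for s in speeds)
              for s in speeds:
                  share = prob * rel_weight(s) / norm  # spread within category
                  for j in range(len(target)):
                      if target_edges[j] <= s < target_edges[j + 1]:
                          target[j] += share
          return target

      # Source scheme: two speed categories; target scheme: three finer bins.
      p_src = {(0.0, 60.0): 0.9, (60.0, 120.0): 0.1}
      print(transfer(p_src, [0.0, 30.0, 60.0, 120.0]))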

  16. Aircraft Loss-of-Control Accident Analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Foster, John V.

    2010-01-01

    Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

  17. A systems approach to food accident analysis

    E-print Network

    Helferich, John D

    2011-01-01

    Foodborne illnesses lead to 3000 deaths per year in the United States. Some industries, such as aviation, have made great strides in increasing safety through careful accident analysis leading to changes in industry practices. ...

  18. An analysis of aircraft accidents involving fires

    NASA Technical Reports Server (NTRS)

    Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

    1975-01-01

    All U. S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

  19. A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Apostolakis, G.; and others

    1998-04-01

    This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs.
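
    Fragility data of the kind compiled for such seismic-precursor analyses are conventionally expressed as lognormal fragility curves, P(failure | a) = Phi(ln(a / A_m) / beta). A small Python sketch with made-up parameters (not values from the report's data base):

      from math import log, sqrt, erf

      def phi(x: float) -> float:
          """Standard normal CDF."""
          return 0.5 * (1.0 + erf(x / sqrt(2.0)))

      def fragility(pga: float, a_median: float, beta: float) -> float:
          """Lognormal fragility: P(component fails | peak ground accel.).

          a_median: median capacity (g); beta: composite log-standard
          deviation. The parameter values below are illustrative only.
          """
          return phi(log(pga / a_median) / beta)

      for pga in (0.1, 0.3, 0.5, 0.9):
          print(f"PGA {pga:.1f} g -> P(fail) = {fragility(pga, 0.6, 0.4):.3f}")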

  20. Analyzing the uncertainty of simulation results in accident reconstruction with Response Surface Methodology.

    PubMed

    Zou, Tiefang; Cai, Ming; Du, Ronghua; Liu, Jike

    2012-03-10

    This paper is focused on the uncertainty of simulation results in accident reconstruction. The Upper and Lower Bound Method (ULM) and the Finite Difference Method (FDM), which can be easily applied in this field, are introduced first; the Response Surface Methodology (RSM) is then introduced into this field as an alternative. In RSM, a sample set is first generated via uniform design; second, experiments are conducted according to the sample set with the help of simulation methods; third, a response surface model is determined through regression analysis; finally, the uncertainty of simulation results can be analyzed using a combination of the response surface model and existing uncertainty analysis methods. It is later discussed in detail how to generate a sample set, how to calculate the range of simulation results and how to analyze the parameter sensitivity in RSM. Finally, the feasibility of RSM is validated by five cases, and the applicability of RSM, ULM and FDM in analyzing the uncertainty of simulation results is studied; in the latter two cases, ULM and FDM can hardly work while RSM still can. After an analysis of these five cases and of the number of simulation runs required by each method, the advantages and disadvantages of these uncertainty analysis methods are indicated. PMID:21908115
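
    A bare-bones version of that RSM workflow in Python: sample the inputs, run the simulation (here a stand-in braking-distance formula), fit a quadratic response surface by least squares, and read the output range off the cheap fitted surface. Plain uniform random sampling stands in for the paper's uniform design; nothing below reproduces the paper's cases.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulation(v0, mu):
          """Stand-in for an accident-reconstruction run: braking distance (m)
          from initial speed v0 (m/s) and friction coefficient mu."""
          return v0**2 / (2 * 9.81 * mu)

      # 1. Sample set over the uncertain inputs.
      v0 = rng.uniform(15.0, 25.0, 50)
      mu = rng.uniform(0.5, 0.9, 50)
      y = simulation(v0, mu)

      # 2. Fit a quadratic response surface by least squares.
      X = np.column_stack([np.ones_like(v0), v0, mu, v0**2, mu**2, v0 * mu])
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)

      # 3. Use the cheap surface to bound the result over the input ranges.
      vg, mg = np.meshgrid(np.linspace(15, 25, 100), np.linspace(0.5, 0.9, 100))
      G = np.column_stack([np.ones(vg.size), vg.ravel(), mg.ravel(),
                           vg.ravel()**2, mg.ravel()**2, (vg * mg).ravel()])
      pred = G @ coef
      print(f"braking distance range: {pred.min():.1f} .. {pred.max():.1f} m")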

  1. Anthropotechnological analysis of industrial accidents in Brazil.

    PubMed Central

    Binder, M. C.; de Almeida, I. M.; Monteau, M.

    1999-01-01

    The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worthwhile if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

  2. Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods

    SciTech Connect

    J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

    2000-07-31

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  3. Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods

    SciTech Connect

    Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

    2000-08-01

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  4. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple methods to complex network-based simulations, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  5. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.; Morrisete, J. A.

    1982-01-01

    The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in the trends and cause-effect relationships reported in the earlier study. Accident counts were tied to measures of activity to produce accident rates, which in turn were analyzed in terms of change. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of the pilots involved. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions and their unique attributes delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

  6. Empirical Analysis of DoS Attacks in the Bitcoin Ecosystem

    E-print Network

    Thornton, Mitchell

    Presentation by Marie Vasek, Micah Thornton, and Tyler Moore (Computer Science & Engineering, Southern Methodist University, USA) at the 1st Workshop on Bitcoin Research, Barbados, March 7, 2014.

  7. Accident analysis for US fast burst reactors

    SciTech Connect

    Paternoster, R.; Flanders, M.; Kazi, H.

    1994-09-01

    In the US fast burst reactor (FBR) community there has been increasing emphasis on, and scrutiny of, safety analysis and the understanding of possible accident scenarios. This paper summarizes recent work in these areas at the different US FBR sites. At this time, all of the FBR facilities have updated, or are in the process of updating and refining, their accident analyses. This effort is driven by two objectives: to obtain a more realistic scenario for emergency response procedures and contingency plans, and to determine compliance with changing regulatory standards.

  8. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    Matching its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to hereafter as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade, including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  9. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  10. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.

    1999-09-09

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  11. Canister storage building design basis accident analysis documentation

    SciTech Connect

    KOPELIC, S.D.

    1999-02-25

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  12. Cross-analysis of hazmat road accidents using multiple databases.

    PubMed

    Trépanier, Martin; Leroux, Marie-Hélène; de Marcellis-Warin, Nathalie

    2009-11-01

    Road selection for hazardous materials transportation relies heavily on risk analysis. Since risk is generally expressed as the product of the probability of occurrence and the expected consequence, risk analysis is data intensive. However, various authors have noted the lack of statistical reliability of hazmat accident databases due to the systematic underreporting of such events. Also, official accident databases alone do not always provide all the information required (economic impact, road conditions, etc.). In this paper, we attempt to integrate many data sources to analyze hazmat accidents in the province of Quebec, Canada. Databases on dangerous goods accidents, road accidents and work accidents were cross-analyzed. Results show that accidents can rarely be matched across databases and that these databases suffer from underreporting. Police records seem to have better coverage than official records maintained by hazmat authorities. Serious accidents are missing from the government's official databases (some involving deaths or major spills) even though their declaration is mandatory. PMID:19819367
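
    Cross-matching of this kind typically keys on date, location, and substance, with some tolerance for reporting differences. A toy Python matching pass; the record fields and values are invented for illustration and do not reflect the study's actual databases.

      from datetime import date

      # Toy records from two databases (fields and values are invented).
      hazmat_db = [
          {"id": "H1", "date": date(2006, 5, 3), "municipality": "Laval",
           "substance": "UN1203"},
      ]
      police_db = [
          {"id": "P9", "date": date(2006, 5, 4), "municipality": "Laval",
           "substance": "UN1203"},
          {"id": "P10", "date": date(2006, 7, 1), "municipality": "Quebec",
           "substance": "UN1075"},
      ]

      def match(a, b, day_tolerance=2):
          """Same substance and municipality, dates within tolerance."""
          return (a["substance"] == b["substance"]
                  and a["municipality"] == b["municipality"]
                  and abs((a["date"] - b["date"]).days) <= day_tolerance)

      pairs = [(a["id"], b["id"]) for a in hazmat_db for b in police_db
               if match(a, b)]
      unmatched = [b["id"] for b in police_db
                   if not any(match(a, b) for a in hazmat_db)]
      print(pairs)      # [('H1', 'P9')]
      print(unmatched)  # ['P10'] -- possible underreporting in the hazmat DB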

  13. Comparing the Identification of Recommendations by Different Accident Investigators Using a Common Methodology

    NASA Technical Reports Server (NTRS)

    Johnson, Chris W.; Oltedal, H. A.; Holloway, C. M.

    2012-01-01

    Accident reports play a key role in the safety of complex systems. These reports present the recommendations that are intended to help avoid any recurrence of past failures. However, the value of these findings depends upon the causal analysis that helps to identify the reasons why an accident occurred. Various techniques have been developed to help investigators distinguish root causes from contributory factors and contextual information. This paper presents the results from a study into the individual differences that can arise when a group of investigators independently apply the same technique to identify the causes of an accident. This work is important if we are to increase the consistency and coherence of investigations following major accidents.

  14. Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation

    SciTech Connect

    PIEPHO, M.G.

    1999-10-20

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, ''Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR).'' All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

  15. PERSPECTIVES ON DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    K.; Lowrie, Jonathan; Thoman, David; Keller, Austin

    2008-07-30

    Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.
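
    The screening dose model behind these inputs is multiplicative: dose = (respirable release) x (atmospheric dilution, chi/Q) x (breathing rate) x (dose conversion factor). The Python sketch below uses placeholder values, not DOE defaults, to show why each input basis matters: the result scales linearly in every factor.

      def inhalation_dose_rem(release_ci: float, chi_over_q: float,
                              breathing_rate: float, dcf_rem_per_ci: float) -> float:
          """Generic screening inhalation dose at a receptor.

          release_ci:      respirable activity released (Ci)
          chi_over_q:      atmospheric dilution factor (s/m^3)
          breathing_rate:  receptor breathing rate (m^3/s)
          dcf_rem_per_ci:  committed dose conversion factor (rem/Ci inhaled)
          All example values below are placeholders, not DOE defaults.
          """
          return release_ci * chi_over_q * breathing_rate * dcf_rem_per_ci

      base = inhalation_dose_rem(1.0, 3.5e-3, 3.3e-4, 5.1e5)
      # Halving the breathing-rate assumption halves the dose: each input
      # enters linearly, which is why the technical bases matter so much.
      low_br = inhalation_dose_rem(1.0, 3.5e-3, 1.65e-4, 5.1e5)
      print(f"{base:.2f} rem vs {low_br:.2f} rem")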

  16. Categorizing accident sequences in the external radiotherapy for risk analysis

    PubMed Central

    2013-01-01

    Purpose: This study identifies accident sequences from past accidents in order to help apply risk analysis to external radiotherapy. Materials and Methods: This study reviews 59 accidental cases from two retrospective safety analyses that collected incidents in external radiotherapy extensively. The two accident analysis reports of accumulated past incidents are investigated to identify accident sequences, including initiating events, failure of safety measures, and consequences. This study classifies the accidents by treatment stage and source of error for initiating events, by type of failure in the safety measures, and by type of undesirable consequence and the number of affected patients. The accident sequences are then grouped into categories on the basis of similarity of progression; on this basis, the cases can be categorized into 14 groups of accident sequences. Results: The result indicates that risk analysis needs to pay attention not only to the planning stage, but also to the calibration stage performed prior to the main treatment process. It also shows that human error is the largest contributor to initiating events as well as to the failure of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in the calibration stage. Conclusion: This study is expected to provide insights into accident sequences for prospective risk analysis through the review of experience. PMID:23865005
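
    An event tree of the kind illustrated in the paper multiplies an initiating-event frequency by branch probabilities for each safety measure succeeding or failing. A minimal Python quantification; the barriers and all numbers are invented, not taken from the study's 59 cases.

      from itertools import product

      INIT_FREQ = 0.5  # miscalibration initiating-event frequency per year (illustrative)

      # Safety measures that can catch the error, with failure probabilities
      # (invented; human checks dominate, as the review above observes).
      BARRIERS = {
          "independent output check": 0.05,
          "in-vivo dosimetry":        0.10,
          "weekly chart review":      0.20,
      }

      # Enumerate all branch combinations; the accident sequence is the path
      # on which every barrier fails.
      for outcome in product([False, True], repeat=len(BARRIERS)):
          prob = INIT_FREQ
          for fails, p_fail in zip(outcome, BARRIERS.values()):
              prob *= p_fail if fails else (1 - p_fail)
          if all(outcome):
              print(f"undetected-misdose sequence frequency: {prob:.2e}/yr")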

  17. Hazmat transport: a methodological framework for the risk analysis of marshalling yards.

    PubMed

    Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

    2007-08-17

    A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case-study. The results showed that "non-accident-induced" leaks in marshalling yards represent an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable contribution of these fixed installations to the overall risk associated with "hazmat" transportation. PMID:17418942

  18. TMI-2 accident: core heat-up analysis

    SciTech Connect

    Ardron, K.H.; Cain, D.G.

    1981-01-01

    This report summarizes the NSAC study of reactor core thermal conditions during the accident at Three Mile Island, Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events, plant data, post-accident measurements, and interpretation or indirect use of instrument responses to accident conditions.

  19. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1937-01-01

    This report is a revision of NACA-TR-357. It was prepared by the Committee on Aircraft Accidents. The purpose of this report is to provide a basis for the classification and comparison of aircraft accidents, both civil and military.

  20. Development of Database for Accident Analysis in Indian Mines

    NASA Astrophysics Data System (ADS)

    Tripathy, Debi Prasad; Guru Raghavendra Reddy, K.

    2015-08-01

    Mining is a hazardous industry, and the high accident rates associated with underground mining are a cause of deep concern. Technological developments notwithstanding, rates of fatal accidents and reportable incidents have not shown corresponding declines. This paper argues that adoption of appropriate safety standards by both mine management and the government may result in appreciable reductions in accident frequency. This can be achieved by using technology to improve working conditions and by sensitising workers and managers to the causes and prevention of accidents. Inputs required for a detailed analysis of an accident include information on location, time, type, cost of accident, victim, nature of injury, personal and environmental factors, etc. Such information can be generated from data available in the standard coded accident report form. This paper presents a web-based application for accident analysis in Indian mines during 2001-2013. An accident database prototype (SafeStat), developed by the authors on an intranet using the TCP/IP protocol, is also discussed.

  1. NASA's Accident Precursor Analysis Process and the International Space Station

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Lutomski, Michael

    2010-01-01

    This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

  2. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human-error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable; cluster analysis, an exploratory research tool leading to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  3. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  4. Scenario analysis of freight vehicle accident risks in Taiwan.

    PubMed

    Tsai, Ming-Chih; Su, Chien-Chih

    2004-07-01

    This study develops a quantitative risk model utilizing the Generalized Linear Interactive Model (GLIM) to analyze major freight vehicle accidents in Taiwan. Eight scenarios are established by interacting three categorical variables (driver age, vehicle type, and road type), each of which contains two levels. The database, consisting of 2043 major accidents occurring between 1994 and 1998 in Taiwan, is utilized to fit and calibrate the model parameters. The empirical results indicate that accident rates of freight vehicles in Taiwan were high in the scenarios involving trucks and non-freeway systems, while accident consequences were severe in the scenarios involving mature drivers or non-freeway systems. Empirical evidence also shows that there is no significant relationship between accident rates and accident consequences. This is to stress that safety studies that describe risk merely as accident rates, rather than as the combination of accident rates and consequences, might by definition lead to biased risk perceptions. Finally, the study recommends using the number of vehicles as an alternative measure of traffic exposure in commercial vehicle risk analysis. The merits of this measure are that it is simple and thus reliable; meanwhile, the resulting risk, expressed as fatalities per vehicle, could provide clear and direct policy implications for insurance practices and safety regulations. PMID:15094423
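
    The study's central point, that risk must combine accident rates with consequences, reduces to a rate-times-severity product per scenario. The Python sketch below uses invented counts for two hypothetical scenarios; a full GLIM fit would add the categorical interaction terms.

      # Invented scenario data: (accidents, vehicle-years of exposure,
      # fatalities per accident) for two hypothetical scenarios.
      scenarios = {
          "truck / non-freeway / mature driver":       (120, 40_000, 0.9),
          "tractor-trailer / freeway / young driver":  (150, 30_000, 0.3),
      }

      for name, (accidents, exposure, severity) in scenarios.items():
          rate = accidents / exposure   # accidents per vehicle-year
          risk = rate * severity        # fatalities per vehicle-year
          print(f"{name}: rate={rate:.4f}, risk={risk:.5f}")

      # The second scenario has the higher accident rate (0.0050 vs 0.0030)
      # but the lower risk (0.00150 vs 0.00270): ranking by rate alone
      # would mislead, which is the study's point.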

  5. An analysis of pileup accidents in highway systems

    NASA Astrophysics Data System (ADS)

    Chang, Jau-Yang; Lai, Wun-Cing

    2016-02-01

    A pileup accident is a multi-vehicle collision occurring in a lane and produced by successive following vehicles. It is a special type of collision on highways. The probability of occurrence of a pileup accident is lower than that of other accidents in highway systems; however, the injuries and damage a pileup causes are often serious. In this paper, we analyze the occurrence of pileup accidents by considering three types of dangerous collisions in highway systems: rear-end collisions, lane-changing collisions, and double lane-changing collisions. We simulate four road driving strategies to investigate the relationships between the different vehicle collisions and pileup accidents. The simulation and analysis show that double lane-changing collisions increase the occurrence of pileup accidents. Additionally, we find that the probability of occurrence of pileup accidents can be reduced when vehicle speeds are suitably constrained in highway systems.

  6. ADAM: An Accident Diagnostic, Analysis and Management System - Applications to Severe Accident Simulation and Management

    SciTech Connect

    Zavisca, M.J.; Khatib-Rahbar, M.; Esmaili, H.; Schulz, R.

    2002-07-01

    The Accident Diagnostic, Analysis and Management (ADAM) computer code has been developed as a tool for on-line applications to accident diagnostics, simulation, management and training. ADAM's severe accident simulation capabilities incorporate a balance of mechanistic, phenomenologically based models with simple parametric approaches for elements including (but not limited to) thermal hydraulics; heat transfer; fuel heatup, meltdown, and relocation; fission product release and transport; combustible gas generation and combustion; and core-concrete interaction. The overall model is defined by a relatively coarse spatial nodalization of the reactor coolant and containment systems and is advanced explicitly in time. The result is to enable much faster than real time (i.e., 100 to 1000 times faster than real time on a personal computer) applications to on-line investigations and/or accident management training. Other features of the simulation module include provision for activation of water injection, including the Engineered Safety Features, as well as other mechanisms for the assessment of accident management and recovery strategies and the evaluation of PSA success criteria. The accident diagnostics module of ADAM uses on-line access to selected plant parameters (as measured by plant sensors) to compute the thermodynamic state of the plant, and to predict various margins to safety (e.g., times to pressure vessel saturation and steam generator dryout). Rule-based logic is employed to classify the measured data as belonging to one of a number of likely scenarios based on symptoms, and a number of 'alarms' are generated to signal the state of the reactor and containment. This paper will address the features and limitations of ADAM with particular focus on accident simulation and management. (authors)
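
    The faster-than-real-time performance rests on a coarse nodalization advanced explicitly in time, so each step is cheap. The Python sketch below conveys only the flavor of such a scheme, reduced to a single lumped node losing decay heat to a sink; all parameters are invented and are not ADAM's models.

      # One lumped 'control volume' advanced with explicit (forward Euler)
      # steps: coarse nodalization plus explicit advancement means very cheap
      # per-step cost, which is what buys the 100-1000x real-time speed.
      MASS_CP = 4.0e8   # node heat capacity, J/K (illustrative)
      H_A = 5.0e4       # loss coefficient to the heat sink, W/K (illustrative)
      T_SINK = 320.0    # sink temperature, K

      def decay_heat(t: float) -> float:
          """Crude decay-heat curve, W (illustrative stand-in)."""
          return 2.0e7 * max(t, 1.0) ** -0.25

      t, temp, dt = 0.0, 560.0, 5.0   # time (s), node temperature (K), step (s)
      while t < 24 * 3600:
          q = decay_heat(t) - H_A * (temp - T_SINK)
          temp += q * dt / MASS_CP    # explicit update: T += q*dt/(m*cp)
          t += dt
      print(f"node temperature after 24 h: {temp:.1f} K")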

  7. Rat sperm motility analysis: methodologic considerations

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  8. Analysis of Credible Accidents for Argonaut Reactors

    SciTech Connect

    Hawley, S. C.; Kathern, R. L.; Robkin, M. A.

    1981-04-01

    Five areas of potential accidents have been evaluated for the Argonaut-UTR reactors. They are:
    • insertion of excess reactivity
    • catastrophic rearrangement of the core
    • explosive chemical reaction
    • graphite fire
    • fuel-handling accident
    A nuclear excursion resulting from the rapid insertion of the maximum available excess reactivity would produce only 12 MWs, which is insufficient to cause fuel melting even with conservative assumptions. Although a precise structural rearrangement of the core would create a potential hazard, it is simply not credible to assume that such an arrangement would result from the forces of an earthquake or other catastrophic event. Even damage to the fuel from falling debris or other objects is unlikely given the normal reactor structure. An explosion from a metal-water reaction could not occur because there is no credible source of sufficient energy to initiate the reaction. A graphite fire could conceivably create some damage to the reactor but not enough to melt any fuel or initiate a metal-water reaction. The only credible accident involving offsite doses was determined to be a fuel-handling accident which, given highly conservative assumptions, would produce a whole-body dose equivalent of 2 rem from noble gas immersion and a lifetime dose equivalent commitment to the thyroid of 43 rem from radioiodines.

  9. Analysis of tritium mission FMEF/FAA fuel handling accidents

    SciTech Connect

    Van Keuren, J.C.

    1997-11-18

    The Fuels Material Examination Facility/Fuel Assembly Area is proposed to be used for fabrication of mixed oxide fuel to support the Fast Flux Test Facility (FFTF) tritium/medical isotope mission. The plutonium isotope mix for the new mission is different from that analyzed in the FMEF safety analysis report. A reanalysis of three representative accidents was performed for the revised plutonium mix to determine the impact on the safety analysis. Current versions of the computer codes and meteorology data files were used for the analysis. The accidents reanalyzed were a criticality, an explosion in a glovebox, and a tornado. The analysis concluded that risk guidelines were met with the revised plutonium mix.

  10. When a Crash Is Really an Accident: A Concept Analysis.

    PubMed

    Knechel, Nancy

    2015-01-01

    The debate over using the word accident has encouraged some groups to adopt the word crash, while other groups retain accident. This article addresses the inconsistent and interchangeable use of the terms accident and crash. This concept analysis used a Critical Review Method, with Critical Theory as the theoretical framework. A literature search was conducted in MEDLINE and CINAHL for articles published through 2011. An extensive review of literature was followed by purposive sampling of articles published in 2011 across countries, disciplines, and contexts. Forty-seven articles were read in their entirety, resulting in 2 themes for accident: intent and injury. Seven articles were critically analyzed for intent, injury, and underrepresented margins of society (5 articles using the term accident, 1 article using crash and accident interchangeably, and 1 using only crash). There was congruency on injury across all 7 articles. Results were mixed for intent and the incorporation of marginalized people. Although there is evidence that the use of the word accident should be maintained when the event could not have reasonably been prevented, the theoretical framework highlights that this will likely perpetuate the conceptual confusion. The recommendation is to (1) identify the mechanism of injury, (2) identify the event as intentional versus nonintentional, and (3) identify the event as preventable versus nonpreventable. PMID:26574946

  11. OFFSITE RADIOLOGICAL CONSEQUENCE ANALYSIS FOR THE BOUNDING FLAMMABLE GAS ACCIDENT

    SciTech Connect

    KRIPPS, L.J.

    2005-02-18

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank (SST). The calculation applies reasonably conservative input parameters in accordance with guidance in DOE-STD-3009, Appendix A. The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent in order to identify and evaluate safety-class structures, systems, and components. A detonation rather than a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. A detonation in an SST rather than a double-shell tank (DST) was selected as the bounding accident because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.

  12. MELCOR accident analysis for ARIES-ACT

    E-print Network

    Fusion Safety Program: MELCOR is a code originally designed to model severe accidents. [The remainder of this record is extraction residue from a MELCOR nodalization diagram: control-volume (CV) and heat-structure (HS) labels for the blanket, upper and lower divertors, and vacuum vessel.]

  13. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.
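
    The automated data-mining step described above can be pictured as keyword-driven triage of free-text incident narratives, followed by manual peer review. A minimal sketch of such triage is shown below; the table layout, column names, and keyword lists are assumptions for illustration, not the actual PHMSA schema or the authors' classifier.

```python
import pandas as pd

# Hypothetical incident table; the real PHMSA schema differs.
incidents = pd.DataFrame({
    "id": [1, 2, 3],
    "narrative": [
        "Floodwaters undermined the pipeline at the river crossing.",
        "Corrosion leak discovered during routine inspection.",
        "Lightning strike ignited vapors at the pump station.",
    ],
})

# Map natural-hazard categories to trigger keywords (illustrative lists).
NATECH_KEYWORDS = {
    "flood": ["flood", "floodwater", "inundat"],
    "earthquake": ["earthquake", "seismic"],
    "lightning": ["lightning"],
    "landslide": ["landslide", "slope failure"],
}

def classify(narrative: str) -> list[str]:
    """Return the natural-hazard categories whose keywords appear in the text."""
    text = narrative.lower()
    return [hazard for hazard, keys in NATECH_KEYWORDS.items()
            if any(k in text for k in keys)]

incidents["natech_hazards"] = incidents["narrative"].map(classify)
print(incidents[["id", "natech_hazards"]])
```

    Records flagged by such a pass would then go to the peer-review step; the keyword match only narrows the candidate set.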

  14. Hanford Waste Tank Bump Accident and Consequence Analysis

    SciTech Connect

    BRATZEL, D.R.

    2000-06-20

    This report provides a new evaluation of the Hanford tank bump accident analysis and consequences for incorporation into the Authorization Basis. The analysis scope is for the safe storage of waste in its current configuration in single-shell and double-shell tanks.

  15. Accident Sequence Evaluation Program: Human reliability analysis procedure

    SciTech Connect

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the U.S. Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples) and more detailed definitions of some of the terms. 42 refs.
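
    The ASEP procedure derives from the THERP models of NUREG/CR-1278, in which the probability of a second human error is conditioned on a preceding error through discrete dependence levels. The sketch below implements those standard conditioning equations as a small, self-contained fragment; it is not the full ASEP procedure.

```python
# THERP dependence model (NUREG/CR-1278): conditional probability of a
# second human error, given failure on the immediately preceding task.
def conditional_hep(basic_hep: float, dependence: str) -> float:
    formulas = {
        "zero":     lambda p: p,                  # tasks fully independent
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,                # second failure is certain
    }
    return formulas[dependence](basic_hep)

p = 3.0e-3  # nominal human error probability for the second task
for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s}: {conditional_hep(p, level):.3f}")
```

    Note how even "low" dependence raises a 3E-3 error probability to about 0.05, which is why dependence modeling dominates many HRA results.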

  16. Foucault's Analysis of Power's Methodologies

    E-print Network

    Scott, Gary Alan

    against the King is both more subtle and more devastating than would be an overt decapitation of the old monarch. Before he developed his conception of power, Foucault had already problematized the meaning of sovereignty in his archaeological writings... that the shift in the locus of sovereignty in the Classical Age—from the monarch to the more amorphous authority found in law—must still be supervened by the subsequent analysis of power that dislodges power from such juridico-discursive systems. In Foucault...

  17. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    SciTech Connect

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.
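
    Idea (3) above, Monte Carlo propagation with an efficient sampling procedure, refers to the Latin hypercube sampling used throughout NUREG-1150. The sketch below propagates two uncertain inputs through a placeholder consequence model; the model and the input distributions are illustrative assumptions, not part of the NUREG-1150 analyses.

```python
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n_samples: int, n_vars: int) -> np.ndarray:
    """Stratified uniform(0,1) samples: one draw per equal-probability bin."""
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])  # decouple the strata orderings across variables
    return u

n = 1000
u = latin_hypercube(n, 2)
# Map the uniform samples onto illustrative input distributions.
release_fraction = 10.0 ** (-4.0 + 3.0 * u[:, 0])  # log-uniform on [1e-4, 1e-1]
decon_factor = 1.0 + 99.0 * u[:, 1]                # uniform on [1, 100]

# Placeholder consequence model standing in for the real analysis chain.
consequence = release_fraction / decon_factor

print(f"mean: {consequence.mean():.3e}")
print(f"95th percentile: {np.quantile(consequence, 0.95):.3e}")
```

    The stratification is what makes LHS "efficient": tail behavior is covered with far fewer samples than plain random sampling would need.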

  18. MELCOR accident analysis for ARIES-ACT

    SciTech Connect

    Paul W. Humrickhouse; Brad J. Merrill

    2012-08-01

    We model a loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with LiPb as coolant and breeder, a helium-cooled steel structural ring and tungsten divertors, a thin-walled, helium-cooled vacuum vessel, and a room-temperature water-cooled shield outside the vacuum vessel. The water heat transfer system is designed to remove heat by natural circulation during a LOFA. The MELCOR model uses time-dependent decay heats for each component determined by 1-D modeling. The MELCOR model shows that, despite periodic boiling of the water coolant, structures are kept adequately cool by the passive safety system.
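
    The time-dependent decay heats that drive a LOFA calculation can be approximated, for a rough feel of the magnitudes involved, by the classic Way-Wigner fit for fission decay power after shutdown. The coefficients below are the common textbook values; they stand in for, and do not reproduce, the component-wise 1-D activation results used in the paper (fusion structures have different activation inventories).

```python
def decay_power_fraction(t: float, t_op: float = 3.15e7) -> float:
    """Way-Wigner approximation: decay power / operating power at t seconds
    after shutdown, for an operating period of t_op seconds (~1 year here).
    Textbook fission fit, used only to illustrate the decay-heat time scale."""
    return 0.0622 * (t ** -0.2 - (t + t_op) ** -0.2)

for t in (1.0, 60.0, 3600.0, 86400.0):
    print(f"t = {t:8.0f} s: P/P0 = {decay_power_fraction(t):.4f}")
```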

  19. Accident analysis of the windowless target system

    SciTech Connect

    Bianchi, F.; Ferri, R.

    2006-07-01

    Transmutation systems are able to reduce the radio-toxicity and amount of High-Level Wastes (HLW), which are the main concerns related to the peaceful use of nuclear energy, and should therefore make nuclear energy more acceptable to the population. A transmutation system consists of a sub-critical fast reactor, an accelerator, and a Target System, where the spallation reactions needed to sustain the chain reaction take place. Three options were proposed for the Target System within the European project PDS-XADS (Preliminary Design Studies on an Experimental Accelerator Driven System): window, windowless, and solid. This paper describes the constraints taken into account in the design of the windowless Target System for the large Lead-Bismuth-Eutectic cooled XADS and presents the results of the calculations performed to assess the behaviour of the target during accident sequences related to pump trips. (authors)

  20. Impact of meltdown-accident modeling developments on PWR analysis

    SciTech Connect

    Haskin, F.E.; Shaffer, C.J.

    1982-01-01

    Models recently incorporated into a new version of the MARCH computer code have altered earlier estimates of some parameter responses during meltdown accidents of PWRs. These models include improved fuel-coolant heat transfer models, new core radiative heat transfer models, improved in-vessel flashing models, improved models for burning of combustible gases in containment, and CORCON models for core-concrete interactions. Studies performed with these models have been used to identify and, in some cases, bound meltdown accident analysis uncertainties.

  1. Core Disruptive Accident Analysis using ASTERIA-FBR

    NASA Astrophysics Data System (ADS)

    Ishizu, Tomoko; Endo, Hiroshi; Yamamoto, Toshihisa; Tatewaki, Isao

    2014-06-01

    JNES is developing a core disruptive accident analysis code, ASTERIA-FBR, which tightly couples the thermal-hydraulics and the neutronics to simulate the core behavior during core disruptive accidents of fast breeder reactors (FBRs). ASTERIA-FBR consists of the three-dimensional thermal-hydraulics calculation module CONCORD, the fuel pin behavior calculation module FEMAXI-FBR, and the space-time neutronics module Dynamic-GMVP or PARTISN/RKIN. This paper compares the characteristics of GMVP and PARTISN and summarizes the challenging issues in applying Dynamic-GMVP to the calculation of an unprotected loss-of-flow (ULOF) event, a typical initiator of core disruptive accidents in FBRs. The statistical error included in the calculation results may affect the prediction of super-prompt criticality during a ULOF event and thus the amount of released energy.
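
    The point kinetics approach that space-time modules like these improve upon reduces the neutronics to a small set of coupled ODEs for the neutron population and delayed-neutron precursors. A one-delayed-group sketch follows; the parameter values are generic illustrations, not FBR design data.

```python
from scipy.integrate import solve_ivp

# One-delayed-group point kinetics with illustrative generic parameters:
# delayed fraction, neutron generation time (s), precursor decay const (1/s).
beta, Lambda, lam = 0.0035, 4.0e-7, 0.08
rho = 0.5 * beta  # step reactivity insertion, below prompt critical

def rhs(t, y):
    n, c = y  # relative neutron density and precursor concentration
    dn = (rho - beta) / Lambda * n + lam * c
    dc = beta / Lambda * n - lam * c
    return [dn, dc]

y0 = [1.0, beta / (Lambda * lam)]  # precursors in equilibrium at n = 1
sol = solve_ivp(rhs, (0.0, 1.0), y0, method="LSODA", rtol=1e-8)
print(f"relative power after 1 s: {sol.y[0, -1]:.2f}")
```

    The space-time codes replace this single amplitude equation with a full spatial flux solution, which is what matters for heterogeneous cores and large reactivity insertions.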

  2. Three dimensional effects in analysis of PWR steam line break accident

    E-print Network

    Tsai, Chon-Kwo

    A steam line break accident is one of the possible severe abnormal transients in a pressurized water reactor. It is required to present an analysis of a steam line break accident in the Final Safety Analysis Report (FSAR) ...

  3. Cold Vacuum Drying facility design basis accident analysis documentation

    SciTech Connect

    CROWE, R.D.

    2000-08-08

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

  4. INDUSTRIAL/MILITARY ACTIVITY-INITIATED ACCIDENT SCREENING ANALYSIS

    SciTech Connect

    D.A. Kalinich

    1999-09-27

    Impacts due to nearby installations and operations were determined in the Preliminary MGDS Hazards Analysis (CRWMS M&O 1996) to be potentially applicable to the proposed repository at Yucca Mountain. This determination was conservatively based on limited knowledge of the potential activities ongoing on or off the Nevada Test Site (NTS). It is intended that the Industrial/Military Activity-Initiated Accident Screening Analysis provided herein will meet the requirements of the ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987) in establishing whether this external event can be screened from further consideration or must be included as a design basis event (DBE) in the development of accident scenarios for the Monitored Geologic Repository (MGR). This analysis only considers issues related to preclosure radiological safety. Issues important to waste isolation as related to impact from nearby installations will be covered in the MGR performance assessment.

  5. Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident

    NASA Technical Reports Server (NTRS)

    Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

    1994-01-01

    Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined the following: the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep; and the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as by the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; the captain also testified to feeling "lethargic and indifferent" just prior to the accident. The sleep/wake history data therefore support the hypothesis that fatigue was a factor that affected the crewmembers' performance, and the examples from the CVR and the captain's testimony support the hypothesis that fatigue had an impact on specific actions involved in the occurrence of the accident.

  6. Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.

    PubMed

    Doytchev, Doytchin E; Szwillus, Gerd

    2009-11-01

    Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence, we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the afore-mentioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation. PMID:19819365
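
    The FTA half of the combined approach comes down to propagating basic-event probabilities through AND/OR gates. The sketch below does this under the usual independence assumption; the tree fragment and the probabilities are hypothetical, not the hydro power plant study's model.

```python
# Minimal fault tree evaluation assuming independent basic events.
def and_gate(*probs: float) -> float:
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs: float) -> float:
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# Hypothetical fragment: basement flooding requires a leak AND drainage
# failure, where drainage fails if (pump fails OR power is lost) AND the
# alarm is missed by the operator.
p_leak, p_pump, p_power, p_alarm_missed = 1e-2, 5e-2, 1e-2, 1e-1
p_drain_fail = and_gate(or_gate(p_pump, p_power), p_alarm_missed)
p_flooding = and_gate(p_leak, p_drain_fail)
print(f"P(flooding) = {p_flooding:.2e}")
```

    The human-error probabilities feeding such a tree are exactly what the TA/HEIST/PSF side of the combined method is meant to supply.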

  7. Mass Spectrometry Methodology in Lipid Analysis

    PubMed Central

    Li, Lin; Han, Juanjuan; Wang, Zhenpeng; Liu, Jian’an; Wei, Jinchao; Xiong, Shaoxiang; Zhao, Zhenwen

    2014-01-01

    Lipidomics is an emerging field in which the structures, functions, and dynamic changes of lipids in cells, tissues, or body fluids are investigated. Due to the vital roles of lipids in human physiological and pathological processes, lipidomics is attracting more and more attention. However, because of the diversity and complexity of lipids, lipid analysis is still full of challenges. The recent development of methods for lipid extraction and analysis, combined with bioinformatics technology, has greatly advanced the study of lipidomics. Among these methods, mass spectrometry (MS) is the most important technology for lipid analysis. In this review, the MS-based methodology for lipid analysis is introduced. It is believed that, along with the rapid development of MS and its further application to lipid analysis, more functional lipids will be identified as biomarkers and therapeutic targets and used in the study of disease mechanisms. PMID:24921707

  8. Analysis of typical WWER-1000 severe accident scenarios

    SciTech Connect

    Sorokin, Yu.S.; Shchekoldin, V.V.; Borisov, L.N.; Fil, N.S.

    2004-07-01

    EDO 'Gidropress' has accumulated experience in performing severe accident analyses for WWER reactor plants using both domestic and foreign codes. Important data were also obtained from calculation modeling of integral experiments involving the melting of fuel assemblies containing real fuel. Systematizing and accounting for these data in code development and validation is extremely important, given the large uncertainty that still exists in understanding and adequately describing the phenomenology of severe accidents. This report compares severe accident analysis results for a WWER-1000 reactor plant for two typical scenarios, obtained with the American MELCOR code and the Russian RATEG/SVECHA/HEFEST code. The calculation results from both codes are also compared with data from the FPT1 experiment, involving the melting of a fuel assembly containing real fuel, carried out at the Phebus facility (France). The results are considered from the viewpoint of: - the adequacy of the calculation modeling of individual phenomena during severe accidents of WWER reactor plants using the above codes; - the influence of uncertainties (degree of detail of the calculation models, choice of model parameters, etc.); - the choice of setup variables (options) in the codes used; - the necessity of detailed modeling of processes and phenomena as applied to design justification of the safety of WWER reactor plants. (authors)

  9. Offsite radiological consequence analysis for the bounding flammable gas accident

    SciTech Connect

    CARRO, C.A.

    2003-03-19

    The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent in order to identify and evaluate safety class structures, systems, and components. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation versus a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. As will be shown, the consequences of a detonation in either an SST or a double-shell tank (DST) are approximately equal. A detonation in an SST was selected as the bounding condition because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are generally greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.

  10. Accident analysis of heavy water cooled thorium breeder reactor

    NASA Astrophysics Data System (ADS)

    Yulianti, Yanti; Su'ud, Zaki; Takaki, Naoyuki

    2015-04-01

    Thorium has lately attracted considerable attention because it is accumulating as a by-product of large-scale rare earth mining. The objective of this research is to analyze the transient behavior of a heavy water cooled thorium breeder designed by Tokai University and the Tokyo Institute of Technology. It is an oxide-fueled, PWR-type reactor with heavy water as the primary coolant. An example of the optimized core has a relatively small moderator-to-fuel volume ratio (MFR) of 0.6, and the characteristics of the core are a burn-up of 67 GWd/t, a breeding ratio of 1.08, a burn-up reactivity loss during cycles of < 0.2% dk/k, and a negative coolant reactivity coefficient. One of the accident types examined here is an Unprotected Transient Over Power (UTOP) caused by withdrawal of a control rod, which inserts positive reactivity so that the reactor power increases rapidly. Another accident type is an Unprotected Loss of Flow (ULOF) caused by failure of the coolant pumps. In analyzing reactor accidents, the calculation of the neutron distribution in the reactor is the most important factor. The most accurate description of the neutron distribution is the Boltzmann transport equation; however, solving this equation is very difficult, so the space-time diffusion equation is commonly used. Usually, the space-time diffusion equation is solved by employing a point kinetics approach, but this approach is less accurate for a spatially heterogeneous reactor or for large reactivity insertions. A direct method is therefore used here to solve the space-time diffusion equation, accounting for spatial effects in detail during the accident simulation. The set of equations obtained from a fully implicit finite-difference discretization is solved using iterative methods. The UTOP accident is modeled by a decrease in the macroscopic absorption cross-section, which produces a large external reactivity; the ULOF accident is modeled by a decrease in coolant flow. The reactor power reaches a peak value before the reactor settles into a new equilibrium condition. The analysis showed that fuel and cladding temperatures during the accidents remain below their limits, so the reactor remains in a safe condition.

  11. Accident analysis of heavy water cooled thorium breeder reactor

    SciTech Connect

    Yulianti, Yanti; Su’ud, Zaki; Takaki, Naoyuki

    2015-04-16

    Thorium has lately attracted considerable attention because it is accumulating as a by-product of large-scale rare earth mining. The objective of this research is to analyze the transient behavior of a heavy water cooled thorium breeder designed by Tokai University and the Tokyo Institute of Technology. It is an oxide-fueled, PWR-type reactor with heavy water as the primary coolant. An example of the optimized core has a relatively small moderator-to-fuel volume ratio (MFR) of 0.6, and the characteristics of the core are a burn-up of 67 GWd/t, a breeding ratio of 1.08, a burn-up reactivity loss during cycles of < 0.2% dk/k, and a negative coolant reactivity coefficient. One of the accident types examined here is an Unprotected Transient Over Power (UTOP) caused by withdrawal of a control rod, which inserts positive reactivity so that the reactor power increases rapidly. Another accident type is an Unprotected Loss of Flow (ULOF) caused by failure of the coolant pumps. In analyzing reactor accidents, the calculation of the neutron distribution in the reactor is the most important factor. The most accurate description of the neutron distribution is the Boltzmann transport equation; however, solving this equation is very difficult, so the space-time diffusion equation is commonly used. Usually, the space-time diffusion equation is solved by employing a point kinetics approach, but this approach is less accurate for a spatially heterogeneous reactor or for large reactivity insertions. A direct method is therefore used here to solve the space-time diffusion equation, accounting for spatial effects in detail during the accident simulation. The set of equations obtained from a fully implicit finite-difference discretization is solved using iterative methods. The UTOP accident is modeled by a decrease in the macroscopic absorption cross-section, which produces a large external reactivity; the ULOF accident is modeled by a decrease in coolant flow. The reactor power reaches a peak value before the reactor settles into a new equilibrium condition. The analysis showed that fuel and cladding temperatures during the accidents remain below their limits, so the reactor remains in a safe condition.
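
    The direct method described above, an implicit finite-difference discretization of the space-time diffusion equation solved iteratively, can be illustrated for one energy group in slab geometry. Everything in the sketch below (mesh, cross sections, the Gauss-Seidel inner iteration) is a generic stand-in, not the authors' model.

```python
import numpy as np

# One-group, 1-D time-dependent diffusion, backward Euler in time:
#   (1/v) d(phi)/dt = D d2(phi)/dx2 + (nu*Sigma_f - Sigma_a) phi
nx, L = 50, 100.0                 # mesh points, slab width (cm)
dx = L / (nx - 1)
D, sig_a, nu_sig_f, v = 1.0, 0.012, 0.0118, 2.2e5  # illustrative constants
dt, nsteps = 1.0e-4, 100

phi = np.sin(np.pi * np.arange(nx) * dx / L)  # initial shape, zero at edges
a = D / dx**2                                  # spatial coupling coefficient
diag = 1.0 / (v * dt) + 2.0 * a + sig_a - nu_sig_f  # diagonally dominant

for _ in range(nsteps):
    rhs = phi / (v * dt)
    new = phi.copy()
    # Gauss-Seidel sweeps for the implicit tridiagonal system.
    for _ in range(200):
        for i in range(1, nx - 1):
            new[i] = (rhs[i] + a * (new[i - 1] + new[i + 1])) / diag
    new[0] = new[-1] = 0.0  # vacuum-like boundary conditions
    phi = new

print(f"peak flux after {nsteps * dt:.2e} s: {phi.max():.4f}")
```

    A transient such as UTOP would enter this scheme as a time-dependent change in sig_a, exactly as the abstract describes.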

  12. Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs

    NASA Technical Reports Server (NTRS)

    Tuomela, C. H.; Brennan, M. F.

    1980-01-01

    A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

  13. Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology

    NASA Astrophysics Data System (ADS)

    Bhalla, P.; Tripathi, S.; Palria, S.

    2014-12-01

    With advances in technology, new and sophisticated vehicle models are available, and their numbers are increasing daily. A traffic accident has multi-facet characteristics associated with it. In India, 93% of crashes occur due to human-induced factors (wholly or partly). GIS technology has become an indispensable tool for proper traffic accident analysis. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote the location, type, and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, headquarters of Ajmer district, Rajasthan, has been selected as the study area. According to police records, 1531 accidents occurred during 2009-2013. The largest number of accidents occurred in 2009 and the largest number of deaths in 2013. Cars, jeeps, autos, pickups, and tempos are responsible for most accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. While road safety is a critical issue, it is handled in an ad hoc manner. This study demonstrates the application of GIS to developing an efficient road accident database, taking Ajmer City as a case study. If such databases are developed for other cities, a proper analysis of accidents can be undertaken and suitable management strategies for traffic regulation can be successfully proposed.

  14. NASA Accident Precursor Analysis Handbook, Version 1.0

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

    Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur." At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators". These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take the steps necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate accident precursors by evaluating anomaly occurrences for their system safety implications and, through both analytical and deliberative methods used to project to other circumstances, identifying those that portend more serious consequences to come if effective corrective action is not taken. APA builds upon existing safety analysis processes currently in practice within NASA, leveraging their results to provide an improved understanding of overall system risk. As such, APA represents an important dimension of safety evaluation; as operational experience is acquired, precursor information is generated such that it can be fed back into system safety analyses to risk-inform safety improvements. Importantly, APA utilizes anomaly data to predict risk, whereas standard reliability and PRA approaches utilize failure data, which are often limited and rare.

  15. Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis

    SciTech Connect

    Gilles Youinou; R. Sonat Sen

    2013-09-01

    The severe accident at the Fukushima Daiichi nuclear plants illustrates the need for continuous improvements through developing and implementing technologies that contribute to safe, reliable, and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy-UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front-end of the fuel cycle, on reactor operation, and on the back-end of the fuel cycle are succinctly described, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and could be updated as the designs converge on their respective final versions.

  16. Analysis of drug combinations: current methodological landscape

    PubMed Central

    Foucquier, Julie; Guedj, Mickael

    2015-01-01

    Combination therapies exploit the chances for better efficacy, decreased toxicity, and reduced development of drug resistance and owing to these advantages, have become a standard for the treatment of several diseases and continue to represent a promising approach in indications of unmet medical need. In this context, studying the effects of a combination of drugs in order to provide evidence of a significant superiority compared to the single agents is of particular interest. Research in this field has resulted in a large number of papers and revealed several issues. Here, we propose an overview of the current methodological landscape concerning the study of combination effects. First, we aim to provide the minimal set of mathematical and pharmacological concepts necessary to understand the most commonly used approaches, divided into effect-based approaches and dose–effect-based approaches, and introduced in light of their respective practical advantages and limitations. Then, we discuss six main common methodological issues that scientists have to face at each step of the development of new combination therapies. In particular, in the absence of a reference methodology suitable for all biomedical situations, the analysis of drug combinations should benefit from a collective, appropriate, and rigorous application of the concepts and methods reviewed here. PMID:26171228
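
    Of the effect-based approaches surveyed in reviews like this, Bliss independence is the simplest to state: two non-interacting drugs are assumed to combine like independent probabilistic events. A minimal sketch with illustrative effect values:

```python
def bliss_expected(e_a: float, e_b: float) -> float:
    """Expected combined effect under Bliss independence, with effects
    expressed as fractions in [0, 1] (e.g., fraction of cells inhibited)."""
    return e_a + e_b - e_a * e_b

e_a, e_b = 0.40, 0.30   # single-agent effects (illustrative)
e_observed = 0.65       # measured combination effect (illustrative)
e_expected = bliss_expected(e_a, e_b)  # 0.58

excess = e_observed - e_expected
print(f"expected {e_expected:.2f}, observed {e_observed:.2f}, "
      f"excess over Bliss = {excess:+.2f} (positive suggests synergy)")
```

    Dose-effect-based approaches such as Loewe additivity instead compare dose ratios along an isobole, which is one of the methodological divides the review discusses.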

  17. Requirements Analysis in the Value Methodology

    SciTech Connect

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study, come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, either regulatory or customer prescribed. This paper will provide insight to the level of rigor applied to a requirements analysis step and give some examples of tools and techniques utilized to ease the management of the requirements and functions those requirements support for highly complex problems.

  18. Analysis of Three Mile Island-Unit 2 accident

    SciTech Connect

    Not Available

    1980-03-01

    The Nuclear Safety Analysis Center (NSAC) of the Electric Power Research Institute has analyzed the Three Mile Island-2 accident. Early results of this analysis were a brief narrative summary, issued in mid-May 1979 and an initial version of this report issued later in 1979 as noted in the Foreword. The present report is a revised version of the 1979 report, containing summaries, a highly detailed sequence of events, a comparison of that sequence of events with those from other sources, 25 appendices, references and a list of abbreviations and acronyms. A matrix of equipment and system actions is included as a folded insert.

  19. Injury pattern analysis of helicopter wire strike accidents (-Gz load).

    PubMed

    Farr, W D; Ruehle, C J; Posey, D M; Wagner, G N

    1985-12-01

    Injury patterns in rotary wing aircraft wire strike accidents were reviewed to determine mechanisms of injury. U.S. Army Safety Center data showed that between 1 January 1974 and 31 August 1981 there were 167 wire strikes involving Army helicopters, which resulted in 60 injuries and 34 fatalities at a cost of $12,809,100. Updated data on all military rotary wing aircraft accidents investigated between 1978 and 1982 were screened by the Division of Aerospace Pathology to determine the mechanisms of injury to flight deck personnel. From 13 December 1978 to 23 June 1982, three types of rotary wing aircraft were involved in eight fatal accidents. These mishaps accounted for 28 casualties: 14 fatalities and 14 injuries. Aviators comprised 64.4% of the fatalities. Injury pattern analysis showed that 100% had major head and neck injuries, with 66% having basilar skull fractures. Two-thirds had associated mandibular fractures or evidence of impact forces transmitted through the mandible to the skull. The same number had wedge-shaped chin lacerations from impact with the cyclic control stick. We postulate transmission of lethal impact forces primarily in the +Gz direction through the mandible to the skull. This suggests either improper use and/or failure of the seat and restraint systems. PMID:4084179

  20. BESAFE II: Accident safety analysis code for MFE reactor designs

    NASA Astrophysics Data System (ADS)

    Sevigny, Lawrence Michael

    The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from an economic standpoint and from an environmental and safety standpoint. It is the latter which is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns equate to a design's behavior during the worst-case accident scenario, which is the loss of coolant accident (LOCA). In this dissertation, we examine the behavior of MFE devices during a LOCA and how this behavior relates to the safety characteristics of the machine, in particular the acute, whole-body, early dose. In doing so, we have produced an accident safety code, BESAFE II, now available to the fusion reactor design community. The Appendix constitutes the User's Manual for BESAFE II. The theory behind early dose calculations, including the mobilization of activation products, is presented in Chapter 2. Since mobilization of activation products is a strong function of temperature, it becomes necessary to calculate the thermal response of a design during a LOCA in order to determine the fraction of the activation products which are mobilized and thus become the source for the dose. The code BESAFE II is designed to determine the temperature history of each region of a design and the resulting mobilization of activation products at each point in time during the LOCA. The BESAFE II methodology is discussed in Chapter 4, followed by demonstrations of its use for two reference design cases: a PCA-Li tokamak and a SiC-He tokamak. Of these two cases, the SiC-He tokamak is shown to be the better design from an accident safety standpoint. It is also found that doses derived from temperature-dependent mobilization data differ from those predicted using set mobilization categories such as those that involve Piet fractions, which demonstrates the need for more experimental data on fusion materials. Possible future improvements and modifications to BESAFE II are discussed in Chapter 6, for example the addition of further environmental indices such as a waste disposal index. The biggest improvement to BESAFE II would be an increase in the database of activation product mobilization for a larger spectrum of fusion reactor materials. The ultimate goal is for BESAFE II to become part of a systems design program which would include economic factors and allow both safety and the cost of electricity to influence design.

  1. Traffic accident analysis using GIS: a case study of Kyrenia City

    NASA Astrophysics Data System (ADS)

    Kara, Can; Akçit, Nuhcan

    2015-06-01

    Traffic accidents are a major cause of death in urban environments, so analyzing their locations and causes is crucial. To this end, accident patterns and hotspot distributions are analyzed using geographic information technology. Locations of traffic accidents in 2011, 2012, and 2013 are combined to generate a kernel density map of Kyrenia City. This analysis aims to find high-density intersections and segments within the city. Additionally, the spatial autocorrelation methods Local Moran's I and Getis-Ord Gi are employed. The results are discussed in detail for further analysis. Finally, changes to numerous intersections are suggested to decrease the potential risks at high-density accident locations.
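
    A kernel density surface of the kind used for such hotspot maps can be sketched directly from accident coordinates. The points below are synthetic stand-ins for the Kyrenia data, and the grid search for the densest cell is a toy substitute for a full GIS workflow.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic accident coordinates (projected units); stand-ins for real data.
cluster_a = rng.normal(loc=(1000.0, 2000.0), scale=50.0, size=(80, 2))
cluster_b = rng.normal(loc=(1400.0, 2300.0), scale=80.0, size=(40, 2))
xy = np.vstack([cluster_a, cluster_b]).T  # shape (2, n), as gaussian_kde expects

kde = gaussian_kde(xy)  # bandwidth chosen by Scott's rule by default

# Evaluate the density on a coarse grid to locate the densest cell.
gx, gy = np.meshgrid(np.linspace(800, 1600, 40), np.linspace(1800, 2500, 40))
density = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
i, j = np.unravel_index(density.argmax(), density.shape)
print(f"hotspot near ({gx[i, j]:.0f}, {gy[i, j]:.0f})")
```

    The bandwidth choice controls how sharply hotspots are resolved, which is why such maps are usually cross-checked with autocorrelation statistics like those named above.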

  2. A general methodology for population analysis

    NASA Astrophysics Data System (ADS)

    Lazov, Petar; Lazov, Igor

    2014-12-01

    For a given population with N the current and M the maximum number of entities, modeled by a Birth-Death Process (BDP) with M+1 states, we introduce a utilization parameter, the ratio of the primary birth and death rates in that BDP, which physically determines the (equilibrium) macrostates of the population, and an information parameter, which has an interpretation as population information stiffness. The BDP modeling the population is in state n, n = 0, 1, ..., M, if N = n. Given these two key metrics, applying the continuity law, the equilibrium balance equations for the probability distribution p_n = Prob{N = n}, n = 0, 1, ..., M, of the quantity N, and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; by definition, population entropy is the uncertainty related to the population. In this approach (its essential contribution), the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information elastic, antielastic, or inelastic regime. In an information-linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology: for a population of infinite size, most of the key quantities and results that emerge in this methodology for finite populations vanish.
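
    For a concrete handle on the BDP machinery underlying this framework: in a finite birth-death process, detailed balance gives the equilibrium distribution as a product of birth/death rate ratios, and the population entropy is then the Shannon entropy of that distribution. The sketch below uses constant rates; the rates and population size are illustrative assumptions, not the paper's general setting.

```python
import numpy as np

M = 20              # maximum population size (states 0..M)
lam, mu = 0.8, 1.0  # constant birth and death rates (illustrative)
rho = lam / mu      # utilization parameter: primary birth/death rate ratio

# Detailed balance: p_n = p_0 * (lam/mu)**n = p_0 * rho**n, then normalize.
weights = rho ** np.arange(M + 1)
p = weights / weights.sum()  # equilibrium distribution of N

entropy = -np.sum(p * np.log(p))  # population entropy (nats)
mean_n = np.dot(np.arange(M + 1), p)
print(f"mean population {mean_n:.2f}, entropy {entropy:.3f} nats")
```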

  3. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  4. Extension of ship accident analysis to multiple-package shipments

    SciTech Connect

    Mills, G.S.; Neuhauser, K.S.

    1997-11-01

    Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO containers may be placed in one of several holds of a cargo ship. In such cases, the size of a radioactive release resulting from a severe collision with another ship is determined not by the likelihood of compromising a single, robust package but by the probability that a certain fraction of 10s or 100s of individual packagings is compromised. The previous analysis involved a statistical estimation of the frequency of accidents which would result in damage to a cask located in one of seven cargo holds in a collision with another ship. The results were obtained in the form of probabilities (frequencies) of accidents of increasing severity and of release fractions for each level of severity. This paper describes an extension of the same general method in which the multiple packages are assumed to be compacted by an intruding ship's bow until there is no free space in the hold. At such a point, the remaining energy of the colliding ship is assumed to be dissipated by progressively crushing the RAM packagings, and the probability of a particular fraction of package failures is estimated by adaptation of the statistical method used previously. The parameters of a common, well characterized packaging, the 6M with 2R inner containment vessel, were employed as an illustrative example of this analysis method. However, the method is readily applicable to other packagings for which crush strengths have been measured or can be estimated with satisfactory confidence.
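
    Once a per-package failure probability has been estimated for a given accident severity, the probability that a given fraction of the packagings is compromised can be sketched with a simple binomial model, assuming independent failures. The numbers below are hypothetical and deliberately ignore the progressive crush-front mechanics the paper actually models.

```python
from math import comb

def prob_at_least(n_packages: int, p_fail: float, k_min: int) -> float:
    """P(at least k_min of n_packages fail), assuming independent failures."""
    return sum(comb(n_packages, k) * p_fail**k * (1 - p_fail)**(n_packages - k)
               for k in range(k_min, n_packages + 1))

n, p = 100, 0.05  # hypothetical: 100 packagings, 5% per-package failure prob.
for k in (1, 10, 25):
    print(f"P(>= {k:2d} failures) = {prob_at_least(n, p, k):.3e}")
```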

  5. Integral Test and Engineering Analysis of Coolant Depletion During a Large-Break Loss-of-Coolant Accident

    SciTech Connect

    Kim, Yong Soo; Park, Chang Hwan; Bae, Byoung Uhn; Park, Goon Cherl; Suh, Kune Yull; Lee, Un Chul

    2005-02-15

    This study concerns the development of an integrated calculation methodology with which to continually and consistently analyze the progression of an accident from the design-basis accident phase via core uncovery to the severe accident phase. The depletion rate of reactor coolant inventory was experimentally investigated after the safety injection failure during a large-break loss-of-coolant accident utilizing the Seoul National University Integral Test Facility (SNUF), which is scaled down to 1/6.4 in length and 1/178 in area from the APR1400 [Advanced Power Reactor 1400 MW(electric)]. The experimental results showed that the core coolant inventory decreased five times faster before than after the extinction of sweepout in the reactor downcomer, which is induced by the incoming steam from the intact cold legs. The sweepout occurred on top of the spillover from the downcomer region and expedited depletion of the core coolant inventory. The test result was simulated with the MAAP4 severe accident analysis code. The calculation results of the original MAAP4 deviated from the test data in terms of coolant inventory distribution in the test vessel. After the calculation algorithm of coolant level distribution was improved by including the subroutine of pseudo pressure buildup, which accounts for the differential pressure between the core and downcomer in MAAP4, the core melt progression was delayed by hundreds of seconds, and the code prediction was in reasonable agreement with the overall behavior of the SNUF experiment.

  6. A Review of Citation Analysis Methodologies for Collection Management

    ERIC Educational Resources Information Center

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  7. Decontamination analysis of the NUWAX-83 accident site using DECON

    SciTech Connect

    Tawil, J.J.

    1983-11-01

    This report presents an analysis of the site restoration options for the NUWAX-83 site, at which an exercise was conducted involving a simulated nuclear weapons accident. This analysis was performed using a computer program developed by Pacific Northwest Laboratory. The computer program, called DECON, was designed to assist personnel engaged in the planning of decontamination activities. The many features of DECON that are used in this report demonstrate its potential usefulness as a site restoration planning tool. Strategies that are analyzed with DECON include: (1) employing a Quick-Vac option, under which selected surfaces are vacuumed before they can be rained on; (2) protecting surfaces against precipitation; (3) prohibiting specific operations on selected surfaces; (4) requiring specific methods to be used on selected surfaces; (5) evaluating the trade-off between cleanup standards and decontamination costs; and (6) varying the cleanup standards according to expected exposure to the surface.

  8. Geographical information systems aided traffic accident analysis system case study: city of Afyonkarahisar.

    PubMed

    Erdogan, Saffet; Yilmaz, Ibrahim; Baybura, Tamer; Gullu, Mevlut

    2008-01-01

    Geographical Information System (GIS) technology has been a popular tool for visualization of accident data and analysis of hot spots on highways, and many traffic agencies have been using GIS for accident analysis. Accident analysis studies aim to identify high-rate accident locations and safety-deficient areas on highways so that traffic officials can implement precautionary measures and provisions for traffic safety. Because accident reports in Turkey are prepared in textual format, analyzing accident results is difficult. In our study, we developed a system that transforms these textual data into tabular form; the tabular data were then georeferenced onto the highways. Hot spots on the highways within the Afyonkarahisar administrative borders were explored and determined with two different methods: kernel density analysis and repeatability analysis. Subsequently, accident conditions at these hot spots were examined. The hot spots determined with the two methods correspond to genuinely problematic places such as crossroads and junction points. Many previous studies introduced GIS only as a visualization tool for accident locations; the contribution of this study is to use GIS as a management system for accident analysis and the determination of hot spots in Turkey with statistical analysis methods. PMID:18215546

  9. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  10. An analysis of evacuation options for nuclear accidents

    SciTech Connect

    Tawil, J.J.; Strenge, D.L.; Schultz, R.W.

    1987-11-01

    In this report we consider the threat posed by the accidental release of radionuclides from a nuclear power plant. The objective is to establish relationships between radiation dose and the cost of evacuation under a wide variety of conditions. The dose can almost always be reduced by evacuating the population from a larger area; however, extending the evacuation zone outward will cause evacuation costs to increase. The purpose of this analysis was to provide the Environmental Protection Agency (EPA) with a data base for evaluating whether implementation costs and risks averted could be used to justify evacuation at lower doses. The procedures used and the results of these analyses are being made available as background information for use by others. We develop cost/dose relationships for 54 scenarios that are based upon the severity of the reactor accident, meteorological conditions during the release of radionuclides into the environment, and the angular width of the evacuation zone. The 54 scenarios are derived from combinations of three accident severity levels, six meteorological conditions, and evacuation zone widths of 70°, 90°, and 180°.

  11. Toward a Methodology of Stakeholder Analysis.

    ERIC Educational Resources Information Center

    Welsh, Thomas; McGinn, Noel

    1997-01-01

    Proposes a methodology for the comprehensive identification of stakeholders and their changing definitions and roles. The methodology links stakeholders to system tasks, management activities, and the best moments for interventions. Examples are taken from education, but have applications elsewhere. The method is used to analyze educational reform…

  12. An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)

    NASA Technical Reports Server (NTRS)

    Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

    2002-01-01

    A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots and those flown by professional pilots. Private pilots, flying low-cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that result from exacting missions or the use of specialized equipment. For both groups, judgement errors are more likely to lead to a fatal accident than other types of causes. Several approaches to improving the rotorcraft accident rate are recommended. These mostly address improving the training of new pilots and the safety awareness of private pilots.

  13. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    SciTech Connect

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  14. Analysis of Waste Leak and Toxic Chemical Release Accidents from Waste Feed Delivery (WFD) Diluent System

    SciTech Connect

    WILLIAMS, J.C.

    2000-09-15

    Radiological and toxicological consequences are calculated for four postulated accidents involving the Waste Feed Delivery (WFD) diluent addition systems. Consequences for the onsite and offsite receptor are calculated. This analysis contains technical information used to determine the accident consequences for the River Protection Project (RPP) Final Safety Analysis Report (FSAR).

  15. Potential Threats from a Likely Nuclear Power Plant Accident: a Climatological Trajectory Analysis

    E-print Network

    Chen, Shu-Hua

    A climatological trajectory analysis indicates that releases from an accident at the Metsamor Nuclear Power Plant would influence all of Turkey, and identifies the regions most vulnerable after such a release. Keywords: trajectory analysis; MM5; tracer model.

  16. Accident sequence analysis for sites producing and storing explosives.

    PubMed

    Papazoglou, Ioannis A; Aneziris, Olga; Konstandinidou, Myrto; Giakoumatos, Ieronymos

    2009-11-01

    This paper presents a QRA-based approach for assessing and evaluating the safety of installations handling explosive substances. Comprehensive generic lists of immediate causes and initiating events of detonation and deflagration of explosive substances, as well as safety measures preventing these explosions, are developed. Initiating events and corresponding measures are grouped under the more general categories of explosion due to shock wave, mechanical energy, thermal energy, electrical energy, chemical energy, and electromagnetic radiation. Generic accident sequences are developed using event trees. The analysis is adapted to plant-specific conditions, and potential additional protective measures are rank-ordered in terms of the induced reduction in the frequency of explosion, while also accounting for uncertainty. This approach has been applied to 14 plants in Greece with very satisfactory results. PMID:19819362
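
    Quantification of such event trees reduces to propagating an initiating-event frequency through the branch probabilities of each sequence. The sketch below illustrates this with one invented initiating event and two invented safety measures; it is not taken from the paper's generic lists.

      # Hypothetical event tree: one initiating event (mechanical impact on an
      # explosive substance) screened by two safety measures. Numbers invented.
      initiating_frequency = 1.0e-2      # events per year
      p_fail = {"impact_barrier": 0.10, "remote_handling": 0.05}

      sequences = {
          "both measures succeed": (1 - p_fail["impact_barrier"]) * (1 - p_fail["remote_handling"]),
          "barrier fails only":    p_fail["impact_barrier"] * (1 - p_fail["remote_handling"]),
          "handling fails only":   (1 - p_fail["impact_barrier"]) * p_fail["remote_handling"],
          "both fail (explosion)": p_fail["impact_barrier"] * p_fail["remote_handling"],
      }
      for name, p in sequences.items():
          print(f"{name:24s} frequency = {initiating_frequency * p:.2e} per year")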

  17. Aircraft Accident Prevention: Loss-of-Control Analysis

    NASA Technical Reports Server (NTRS)

    Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

    2009-01-01

    The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.

  18. Hazard categorization and accident analysis techniques for compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports

    SciTech Connect

    1992-12-31

    The purpose of this DOE Standard is to establish guidance for facility managers and Program Secretarial Officers (PSOs) and thereby help them to comply consistently and more efficiently with the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports. To this end, this guidance provides the following practical information: (1) The threshold quantities of radiological material inventory below which compliance with DOE Order 5480.23 is not required. (2) The level of effort to develop the program plan and schedule required in Section 9.b(2) of the Order, and information for making a preliminary assessment of facility hazards. (3) A uniform methodology for hazard categorization under the Order. (4) Insight into the ''graded approach'' for SAR development, especially in hazard assessment and accident analysis techniques. Individual PSOs may develop additional guidance addressing safety requirements for facilities which fall below the threshold quantities specified in this document.

  19. The accident analysis of mobile mine machinery in Indian opencast coal mines.

    PubMed

    Kumar, R; Ghosh, A K

    2014-01-01

    This paper presents an analysis of accidents related to large mining machinery in Indian opencast coal mines. The trends of coal production, the share of mining methods in production, machinery deployment in opencast mines, the size and population of machinery, accidents due to machinery, and the types and causes of accidents are analysed for the years 1995 to 2008. Scrutiny of the accidents during this period reveals that the most common contributing factors are machine reversal, haul road design, human fault, operator's fault, machine fault, visibility, and dump design. Considering dumpers, excavators, dozers, and loaders together, the maximum number of fatal accidents was caused by operators' faults and other human faults jointly during the period from 1995 to 2008. The novel finding of this analysis is that large machines with state-of-the-art safety systems did not reduce fatal accidents in Indian opencast coal mines. PMID:23324038

  20. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    SciTech Connect

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-10-15

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the limitations imposed by the narrow analytical scopes of existing methodologies. Prior to the development, several safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. As the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

  1. An analysis of accident data for franchised public buses in Hong Kong.

    PubMed

    Evans, W A; Courtney, A J

    1985-10-01

    This paper analyses data on accidents involving franchised public buses operating in Hong Kong. The data were obtained from the Royal Hong Kong Police, the Hong Kong Government Transport Department, the two major franchised bus operators and international sources. The analysis includes an international comparison of accidents with emphasis on the situation in Hong Kong compared to urban areas in the United Kingdom. An attempt has been made to identify the characteristics of bus accidents; accident incidence has been related to time of day, day of the week, time of year, weather conditions, driver's age and experience, hours on duty and police-reported cause. The results indicate that Hong Kong has a high accident rate compared to Japan, the U.K. and the U.S.A., with particularly high pedestrian involvement rates. Bus accidents peak at around 9:00 AM and 4:00 PM but the accident rate is high throughout the day. Monday and Saturday appear to have a higher than average accident rate. The variability of the accident rate throughout the year does not seem to be significant, and the accident rate does not appear to be influenced by weather conditions. Older, more experienced drivers generally have a safer driving record than their younger, less experienced colleagues. Accident occurrence is related to the time the driver has been on duty. The paper questions the reliability of police-reported accident causation data and suggests improvements in the design of the accident report form and in the training of police investigators. The relevance of the Hong Kong study for accident research in general is also discussed. PMID:4096796
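
    The cross-tabulations described above (incidence by hour, weekday, weather, driver age, and so on) are straightforward group-by counts. A small illustration with an invented accident log:

      import pandas as pd

      # Invented records standing in for the police/operator accident data.
      df = pd.DataFrame({
          "hour":    [8, 9, 9, 16, 16, 16, 21],
          "weekday": ["Mon", "Mon", "Sat", "Mon", "Tue", "Sat", "Sun"],
          "pedestrian_involved": [True, True, False, True, False, True, False],
      })

      by_hour = df.groupby("hour").size()          # peak-hour profile
      by_day = df.groupby("weekday").size()        # day-of-week profile
      ped_rate = df["pedestrian_involved"].mean()  # pedestrian involvement rate
      print(by_hour, by_day, f"pedestrian involvement {ped_rate:.0%}", sep="\n")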

  2. Improved Methodology Application for 12-Rad Analysis in a Shielded Facility at SRS

    SciTech Connect

    Paul, P.

    2003-01-31

    DOE Order 420.1 requires establishing 12-rad evacuation zone boundaries and installing a Criticality Accident Alarm System (CAAS) per the ANS-8.3 standard for facilities having a probability of criticality greater than 10⁻⁶ per year. The H-Canyon at the Savannah River Site (SRS) is one of the reprocessing facilities where SRS reactor fuels, research reactor fuels, and other fissile materials are processed and purified using a modified Purex process called H-Modified, or the HM Process. This paper discusses an improved methodology for 12-rad zone analysis and its implementation within this large shielded facility, which presents a large variety of criticality sources and scenarios.

  3. [Snowboarding accidents in the Alps. Assessment of risk, analysis of the accidents and injury profile].

    PubMed

    Berghold, F; Seidl, A M

    1991-03-01

    It is commonly believed that snowboarding carries a higher risk of accident than alpine skiing. To test this suspicion, and to build a base of knowledge for developing specific safety precautions for snowboarding, 204 snowboarding accidents in the Alps were registered and analysed. The major purpose of this study was to identify the crucial risk factors of this new winter sport and to compare them with the risk profile of alpine skiing. The main results were as follows: more than two thirds of all accidents happened on icy or hard pistes. Slightly more than half of all injuries affected the lower extremity and one third the upper extremity, with injuries of the ankle, knee, shoulder and hand predominating; strains and fractures were the most frequent injury types. Two thirds of all lower-extremity injuries affected the front leg, and nearly two thirds of all leg injuries resulted from a mechanism combining forward bending and torsion. Depending on the type of boot, characteristic injury patterns emerged, affecting either the ankle or the knee. As a principal requirement for better safety, at least for beginners, a functional safety binding should be developed. Regarding the most common criticism, that snowboarding increases the incidence of collisions (on-piste) and avalanche accidents (off-piste), this study fortunately could not confirm these suspicions. PMID:2028247

  4. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    SciTech Connect

    Su'ud, Zaki; Anshari, Rio

    2012-06-06

    The loss-of-coolant accident (LOCA) in a boiling water reactor (BWR), with emphasis on the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima nuclear power station. Although the shutdown was completed successfully, cooling at a much smaller level than in normal operation is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, it causes fuel and other core temperatures to rise and can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulations were performed to calculate the pressure, water level, and temperature distributions in the reactors during the accident. Two coolant-regulating systems were operational on unit 1: the isolation condenser (IC) system and the safety relief valves (SRV). The average steam mass flow to the IC system was 10 kg/s, which kept the core covered for about 3.2 hours; the core was fully uncovered 4.7 hours later. Two coolant-regulating systems were operational on unit 2: the reactor core isolation cooling (RCIC) system and the SRVs. The average coolant mass flow was 20 kg/s, which kept the core covered for about 73 hours; the core was fully uncovered 75 hours later. Three coolant-regulating systems were operational on unit 3: the RCIC system, the high pressure coolant injection (HPCI) system, and the SRVs. The average water mass flow was 15 kg/s, which kept the core covered for about 37 hours; the core was fully uncovered 40 hours later.
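
    The uncover times quoted above are governed by decay heat boiling off the remaining coolant inventory. A back-of-envelope check can be made with a Way-Wigner-type decay-heat correlation and a simple boil-off balance; the plant figures below are rough public values used only for illustration, not numbers from the paper.

      # Way-Wigner style estimate: P(t)/P0 ~ 0.0622*(t**-0.2 - (t+T)**-0.2),
      # with t = time after shutdown and T = time at power, both in seconds.
      P0 = 1.38e9          # unit 1 thermal power, W (approximate public value)
      T = 3.15e7           # assume roughly one year at power
      t = 3600.0           # one hour after scram
      P = 0.0622 * (t**-0.2 - (t + T)**-0.2) * P0   # ~14 MW of decay heat

      h_fg = 2.26e6        # latent heat of vaporization of water, J/kg
      steam_rate = P / h_fg
      print(f"decay heat ~{P/1e6:.0f} MW, boil-off rate ~{steam_rate:.1f} kg/s")
      # Same order of magnitude as the ~10 kg/s IC steam flow quoted above.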

  6. Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures

    NASA Technical Reports Server (NTRS)

    Batthauer, Byron E.

    1987-01-01

    This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire-failure RTOs, the Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Board that high-speed RTOs in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics, with emphasis on failed-tire RTOs. This background information could enhance the split-second decision making that is required prior to initiating an RTO.

  7. PWR integrated safety analysis methodology using multi-level coupling algorithm

    NASA Astrophysics Data System (ADS)

    Ziabletsev, Dmitri Nickolaevich

    Coupled three-dimensional (3D) neutronics/thermal-hydraulic (T-H) system codes give a unique opportunity for realistic modeling of plant transients and design basis accidents (DBA) occurring in light water reactors (LWR). Examples of such DBAs are the rod ejection accident (REA) and the main steam line break (MSLB), which constitute the bounding safety problems for pressurized water reactors (PWR). These accidents involve asymmetric 3D spatial neutronic and T-H effects during the course of the transients. The thermal margins (the peak fuel temperature and the departure from nucleate boiling ratio (DNBR)) are the measures of safety in a particular transient and need to be evaluated as accurately as possible. Modern coupled 3D neutronics/T-H codes estimate the safety margins coarsely on an assembly level, i.e., for an average fuel pin. More accurate prediction of the safety margins requires evaluation of the transient fuel rod response using locally coupled neutronics/T-H calculations. The proposed approach is to perform an on-line hot-channel safety analysis not for the whole core but for a selected local region, for example the highest-power fuel assembly. This approach becomes feasible if an on-line algorithm capable of extracting the necessary input data for a sub-channel module is available. The necessary input data include the detailed pin-power distributions and the T-H boundary conditions for each sub-channel in the considered problem. Therefore, two potential challenges are faced in the development of a refined methodology for evaluating local safety parameters. One is the development of an efficient transient pin-power reconstruction algorithm with consistent cross-section modeling. The second is the development of a multi-level coupling algorithm for the exchange of T-H boundary and feedback data between the sub-channel module and the main 3D neutron kinetics/T-H system code, which already uses one level of coupling between the 3D neutronics and core thermal-hydraulics models. The major accomplishment of the thesis is the development of an integrated PWR safety analysis methodology with locally refined safety evaluations. This involved introduction of an improved method capable of efficiently restoring the fine pin-power distribution with a high degree of accuracy. In order to apply the methodology to evaluate the safety margins on a pin level, a refined on-line hot-channel model was developed, accounting for cross-flow effects. Finally, this methodology was applied to best-estimate safety analysis to more accurately calculate the thermal safety margins during a design basis accident in a PWR.
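
    The core of the multi-level coupling idea, in which the system code hands assembly-average powers and T-H boundary conditions to a sub-channel module that reconstructs pin-level detail, can be caricatured in a few lines. The form factors, flows, and powers below are invented for illustration; the thesis's actual reconstruction uses transient nodal data and consistent cross-section modeling.

      import numpy as np

      # Assembly-average power from the 3D nodal solution (hypothetical value).
      assembly_power = 18.0e6                      # W
      # Precomputed pin-power form factors for a 17x17 lattice (invented here),
      # normalized so that they average to unity over the assembly.
      ff = np.random.default_rng(1).uniform(0.85, 1.20, size=(17, 17))
      ff /= ff.mean()

      pin_powers = assembly_power / ff.size * ff   # reconstructed pin powers
      hot_pin = pin_powers.max()

      # T-H boundary conditions passed down from the system code (invented).
      m_dot = 0.30                                 # kg/s in the hot sub-channel
      cp = 5.5e3                                   # J/(kg K) at PWR conditions
      dT = hot_pin / (m_dot * cp)                  # steady-state coolant rise
      print(f"hot-pin power {hot_pin/1e3:.1f} kW, channel dT {dT:.1f} K")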

  8. GPHS-RTG launch accident analysis for Galileo and Ulysses

    SciTech Connect

    Bradshaw, C.T.

    1991-01-01

    This paper presents the safety program conducted to determine the response of the General Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) to potential launch accidents of the Space Shuttle for the Galileo and Ulysses missions. The National Aeronautics and Space Administration (NASA) provided definitions of the potential Shuttle accidents and characterized the environments. The Launch Accident Scenario Evaluation Program (LASEP) was developed by GE to analyze the RTG response to these accidents. The detailed RTG response to Solid Rocket Booster (SRB) fragment impacts, as well as to other types of impact, was obtained from an extensive series of hydrocode analyses. A comprehensive test program was also conducted to determine the RTG response to the accident environments. The hydrocode response analyses, coupled with the test database, provided the broad-range response capability that was implemented in LASEP.

  9. An analysis of three weather-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Fujita, T. T.; Caracena, F.

    1977-01-01

    Two aircraft accidents in 1975, one at John F. Kennedy International Airport in New York City on 24 June and the other at Stapleton International Airport in Denver on 7 August, were examined in detail. A third accident on 23 June 1976 at Philadelphia International Airport is being investigated. Amazingly, there was a spearhead echo just to the north of each accident site. The echoes formed from 5 to 50 min in advance of the accident and moved faster than other echoes in the vicinity. These echoes were photographed by National Weather Service radars, 130-205 km away. At closer ranges, however, one or more circular echoes were depicted by airborne and ground radars. These cells were only 3-5 km in diameter, but they were accompanied by downdrafts of extreme intensity, called downbursts. All accidents occurred as aircraft, either descending or climbing, lost altitude while experiencing strong wind shear inside downburst cells.

  10. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and to identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBAs), and beyond design basis accidents (BDBAs). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the experienced user base and the experimental validation base were decaying away quickly.

  11. Probabilistic analysis of accident precursors in the nuclear industry.

    PubMed

    Hulsmans, M; De Gelder, P

    2004-07-26

    Feedback of operating experience has always been an important issue in the nuclear industry. A probabilistic safety analysis (PSA) can be used as a tool to analyse how an operational event might have developed adversely, in order to obtain a quantitative assessment of the safety significance of the event. This process is called PSA-based event analysis (PSAEA). A comprehensive set of PSAEA guidelines was developed by an international project; the main characteristics of this methodology are summarised. This approach to analysing incidents can be used to meet different objectives of utilities or nuclear regulators. The paper describes the main objectives and the experiences of the Belgian nuclear regulatory organisation AVN with the application of PSA-based event analysis. Some interesting aspects of the PSAEA process are developed further and underlined. Several case studies are discussed and an overview of the obtained results is given. Finally, the value of a broad and interactive forum on PSAEA is highlighted. PMID:15231351

  12. Aircraft Accident Prevention: Loss-of-Control Analysis Harry G. Kwatny

    E-print Network

    Kwatny, Harry G.

    Harry G. Kwatny, Jean-Etienne T. Dongmo, et al., NASA Langley Research Center, MS 161, Hampton, VA, 23681. The majority of fatal aircraft accidents are associated with loss-of-control, generally involving flight outside the normal flight envelope and an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate.

  13. Analysis of Construction Accidents in Turkey and Responsible Parties

    PubMed Central

    GÜRCANLI, G. Emre; MÜNGEN, Uğur

    2013-01-01

    Construction is one of the world's biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972–2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at the time of the accident, and party responsible for the accident. Falls (54.1%), being struck by a thrown or falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) occupy the first four places. Accidents occurred most often between 15:00 and 17:00 (22.6%), between 10:00 and 12:00 (18.7%), and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and the acts of negligence that typically lead to accidents. Nearly two thirds of the faulty and negligent acts were carried out by employers, and employees are responsible for almost one third of all cases. PMID:24077446

  14. Textual Analysis in Mass Communication Studies: Theory and Methodology.

    ERIC Educational Resources Information Center

    Curtin, Patricia A.

    This study examines textual analysis methodology as applied to mass communication studies. It focuses particularly on the theoretical basis of textual analysis, the analytical process, and congruent theoretical perspectives. Although the term "textual analysis" is often used generically, this study differentiates textual analysis as developed by…

  15. A method for modeling and analysis of directed weighted accident causation network (DWACN)

    NASA Astrophysics Data System (ADS)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Ding, Jing

    2015-11-01

    Using complex network theory to analyze accidents is an effective way to understand their causes in complex systems. In this paper, a novel method is proposed to establish a directed weighted accident causation network (DWACN) for the Rail Accident Investigation Branch (RAIB) in the UK, based on complex network theory and event chains of accidents. The DWACN is composed of 109 nodes, which denote causal factors, and 260 directed weighted edges, which represent the complex interrelationships among factors. The statistical properties of directed weighted complex networks are applied to reveal the critical factors, the key event chains, and the important classes in the DWACN. Analysis results demonstrate that the DWACN has the characteristics of small-world networks, with short average path length and high weighted clustering coefficient, and displays the properties of scale-free networks, in that the cumulative degree distribution follows an exponential function. This modeling and analysis method can assist us in discovering the latent rules of accidents and the features of fault propagation, helping to reduce accidents. This paper further develops accident analysis methods based on complex networks.
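
    The network statistics named above (weighted degree as a criticality measure, weighted clustering, path lengths) are standard and easy to reproduce on a toy causation graph. The nodes and weights below are invented; the paper's network has 109 factors and 260 edges derived from RAIB event chains.

      import networkx as nx

      # Toy directed weighted causation network (invented nodes and weights).
      edges = [
          ("broken rail", "derailment", 3),
          ("excess speed", "derailment", 2),
          ("derailment", "casualties", 4),
          ("signal passed at danger", "collision", 2),
          ("excess speed", "collision", 1),
          ("collision", "casualties", 3),
      ]
      G = nx.DiGraph()
      G.add_weighted_edges_from(edges)

      out_strength = dict(G.out_degree(weight="weight"))  # influential causes
      in_strength = dict(G.in_degree(weight="weight"))    # common consequences
      clust = nx.clustering(G, weight="weight")           # weighted clustering
      print(max(out_strength, key=out_strength.get),
            max(in_strength, key=in_strength.get))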

  16. Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company

    PubMed Central

    BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident cost analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident cost analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

  17. Analysis of Loss-of-Coolant Accidents in the NBSR

    SciTech Connect

    Baek J. S.; Cheng L.; Diamond, D.

    2014-05-23

    This report documents calculations of the fuel cladding temperature during loss-of-coolant accidents in the NBSR. The probability of a pipe failure is small, and procedures exist to minimize the loss of water and assure that emergency cooling water flows into the reactor core during such an event. Analysis in the past has shown that the emergency cooling water would provide adequate cooling if the water filled the flow channels within the fuel elements. The present analysis determines whether there is adequate cooling when the water drains from the flow channels. Based on photographs of how the emergency water flows into the fuel elements from the distribution pan, it can be assumed that this water does not distribute uniformly across the flow channels but rather forms a liquid film flowing downward on the inside of one of the side plates in each fuel element, wetting only the edges of the fuel plates. An analysis of guillotine breaks shows the cladding temperature remains below the blister temperature in fuel plates in the upper section of the fuel element. In the lower section, the fuel plates are also cooled by water outside the element that is present due to the hold-up pan, and temperatures are lower than in the upper section. For small breaks, the simulation results show that the fuel elements are always cooled on the outside, even in the upper section, and the cladding temperature cannot exceed the blister temperature. The above results are predicated on assumptions that are examined in the study to assess their influence on fuel temperature.

  18. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    SciTech Connect

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  19. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it by estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for evaluating the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: what is the probability of radionuclide atmospheric transport to, and impact on, different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems, with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account socio-geophysical factors and probabilities, and using demographic databases based on GIS analysis.
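
    The trajectory-modelling step (tool i) amounts to integrating particle positions through a wind field. The fragment below sketches a single forward trajectory with an invented analytic wind field; the study itself drives multiyear trajectories with real meteorological data.

      import numpy as np

      def wind(x, y):
          # Invented analytic wind field (m/s); real studies interpolate
          # gridded reanalysis or NWP winds in space and time.
          return np.array([10.0 + 1.0e-5 * y, 2.0 * np.sin(x / 5.0e5)])

      pos = np.array([0.0, 0.0])      # release point, e.g. an NRS location
      dt = 600.0                      # 10-minute time step, s
      track = [pos.copy()]
      for _ in range(144):            # 24 hours of forward integration
          pos = pos + wind(*pos) * dt
          track.append(pos.copy())
      track = np.array(track)         # points along one forward trajectory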

  20. Modeling control room crews for accident sequence analysis

    E-print Network

    Huang, Y. (Yuhao)

    1991-01-01

    This report describes a systems-based operating crew model designed to simulate the behavior of a nuclear power plant control room crew during an accident scenario. This model can lead to an improved treatment of potential ...

  1. Protein MAS NMR methodology and structural analysis of protein assemblies

    E-print Network

    Bayro, Marvin J

    2010-01-01

    Methodological developments and applications of solid-state magic-angle spinning nuclear magnetic resonance (MAS NMR) spectroscopy, with particular emphasis on the analysis of protein structure, are described in this thesis. ...

  2. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work-primarily for enhanced geothermal systems (EGS)-sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  3. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for early warning of, and protection from, emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents can occur. A Bayesian Network model, consisting of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water: a potential traffic accident at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established, taking the human factor in emergent accidents into account. The model has been employed to describe the probability of accidents and the risk level, and the factors to which pollution accidents are most sensitive have been deduced. The case in which the sensitive factors are in the states most likely to lead to an accident has also been simulated. PMID:26433361
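
    Along any single pathway of such a network, the chain rule multiplies the conditional probabilities from root causes to the pollution event. A minimal hand-rolled fragment is sketched below; the paper's network has six root nodes and three middle-layer nodes with probabilities estimated from survey data, whereas every number here is invented.

      # One accident pathway: truck accident -> pollutant on board ->
      # containment breach -> spill reaches the canal. Numbers invented.
      p_accident = 1.0e-4       # P(truck accident on the bridge, per crossing)
      p_hazmat = 0.08           # P(vehicle carries a liquid pollutant)
      p_breach = 0.30           # P(tank breached | accident with such a vehicle)
      p_reach_water = 0.50      # P(spill drains into the canal | breach)

      p_pollution = p_accident * p_hazmat * p_breach * p_reach_water
      print(f"P(pollution event per crossing) = {p_pollution:.2e}")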

  4. Methodological Aspects Regarding The Organizational Stress Analysis

    NASA Astrophysics Data System (ADS)

    Irimie, Sabina; Pricope (Muntean), Luminița Doina; Pricope, Sorin; Irimie, Sabin Ioan

    2015-07-01

    This work presents methodological research on occupational stress analysis in the educational field, as part of a larger study. The objective is to identify significant relations between stressors and effects, that is, differences in the indicators of occupational stress among teaching staff in primary and middle schools, taking note of each specific condition: the institution as an entity, the working community, the discipline being taught, the geographic and administrative district (urban/rural), and the quantification of the stress level.

  5. Action Plan for updated Chapter 15 Accident Analysis in the SRS Production Reactor SAR

    SciTech Connect

    Hightower, N.T. III; Burnett, T.W.

    1989-11-15

    This report describes the Action Plan for the upgrade of the Chapter 15 Accident Analysis in the SRS Production Reactor SAR required for K-Restart. This Action Plan will be updated periodically to reflect task accomplishments and issue resolutions.

  6. Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

    2005-01-01

    NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the VTP from the aircraft. Additionally, analysis results indicated that failure initiated at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads at least 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appears to have performed in a manner consistent with its design and certification, and failure is attributed to VTP loads greater than expected.

  7. 76 FR 30139 - Federal Need Analysis Methodology for the 2012-2013 Award Year

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-24

    …revision of the Federal Need Analysis Methodology for the 2012-2013 award year [84.379], which applies to the Federal Perkins Loan, Federal Work-Study, and Federal Supplemental… programs under the statutory ``Federal Need Analysis Methodology''…

  8. Fast Transient And Spatially Non-Homogenous Accident Analysis Of Two-Dimensional Cylindrical Nuclear Reactor

    SciTech Connect

    Yulianti, Yanti; Su'ud, Zaki; Waris, Abdul; Khotimah, S. N.; Shafii, M. Ali

    2010-12-23

    Research on the fast-transient and spatially non-homogeneous accident analysis of a two-dimensional cylindrical nuclear reactor has been performed, with the aim of predicting reactor behavior during an accident. In the present study, the space-time diffusion equation is solved using direct methods that treat the spatial dependence in detail during the accident simulation. The set of equations obtained from a fully implicit finite-difference discretization is solved with the ADI (Alternating Direction Implicit) iterative method. The accident is represented by a decrease of the macroscopic absorption cross-section, which produces a large external reactivity insertion. The reactor power reaches a peak value before the reactor settles into a new equilibrium. The change in reactor temperature produces negative Doppler feedback reactivity, which compensates the excess positive reactivity. The reactor temperature during the accident remains below the fuel melting point, so the reactor stays in a safe condition.
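
    For illustration, the sketch below advances a two-dimensional diffusion problem by one Peaceman-Rachford ADI step: each half-step is implicit in one direction, so only tridiagonal systems are solved along grid lines. It uses a bare diffusion operator with a zero boundary condition and arbitrary constants, omitting the absorption, fission source, and Doppler feedback that the paper models.

      import numpy as np
      from scipy.linalg import solve_banded

      N, L, D = 50, 1.0, 1.0e-2       # interior points, domain size, diffusivity
      dx = L / (N + 1)
      dt = 1.0e-3
      r = D * dt / (2 * dx**2)

      # Tridiagonal (I - r*Lap1D) in solve_banded's (1, 1) banded layout.
      ab = np.zeros((3, N))
      ab[0, 1:] = -r                  # superdiagonal
      ab[1, :] = 1 + 2 * r            # diagonal
      ab[2, :-1] = -r                 # subdiagonal

      def explicit(u, axis):
          # Apply (I + r*Lap1D) along one axis, zero Dirichlet boundaries.
          up = np.zeros_like(u)
          if axis == 0:
              up[1:, :] += u[:-1, :]; up[:-1, :] += u[1:, :]
          else:
              up[:, 1:] += u[:, :-1]; up[:, :-1] += u[:, 1:]
          return (1 - 2 * r) * u + r * up

      def adi_step(u):
          rhs = explicit(u, axis=1)                  # explicit in y ...
          u = np.column_stack([solve_banded((1, 1), ab, rhs[:, j])
                               for j in range(N)])   # ... implicit in x
          rhs = explicit(u, axis=0)                  # explicit in x ...
          return np.vstack([solve_banded((1, 1), ab, rhs[i, :])
                            for i in range(N)])      # ... implicit in y

      x = np.linspace(dx, L - dx, N)
      u = np.outer(np.sin(np.pi * x / L), np.sin(np.pi * x / L))
      for _ in range(100):
          u = adi_step(u)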

  9. Swimming pool immersion accidents: an analysis from the Brisbane drowning study.

    PubMed

    Pearn, J H; Nixon, J

    1977-03-26

    An analysis of a consecutive series of 66 swimming pool immersion accidents is presented; 74% of these occurred in in-ground swimming pools. The estimated accident rate per pool is five times greater for in-ground pools compared with above-ground pools, where pools are inadequately fenced. Backyard swimming pools account for 74% of pool accidents. Motel and caravan park pools account for 9% of childhood immersion accidents, but the survival rate (17%) is very low. Fifty per cent of pool accidents occur in the family's own backyard pool, and 13.6% in a neighbour's pool; in the latter the survival rate is still low at only 33%. In only one of the 66 cases was there an adequate safety fence; in 76% of cases there was no fence or barrier whatsoever. Tables of swimming pool accidents by age, season, site and outcome are presented. PMID:865357

  10. Pedestrian accident analysis with a silicone dummy block.

    PubMed

    Lee, Youngnae; Park, Sungji; Yoon, Seokhyun; Kong, Youngsu; Goh, Jae-Mo

    2012-07-10

    When a car is parked on an inclined plane in a parking lot, it can roll down the slope and cause a pedestrian accident, even when the angle of inclination is small. A car rolling down a gentle slope seems easy to stop by human strength, preventing damage to the car or a possible accident. However, even if the car rolls very slowly, it can cause severe injuries to a pedestrian, especially one who cannot avoid the rolling car. In an accident case that happened in our province, a pedestrian was injured by a rolling car that had been parked on a slope the night before. The accident occurred in the parking lot of an apartment complex, which appeared almost flat to the naked eye. We conducted a rolling test with the accident vehicle at the site: the car was allowed to roll down the slope under gravity alone and collide with a silicone block leaning against the retaining wall. Silicone has characteristics similar to those of a human body, especially with respect to stiffness. In the experiment, we measured the impact force quantitatively. The results showed that a rolling car can severely damage the chest of a pedestrian, even if it moves very slowly. PMID:22455985
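
    An order-of-magnitude estimate shows why even a barely perceptible grade is dangerous. All numbers below are invented round values, not measurements from the paper.

      import math

      m, g = 1500.0, 9.81          # car mass (kg), gravity (m/s^2)
      slope_deg = 2.0              # a grade that looks flat to the naked eye
      run = 5.0                    # distance rolled, m (rolling losses ignored)

      v = math.sqrt(2 * g * run * math.sin(math.radians(slope_deg)))
      ke = 0.5 * m * v**2          # kinetic energy at impact
      crush = 0.10                 # assumed stopping distance into the chest, m
      f_avg = ke / crush           # crude mean force if the chest stops the car
      print(f"v = {v:.2f} m/s, KE = {ke:.0f} J, mean force ~ {f_avg/1e3:.0f} kN")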

  11. ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS

    SciTech Connect

    WILLIAMS, J.C.

    2003-11-15

    This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, ''K Basins Safety Analysis Report'' and Revision 4 of HNF-SD-SNF-TSR-001, ''Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins''. These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, ''Preparation Guide for US. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

  12. Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence

    NASA Technical Reports Server (NTRS)

    Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

    2004-01-01

    Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

  13. Radiochemical Analysis Methodology for Uranium Depletion Measurements

    SciTech Connect

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  14. RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...

  15. Improving Network Reliability: Analysis, Methodology, and Algorithms 

    E-print Network

    Booker, Graham B.

    2010-07-14

    multicast network along with a technique that enables wireless clients to efficiently recover lost data sent by their source through collaborative information exchange. Analysis of a network's reliability during a natural disaster can be assessed...

  16. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
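
    The rank-ordering step described above is a one-line sort on the benefit-to-cost ratio followed by a cumulative ratio. A minimal sketch, with invented concept names and values:

      # Hypothetical (name, benefit, cost) triples in consistent units.
      concepts = [
          ("wing-tip devices", 120.0, 30.0),
          ("advanced avionics", 200.0, 80.0),
          ("composite fuselage", 300.0, 150.0),
      ]

      # Preferred order of implementation: highest benefit-to-cost first.
      ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)

      cum_b = cum_c = 0.0
      for name, benefit, cost in ranked:
          cum_b += benefit
          cum_c += cost
          print(f"{name:20s} B/C={benefit/cost:4.2f}  cumulative B/C={cum_b/cum_c:4.2f}")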

  17. Global-local methodologies and their application to nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1989-01-01

    An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

  18. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H., III; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
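
    In the first (earthquake-based) approach, the displacement hazard curve has the same form as the PSHA rate integral, with the ground-motion attenuation term replaced by a displacement attenuation function. Sketched in standard notation (the paper's own symbols may differ; here α_n is the rate of earthquakes on source n):

      \nu(d) = \sum_{n} \alpha_n \int_{m} \int_{r}
               f_n(m)\, f_n(r \mid m)\, P(D > d \mid m, r)\, \mathrm{d}r\, \mathrm{d}m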

  19. Nutrient Analysis Methodology: A Review of the DINE Developmental Literature.

    ERIC Educational Resources Information Center

    Dennison, Darwin; Dennison, Kathryn F.

    1989-01-01

    This review focuses on use of DINE nutrient analysis methodology, within educational settings, where nutritional behavior was the primary outcome variable. Studies are reported which relate to nutrient analysis variance, validity and reliability studies, and pilot and modification studies for use with special populations and situations. (IAH)

  20. Revisiting Methodological Issues in Transcript Analysis: Negotiated Coding and Reliability

    ERIC Educational Resources Information Center

    Garrison, D. R.; Cleveland-Innes, M.; Koole, Marguerite; Kappelman, James

    2006-01-01

    Transcript analysis is an important methodology to study asynchronous online educational discourse. The purpose of this study is to revisit reliability and validity issues associated with transcript analysis. The goal is to provide researchers with guidance in coding transcripts. For validity reasons, it is suggested that the first step is to…
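
    On the reliability side, a common check on two coders' agreement is a chance-corrected statistic such as Cohen's kappa (one standard choice, not necessarily the one the authors settle on). A minimal computation with invented codes:

      from collections import Counter

      # Codes assigned by two raters to ten transcript segments (invented).
      rater1 = ["T", "S", "T", "C", "S", "T", "C", "T", "S", "T"]
      rater2 = ["T", "S", "C", "C", "S", "T", "C", "T", "T", "T"]

      n = len(rater1)
      p_o = sum(a == b for a, b in zip(rater1, rater2)) / n       # observed
      c1, c2 = Counter(rater1), Counter(rater2)
      p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2  # by chance
      kappa = (p_o - p_e) / (1 - p_e)
      print(f"agreement = {p_o:.2f}, Cohen's kappa = {kappa:.2f}")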

  1. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    SciTech Connect

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and at cask-loading-specific conditions could be performed to demonstrate that release is within the allowable leak rates of the cask.
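
    The regulatory limits that the ANSI N14.5 containment analysis must satisfy are, as commonly summarized from 10 CFR 71.51 (with A_2 the nuclide-specific quantity limit):

      R_N \le A_2 \times 10^{-6} \ \text{per hour (normal conditions of transport)}
      R_A \le A_2 \ \text{per week (hypothetical accident conditions)}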

  2. Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion

    SciTech Connect

    Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

    2012-09-30

    Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

  3. [Toxicological analysis. Methodology, indication, and evaluation].

    PubMed

    Desel, H

    2013-09-01

    Clinical toxicological analysis can significantly contribute toward the confirmation or exclusion of poisoning, especially if clinical signs and symptoms of unknown origin have to be explained. It may be of help when planning specific, but risky, poisoning therapies. Besides the frequently used immunoassays for the detection of drugs of abuse, of a small number of medical drugs, and of amatoxins, chromatographic methods with mass-selective detectors are available in specialized toxicology laboratories. The results of toxicological analyses have to be evaluated and interpreted carefully. Poison control centers can offer support for all medical aspects of poisoning, including lab investigations. PMID:23913112

  4. THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT

    SciTech Connect

    Gupta, N.

    2011-02-14

    Surplus plutonium-bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of the DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for the long term, up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that show the 9975 components remain within their design limits, demonstrating the robustness of the 9975 package.
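
    As a rough illustration of the thermal problem being solved, the sketch below integrates a single-node (lumped-capacitance) heat balance for a package exposed to the 1500 F facility fire for 86 minutes. The actual 9975 analysis resolves the package components in detail; every parameter here is an assumed placeholder, not a 9975 property.

      # Illustrative lumped-capacitance estimate of package heat-up in a
      # facility fire; not the actual 9975 thermal model.  All parameter
      # values below are assumptions.
      T_fire = (1500.0 - 32.0) / 1.8 + 273.15   # 1500 F fire, in kelvin
      T = 300.0       # K, initial package temperature
      h = 10.0        # W/m^2-K, assumed effective heat transfer coefficient
      A = 2.0         # m^2, assumed exposed surface area
      m = 200.0       # kg, assumed thermal mass
      cp = 500.0      # J/kg-K, assumed effective specific heat

      dt = 1.0                  # s, time step
      t_end = 86 * 60.0         # 86-minute facility fire
      for _ in range(int(t_end / dt)):
          # dT/dt = h*A*(T_fire - T) / (m*cp), explicit Euler step
          T += dt * h * A * (T_fire - T) / (m * cp)

      print(f"Lumped package temperature after 86 min: {T - 273.15:.0f} C")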

  5. NMR methodologies in the analysis of blueberries.

    PubMed

    Capitani, Donatella; Sobolev, Anatoly P; Delfini, Maurizio; Vista, Silvia; Antiochia, Riccarda; Proietti, Noemi; Bubici, Salvatore; Ferrante, Gianni; Carradori, Simone; De Salvador, Flavio Roberto; Mannina, Luisa

    2014-06-01

    An NMR analytical protocol based on complementary high- and low-field measurements is proposed for blueberry characterization. Untargeted NMR metabolite profiling of blueberry aqueous and organic extracts as well as targeted NMR analysis focused on anthocyanins and other phenols are reported. Bligh-Dyer and microwave-assisted extractions were carried out and compared, showing a better recovery of the lipidic fraction in the case of the microwave procedure. Water-soluble metabolites belonging to different classes such as sugars, amino acids, organic acids, and phenolic compounds, as well as metabolites soluble in organic solvent such as triglycerides, sterols, and fatty acids, were identified. Five anthocyanins (malvidin-3-glucoside, malvidin-3-galactoside, delphinidin-3-glucoside, delphinidin-3-galactoside, and petunidin-3-glucoside) and 3-O-α-L-rhamnopyranosyl quercetin were identified in the solid-phase extract. The water status of fresh and withered blueberries was monitored by portable NMR and fast-field-cycling NMR. ¹H depth profiles, T2 transverse relaxation times, and dispersion profiles were found to be sensitive to withering. PMID:24668393

  6. Advanced Power Plant Development and Analysis Methodologies

    SciTech Connect

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame, such as mega-scale fuel-cell-based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term, such as advanced gas turbines, high temperature membranes for separating gas species, and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U.S. Department of Energy in identifying the research areas and technologies that warrant further support.

  7. A Lagrangian analysis of the impact of transport and transformation on ozone

    E-print Network

    Menut, Laurent

    The ozone variability observed by tropospheric ozone lidars during the ESCOMPTE campaign is analyzed using a Lagrangian analysis of the impact of transport and transformation on ozone. High ozone concentrations are related to a polluted planetary boundary layer (PBL) footprint in the free troposphere.

  8. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    ERIC Educational Resources Information Center

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  9. Trial application of the worker safety assessment methodology

    SciTech Connect

    Marchese, A.R.; Neogy, P.

    1995-12-31

    A Worker Safety Assessment Methodology has been developed to assess the risks to workers from radiological accidents at non-reactor nuclear facilities. The methodology utilizes Process Hazards Analysis, proposed risk goals, and Quantitative Risk Analysis. The first phase of a trial application of the methodology to a nuclear facility has been completed and is reported here.
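
    Reduced to its quantitative core, such a methodology multiplies scenario frequencies by worker consequences and compares the aggregate against a risk goal. The scenarios, doses, and goal in the sketch below are wholly invented for illustration; the abstract does not give the methodology's actual risk goals or measures.

      # Minimal sketch of the quantitative step: combine scenario
      # frequencies (per year) with worker consequences (rem) and compare
      # the aggregate against a risk goal.  All values are hypothetical.
      scenarios = {
          "spill during transfer": (1e-2, 0.5),   # (frequency/yr, dose rem)
          "glovebox breach":       (1e-3, 5.0),
          "facility fire":         (1e-4, 25.0),
      }
      risk_goal_rem_per_yr = 1.0  # assumed goal, not from the methodology

      total_risk = sum(f * c for f, c in scenarios.values())
      verdict = "meets" if total_risk <= risk_goal_rem_per_yr else "exceeds"
      print(f"Aggregate worker risk: {total_risk:.2e} rem/yr ({verdict} goal)")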

  10. Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

    1996-12-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.
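
    DOE facility accident analyses of this kind conventionally construct each respirable source term as a five-factor product in the style of DOE-HDBK-3010: ST = MAR x DR x ARF x RF x LPF. The sketch below shows the arithmetic with illustrative values only; the factor values actually used in the WM PEIS analyses are documented in the report and its appendices.

      # Sketch of the conventional DOE five-factor source-term formula.
      # Factor values are illustrative, not those used in the WM PEIS.
      MAR = 1.0e3   # material at risk, g
      DR  = 0.1     # damage ratio: fraction of MAR affected by the accident
      ARF = 1.0e-3  # airborne release fraction
      RF  = 0.5     # respirable fraction of the airborne material
      LPF = 0.1     # leak path factor through the facility

      source_term = MAR * DR * ARF * RF * LPF  # respirable grams released
      print(f"Respirable release: {source_term:.3g} g")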

  11. Radioactivity analysis following the Fukushima Dai-ichi nuclear accident.

    PubMed

    Tuo, Fei; Xu, Cuihua; Zhang, Jing; Zhou, Qiang; Li, Wenhong; Zhao, Li; Zhang, Qing; Zhang, Jianfeng; Su, Xu

    2013-08-01

    A total of 118 samples were analyzed using HPGe γ-spectrometry. I-131, Cs-134, Cs-137 and Cs-136 were detected in aerosol air samples that were collected 22 days after the accident, with values of 1720 µBq/m³, 247 µBq/m³, 289 µBq/m³ and 23 µBq/m³, respectively. I-131 was detected in rainwater and soil samples and was also measurable in vegetables collected between April 2 and 13, 2011, with values ranging from 0.55 Bq/kg to 2.68 Bq/kg. No I-131 was detected in milk, drinking water, seawater or marine biota samples. PMID:23685724

  12. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    USGS Publications Warehouse

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  13. [Fatal skiing accidents: a forensic analysis taking the example of Salzburg].

    PubMed

    Kunz, Sebastian N; Keller, Thomas; Grove, Christina; Lochner, Stefanie; Monticelli, Fabio

    2015-01-01

    The rising popularity of Alpine skiing in recent years has led to an increase in skiing accidents, some with fatal outcomes. In this paper, all fatal skiing accidents from the autopsy material of the Institute of Forensic Medicine of the Paris Lodron University Salzburg were evaluated and compared with statistical data of the Alpine Police. From the winter of 2005/2006 through that of 2013/2014, 22 fatal skiing accidents were autopsied. The age of the male and female victims ranged between 12 and 71 years. The main causes of death were craniocerebral and chest trauma. A relevant blood alcohol concentration was detected in only one case. Together with trauma-biomechanical and technical experts, forensic medicine serves as a necessary clarification interface between the investigating authorities and the judiciary. Determining the cause and manner of death as well as reconstructing the accident is the main task of the forensic pathologist. The present study shows that in the county of Salzburg only a small percentage of fatal skiing accidents is evaluated from a forensic and trauma-biomechanical point of view. Thus the possibilities of an interdisciplinary accident analysis are not always fully utilized. PMID:26419087

  14. Risk-based Analysis of Construction Accidents in Iran During 2007-2011: A Meta-Analysis Study

    PubMed Central

    AMIRI, Mehran; ARDESHIR, Abdollah; FAZEL ZARANDI, Mohammad Hossein

    2014-01-01

    Abstract Background The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. Methods The Iranian Social Security Organization (ISSO) accident database containing 21,864 cases between the years 2007-2011 was applied in this study. In the next step, Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of the day, seasonal analysis, regional considerations, type of accident, and body parts affected. Results Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to the risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. Conclusion It is recommended that laborers, young workers, and weekend and night shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be given to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions. PMID:26005662
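
    The abstract does not give the formulas behind TAR and TSI, so the sketch below assumes the simplest plausible reading, a per-group event rate expressed as a percentage of exposed workers, purely to show how such group-wise indices are tabulated; the paper's actual definitions may differ.

      # Hedged sketch of group-wise accident indices in the spirit of the
      # paper's TAR/TSI.  The rate = events / exposed workers formula is an
      # assumption; all records and worker counts are invented.
      from collections import Counter

      # (age_group, severe_flag) for a handful of hypothetical accidents
      records = [("15-19", False), ("15-19", True), ("20-64", False),
                 ("20-64", False), ("65+", True)]
      workers = {"15-19": 15, "20-64": 150, "65+": 10}  # insured per group

      accidents = Counter(age for age, _ in records)
      severe = Counter(age for age, s in records if s)

      for group, n in workers.items():
          tar = 100.0 * accidents[group] / n   # assumed total accident rate, %
          tsi = 100.0 * severe[group] / n      # assumed severity index, %
          print(f"{group:6s} TAR={tar:5.2f}%  TSI={tsi:5.2f}%")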

  15. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

  16. Disposal criticality analysis methodology for fissile waste forms

    SciTech Connect

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons-grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion-resistant barriers to prevent water penetration for up to 10,000 years. The criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository.

  17. MELCOR code analysis of a severe accident LOCA at Peach Bottom Plant

    SciTech Connect

    Carbajo, J.J.

    1993-01-01

    A design-basis loss-of-coolant accident (LOCA) concurrent with complete loss of the emergency core cooling systems (ECCSs) has been analyzed for the Peach Bottom Atomic Power Station Unit 2 using the MELCOR code, version 1.8.1. The purpose of this analysis is to calculate best-estimate times for the important events of this accident sequence and best-estimate source terms. Calculated pressures and temperatures at the beginning of the transient have been compared to results from the Peach Bottom final safety analysis report (FSAR). MELCOR-calculated source terms have been compared to source terms reported in the NUREG-1465 draft.

  18. RELAP5 Application to Accident Analysis of the NIST Research Reactor

    SciTech Connect

    Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.; Diamond, D.

    2012-03-18

    Detailed safety analyses have been performed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent analysis of the primary system is determined with a RELAP5 transient analysis model that includes the reactor vessel, the pump, heat exchanger, fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results has been conducted to evaluate the minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and no damage to the fuel will occur because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.
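
    The CHFR post-processing step reduces to dividing the correlation-predicted critical heat flux by the local heat flux at each channel and time point and taking the minimum. The sketch below uses invented numbers; in the actual analysis the critical heat flux comes from the Sudo-Kaminaga correlation evaluated at RELAP5 local conditions.

      # Post-processing sketch: minimum critical heat flux ratio over
      # channels and time, CHFR = q"_CHF / q"_local.  Values are invented.
      import numpy as np

      q_local = np.array([[1.2e6, 1.5e6, 1.4e6],    # W/m^2, channel x time
                          [1.0e6, 1.3e6, 1.1e6]])
      q_chf   = np.array([[3.0e6, 2.8e6, 2.9e6],    # W/m^2, from correlation
                          [3.1e6, 2.9e6, 3.0e6]])

      chfr = q_chf / q_local
      i, j = np.unravel_index(np.argmin(chfr), chfr.shape)
      print(f"MCHFR = {chfr[i, j]:.2f} (channel {i}, time index {j})")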

  19. Risk Analysis Methodology for Kistler's K-1 Reusable Launch Vehicle

    NASA Astrophysics Data System (ADS)

    Birkeland, Paul W.

    2002-01-01

    Missile risk analysis methodologies were originally developed in the 1940s as the military experimented with intercontinental ballistic missile (ICBM) technology. As the range of these missiles increased, it became apparent that some means of assessing the risk posed to neighboring populations was necessary to gauge the relative safety of a given test. There were many unknowns at the time, and technology was unpredictable at best. Risk analysis itself was in its infancy. Uncertainties in technology and methodology led to an ongoing bias toward conservative assumptions to adequately bound the problem. This methodology ultimately became the Casualty Expectation Analysis that is used to license Expendable Launch Vehicles (ELVs). A different risk analysis approach was adopted by the commercial aviation industry in the 1950s. At the time, commercial aviation technology was more firmly in hand than ICBM technology. Consequently commercial aviation risk analysis focused more closely on the hardware characteristics. Over the years, this approach has enabled the advantages of technological and safety advances in commercial aviation hardware to manifest themselves in greater capabilities and opportunities. The Boeing 777, for example, received approval for trans-oceanic operations "out of the box," where all previous aircraft were required, at the very least, to demonstrate operations over thousands of hours before being granted such approval. This "out of the box" approval is likely to become standard for all subsequent designs. In short, the commercial aircraft approach to risk analysis created a more flexible environment for industry evolution and growth. In contrast, the continued use of the Casualty Expectation Analysis by the launch industry is likely to hinder industry maturation. It likely will cause any safety and reliability gains incorporated into RLV design to be masked by the conservative assumptions made to "bound the problem." Consequently, for the launch industry to mature, a different approach to RLV risk analysis must be adopted. This paper will present such a methodology for Kistler's K-1 reusable launch vehicle. This paper will develop an approach to risk analysis that represents an amalgamation of the two approaches. This methodology provides flexibility to the launch industry that will enable the regulatory environment to more efficiently accommodate new technologies and approaches. It will also present a derivation of an appropriate assessment threshold that is the equivalent of the currently accepted 30-in-a-million casualty expectation.
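
    For context, a casualty expectation of the kind referenced here is conventionally computed as a sum over failure scenarios of failure probability times casualty area times population density. The sketch below shows that arithmetic against the 30-in-a-million threshold mentioned in the text; all scenario values are invented.

      # Sketch of a casualty-expectation calculation of the kind used to
      # license ELVs: Ec = sum over failure modes of P(failure) x
      # casualty area x population density.  All numbers are illustrative.
      scenarios = [
          # (probability of failure mode, casualty area m^2, people per m^2)
          (1.0e-3, 500.0, 1.0e-5),
          (5.0e-4, 2000.0, 5.0e-6),
          (1.0e-4, 10000.0, 2.0e-6),
      ]

      Ec = sum(p * area * rho for p, area, rho in scenarios)
      threshold = 30e-6  # the 30-in-a-million expectation cited in the text
      print(f"Ec = {Ec:.2e}; "
            f"{'acceptable' if Ec <= threshold else 'too high'}")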

  20. A Longitudinal Analysis of the Causal Factors in Major Aviation Accidents in the USA from 1976 to 2006

    E-print Network

    Johnson, Chris

    This paper forms part of a long-term analysis to understand the causes of aviation accidents over time, either as a consequence of changes in the aviation industry, such as the introduction of more...

  1. Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the 'Maximum Credible Accident' concept

    SciTech Connect

    Ricci, E.; McLean, R.B.

    1988-09-01

    The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low levels of radioactivity from uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.
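
    The closing step described above, converting a population exposure into foreseeable premature cancer deaths, is conventionally a product of collective dose and a nominal risk coefficient. The sketch below assumes an invented collective dose and an ICRP-style coefficient of roughly 0.05 fatal cancers per person-sievert; neither number is taken from the report.

      # Sketch of the final consequence step: premature cancer deaths
      # estimated from collective dose with a nominal risk coefficient.
      collective_dose_person_sv = 12.0   # assumed population dose, person-Sv
      risk_per_person_sv = 0.05          # nominal fatal-cancer coefficient

      deaths = collective_dose_person_sv * risk_per_person_sv
      print(f"Foreseeable premature cancer deaths: {deaths:.2f}")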

  2. How Root Cause Analysis Can Improve the Value Methodology

    SciTech Connect

    Wixson, J. R.

    2002-02-05

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether that is high cost/poor value, poor quality, or poor reliability. Once the most probable causes of these problems have been identified, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

  3. How Root Cause Analysis Can Improve the Value Methodology

    SciTech Connect

    Wixson, James Robert

    2002-05-01

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether that is high cost/poor value, poor quality, or poor reliability. Once the most probable causes of these problems have been identified, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

  4. LOCA and Air Ingress Accident Analysis of a Pebble Bed Reactor

    E-print Network

    LOCA and Air Ingress Accident Analysis of a Pebble Bed Reactor, by Tieliang Zhai (graduate thesis; Prof. Jeffrey A. Coderre, Chairman, Department Committee on Graduate Students). The work includes a sensitivity study to determine how much air would have to be circulated in the reactor cavity to bring...

  5. Analysis of dental materials as an aid to identification in aircraft accidents

    SciTech Connect

    Wilson, G.S.; Cruickshanks-Boyd, D.W.

    1982-04-01

    The failure to achieve positive identification of aircrew following an aircraft accident need not prevent a full autopsy and toxicological examination to ascertain possible medical factors involved in the accident. Energy-dispersive electron microprobe analysis provides morphological, qualitative, and accurate quantitative analysis of the composition of dental amalgam. Wet chemical analysis can be used to determine the elemental composition of crowns, bridges and partial dentures. Unfilled resin can be analyzed by infrared spectroscopy. Detailed analysis of filled composite restorative resins has not yet been achieved in the as-set condition to permit discrimination between manufacturers' products. Future work will involve filler studies and pyrolysis of the composite resins by thermogravimetric analysis to determine percentage weight loss when the sample examined is subjected to a controlled heating regime. With these available techniques, corroborative evidence achieved from the scientific study of materials can augment standard forensic dental results to obtain a positive identification.

  6. Analysis of Reactivity Induced Accident for Control Rods Ejection with Loss of Cooling

    E-print Network

    Saad, Hend Mohammed El Sayed; Wahab, Moustafa Aziz Abd El

    2013-01-01

    Understanding the time-dependent behavior of the neutron population in a nuclear reactor, in response to either a planned or unplanned change in the reactor conditions, is of great importance to the safe and reliable operation of the reactor. In the present work, the point kinetics equations are solved numerically using the stiffness confinement method (SCM). The solution is applied to the kinetics equations in the presence of different types of reactivities and is compared with different analytical solutions. This method is also used to analyze reactivity-induced accidents in two reactors. The first reactor is fueled by uranium and the second is fueled by plutonium. This analysis presents the effect of negative temperature feedback in compensating the positive reactivity added by ejected control rods, so that a control rod ejection accident does not damage the reactor. Both the power and temperature pulses following the reactivity-initiated accidents are calculated. The results are compared with previous works and...

  7. Analysis of Reactivity Induced Accident for Control Rods Ejection with Loss of Cooling

    E-print Network

    Hend Mohammed El Sayed Saad; Hesham Mohammed Mohammed Mansour; Moustafa Aziz Abd El Wahab

    2013-06-05

    Understanding the time-dependent behavior of the neutron population in a nuclear reactor, in response to either a planned or unplanned change in the reactor conditions, is of great importance to the safe and reliable operation of the reactor. In the present work, the point kinetics equations are solved numerically using the stiffness confinement method (SCM). The solution is applied to the kinetics equations in the presence of different types of reactivities and is compared with different analytical solutions. This method is also used to analyze reactivity-induced accidents in two reactors. The first reactor is fueled by uranium and the second is fueled by plutonium. This analysis presents the effect of negative temperature feedback in compensating the positive reactivity added by ejected control rods, so that a control rod ejection accident does not damage the reactor. Both the power and temperature pulses following the reactivity-initiated accidents are calculated. The results are compared with previous works and satisfactory agreement is found.
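
    The point kinetics equations referred to in both records couple the neutron density n(t) to delayed-neutron precursor concentrations c_i(t): dn/dt = ((rho - beta)/Lambda) n + sum_i lambda_i c_i, and dc_i/dt = (beta_i/Lambda) n - lambda_i c_i. The system is stiff, which is why specialized schemes such as SCM exist; the sketch below instead uses an off-the-shelf stiff BDF integrator as a stand-in, with typical U-235 six-group parameters and an arbitrary step insertion.

      # Point-kinetics sketch with six delayed-neutron groups, solved with
      # a stiff BDF integrator standing in for the stiffness confinement
      # method of the paper.  Kinetics parameters are typical U-235 values;
      # the generation time and step reactivity are assumptions.
      import numpy as np
      from scipy.integrate import solve_ivp

      beta_i = np.array([0.000215, 0.001424, 0.001274,
                         0.002568, 0.000748, 0.000273])
      lam = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])  # 1/s
      beta = beta_i.sum()
      Lambda = 1.0e-4          # s, assumed prompt neutron generation time
      rho = 0.5 * beta         # arbitrary step insertion of +0.5 beta

      def rhs(t, y):
          n, c = y[0], y[1:]
          dn = (rho - beta) / Lambda * n + np.dot(lam, c)
          dc = beta_i / Lambda * n - lam * c
          return np.concatenate(([dn], dc))

      # Start from equilibrium: c_i = beta_i * n / (Lambda * lambda_i)
      n0 = 1.0
      y0 = np.concatenate(([n0], beta_i * n0 / (Lambda * lam)))
      sol = solve_ivp(rhs, (0.0, 10.0), y0, method="BDF",
                      rtol=1e-8, atol=1e-10)
      print(f"Relative power at t=10 s: {sol.y[0, -1]:.2f}")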

  8. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    SciTech Connect

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  9. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    NASA Technical Reports Server (NTRS)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture behavior that would normally require solid elements to resolve in detail. The shell thicknesses and offsets in this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until this model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.

  10. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 1: Sections 1-9

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. The methodology is in compliance with the most recent guidance from DOE. It considers the spectrum of accident sequences that could occur in activities covered by the WM PEIS and uses a graded approach emphasizing the risk-dominant scenarios to facilitate discrimination among the various WM PEIS alternatives. Although it allows reasonable estimates of the risk impacts associated with each alternative, the main goal of the accident analysis methodology is to allow reliable estimates of the relative risks among the alternatives. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  11. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

  12. Utilization of accident databases and fuzzy sets to estimate frequency of HazMat transport accidents.

    PubMed

    Qiao, Yuanhua; Keren, Nir; Mannan, M Sam

    2009-08-15

    Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system. PMID:19250750
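
    A compact way to see the two-stage structure described above: fit a count regression for the route-dependent base frequency, then scale it by a route-independent fuzzy factor. The sketch below fits a negative binomial GLM to invented route data and applies a single triangular membership function; none of the data, variables, or membership parameters come from the paper.

      # Two-stage sketch: negative binomial base rate x fuzzy adjustment.
      # All data and the membership function are invented for illustration.
      import numpy as np
      import statsmodels.api as sm

      # Hypothetical routes: accident counts vs. length (km) and AADT (k)
      X = sm.add_constant(np.array([[10, 5], [25, 8],
                                    [40, 12], [60, 20]], float))
      y = np.array([0, 1, 2, 5])

      nb = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
      base_rate = nb.predict(np.array([[1.0, 30.0, 10.0]]))[0]  # new route

      def fuzzy_weather_factor(rain_days_frac):
          """Triangular membership blending dry (x1.0) and wet (x1.5)."""
          mu_wet = min(max((rain_days_frac - 0.1) / 0.4, 0.0), 1.0)
          return (1 - mu_wet) * 1.0 + mu_wet * 1.5

      freq = base_rate * fuzzy_weather_factor(0.3)
      print(f"Estimated accident frequency: {freq:.3f} per year")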

  13. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed with uncertainties included and managed are more robust and less prone to poor operation resulting from parameter variability. The quantification, analysis, and mitigation of uncertainties are challenging tasks, as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data are based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered for a multidisciplinary systems-level risk analysis. This research synthesizes an integrated approach to risk analysis. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with a Latin hypercube sampling - Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules include the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties, or response optimization. This final module also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary environment consists of the interface between configuration and sizing analysis outputs and aerodynamic parameter computations. Uncertainties are analyzed for both simulation tools and their associated input parameters. Uncertainties are then propagated across the design environment, and a robust design optimization is performed over the range of a critical input parameter. The results of this research indicate that including uncertainties in design processes may require modification of design constraints previously considered acceptable in deterministic analyses.
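
    The propagation step described here, Latin hypercube samples pushed through the design chain, can be sketched compactly. Below, a toy two-input response stands in for the configuration/sizing and aerodynamics tools; the input names, ranges, and model are invented for illustration.

      # Latin hypercube sampling propagated through a toy model, in the
      # spirit of the LHS-Monte Carlo step described above.
      import numpy as np
      from scipy.stats import qmc

      sampler = qmc.LatinHypercube(d=2, seed=0)
      u = sampler.random(n=1000)               # uniform samples in [0,1)^2
      # Scale to assumed ranges: drag coefficient and dry mass margin
      x = qmc.scale(u, l_bounds=[0.25, 0.00], u_bounds=[0.35, 0.10])

      def payload_model(cd, mass_margin):
          # Toy response standing in for the sizing/aero analysis chain
          return 1000.0 * (1.0 - 2.0 * (cd - 0.30)) * (1.0 - mass_margin)

      y = payload_model(x[:, 0], x[:, 1])
      print(f"Payload: mean {y.mean():.1f} kg, 5th-95th pct "
            f"[{np.percentile(y, 5):.1f}, {np.percentile(y, 95):.1f}] kg")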

  14. Towards a Methodology for Identifying Program Constraints During Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

    1997-01-01

    Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. The problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem, called context monitoring, involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

  15. Development of test methodology for dynamic mechanical analysis instrumentation

    NASA Technical Reports Server (NTRS)

    Allen, V. R.

    1982-01-01

    Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures and in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplifies the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiberglass-supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic. Future work will involve filler studies and pyrolysis of the composite resins by thermogravimetric analysis to determine percentage weight loss when the sample examined is subjected to a controlled heating regime.

  16. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines

    PubMed Central

    Baka, Aikaterini D.; Uzunoglu, Nikolaos K.

    2014-01-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

  17. SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident

    NASA Astrophysics Data System (ADS)

    Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

    2014-06-01

    On March 11th 2011 a high-magnitude earthquake and consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in duration and extent. After scram was initiated at all power stations affected by the earthquake, diesel generators began operation as designed until the tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to station blackout conditions at units 1 through 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules that account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools, and we demonstrate how this work can benefit nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study of the alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.

  18. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
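
    The sensitivity measure named above can be sketched briefly: rank-transform the inputs and output, regress the other inputs out of both the input of interest and the output, and correlate the residuals, giving the partial rank correlation coefficient (PRCC). The data and toy response below are invented; the MACCS study involves 87 input variables and multiple consequence measures.

      # Partial rank correlation coefficient (PRCC) sketch on invented data.
      import numpy as np
      from scipy.stats import rankdata, pearsonr

      rng = np.random.default_rng(0)
      X = rng.uniform(size=(500, 3))                # three imprecise inputs
      y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 500)  # response

      def prcc(X, y, j):
          """PRCC of input j with y, controlling for the other inputs."""
          R = np.column_stack([rankdata(c) for c in X.T])
          ry = rankdata(y)
          others = np.delete(R, j, axis=1)
          A = np.column_stack([np.ones(len(ry)), others])
          # Residualize both the j-th ranked input and the ranked output
          res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
          res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
          return pearsonr(res_x, res_y)[0]

      for j in range(3):
          print(f"PRCC(x{j}) = {prcc(X, y, j):+.2f}")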

  19. 76 FR 30139 - Federal Need Analysis Methodology for the 2012-2013 Award Year

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-24

    ... Federal Need Analysis Methodology for the 2012-2013 Award Year AGENCY: Federal Student Aid, Department of Education. ACTION: Notice of revision of the Federal Need Analysis Methodology for the 2012-2013 award year. Overview Information: Federal Need Analysis Methodology for the 2012-2013 award year; Federal Pell...

  20. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  1. Defect analysis methodology for contact hole grapho epitaxy DSA

    NASA Astrophysics Data System (ADS)

    Harukawa, Ryota; Aoki, Masami; Cross, Andrew; Nagaswami, Venkat; Kawakami, Shinichiro; Yamauchi, Takashi; Tomita, Tadatoshi; Nagahara, Seiji; Muramatsu, Makoto; Kitano, Takahiro

    2014-04-01

    Next-generation lithography technology is required to meet the needs of advanced design nodes. Directed Self-Assembly (DSA) is gaining momentum as an alternative or complementary technology to EUV lithography. We investigate defectivity for 2x-nm node contact patterning, with contact holes of 25 nm or less assembled by grapho-epitaxy DSA from guide patterns printed using immersion ArF negative tone development. This paper discusses the development of an analysis methodology for DSA with optical wafer inspection, based on defect source identification, sampling, and filtering methods, supporting the process development efficiency of DSA processes and tools.

  2. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  3. Human and organisational factors in maritime accidents: analysis of collisions at sea using the HFACS.

    PubMed

    Chauvin, Christine; Lardjane, Salim; Morel, Gaël; Clostermann, Jean-Pierre; Langard, Benoît

    2013-10-01

    Over the last decade, the shipping industry has implemented a number of measures aimed at improving its safety level (such as new regulations or new forms of team training). Despite this evolution, shipping accidents, and particularly collisions, remain a major concern. This paper presents a modified version of the Human Factors Analysis and Classification System, which has been adapted to the maritime context and used to analyse human and organisational factors in collisions reported by the Marine Accident and Investigation Branch (UK) and the Transportation Safety Board (Canada). The analysis shows that most collisions are due to decision errors. At the precondition level, it highlights the importance of the following factors: poor visibility and misuse of instruments (environmental factors), loss of situation awareness or deficit of attention (conditions of operators), deficits in inter-ship communications or Bridge Resource Management (personnel factors). At the leadership level, the analysis reveals the frequent planning of inappropriate operations and non-compliance with the Safety Management System (SMS). The Multiple Accident Analysis provides an important finding concerning three classes of accidents. Inter-ship communications problems and Bridge Resource Management deficiencies are closely linked to collisions occurring in restricted waters and involving pilot-carrying vessels. Another class of collisions is associated with situations of poor visibility, in open sea, and shows deficiencies at every level of the socio-technical system (technical environment, condition of operators, leadership level, and organisational level). The third class is characterised by non-compliance with the SMS. This study shows the importance of Bridge Resource Management for situations of navigation with a pilot on board in restricted waters. It also points out the necessity to investigate, for situations of navigation in open sea, the masters' decisions in critical conditions as well as the causes of non-compliance with SMS. PMID:23764875

  4. Segment clustering methodology for unsupervised Holter recordings analysis

    NASA Astrophysics Data System (ADS)

    Rodríguez-Sotelo, Jose Luis; Peluffo-Ordoñez, Diego; Castellanos Dominguez, German

    2015-01-01

    Cardiac arrhythmia analysis of Holter recordings is an important issue in clinical settings; however, it implicitly involves other problems related to the large amount of unlabelled data, which entails a high computational cost. In this work an unsupervised methodology based on a segment framework is presented, which consists of dividing the raw data into a balanced number of segments in order to identify fiducial points, then characterizing and clustering the heartbeats in each segment separately. The resulting clusters are merged or split according to an assumed criterion of homogeneity. This framework reduces the high computational cost of Holter analysis, making implementation in future real-time applications possible. The performance of the method is measured on the records of the MIT/BIH arrhythmia database, taking advantage of the database labels, and achieves high sensitivity and specificity for a broad range of heartbeat types recommended by the AAMI.
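
    The segment-wise scheme can be sketched in a few lines: split the beat stream into balanced segments, cluster each independently, then merge clusters whose centroids are close. The features, cluster count, and merge threshold below are invented stand-ins for the paper's heartbeat descriptors and homogeneity criterion.

      # Segment-wise clustering sketch with invented beat features.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      beats = rng.normal(size=(6000, 4))    # 6000 beats x 4 toy features
      segments = np.array_split(beats, 6)   # balanced segments

      centroids = []
      for seg in segments:
          km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(seg)
          centroids.append(km.cluster_centers_)
      centroids = np.vstack(centroids)

      # Merge step: greedily unify centroids closer than a chosen threshold
      merged = []
      for c in centroids:
          if all(np.linalg.norm(c - m) > 0.5 for m in merged):
              merged.append(c)
      print(f"{len(centroids)} per-segment clusters -> "
            f"{len(merged)} after merging")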

  5. Framatome-ANP France UO2 fuel fabrication - criticality safety analysis in the light of the 1999 Tokai Mura accident

    SciTech Connect

    Doucet, M.; Zheng, S.; Mouton, J.; Porte, R.

    2004-07-01

    In France, the 1999 Tokai Mura criticality accident in Japan had a major impact on the nuclear fuel manufacturing community, and it also prompted a broad public discussion about all nuclear facilities. The French Safety Authorities required the industry to completely revisit its safety analysis files, mainly those concerning nuclear fuel treatment. Framatome-ANP's French low-enriched (5 w/o) UO2 fuel fabrication plant (FBFC/Romans) produces more than 1000 metric tons a year. Special attention was given to the emergency evacuation plan to be followed in case of a criticality accident: if such an accident happens, on-site and off-site radioprotection requirements call for an evacuation plan whose routes keep the absorbed doses to people as low as possible. The French Safety Authorities also required an update of the old neutron source term to account for state-of-the-art methodology. UO2 blender units contain a large amount of dry powder strictly controlled by moderation; a hypothetical water leak inside one of these apparatuses is simulated by increasing the water content of the powder, and the resulting reactivity insertion is evaluated by several static calculations. The French IRSN/CEA CRISTAL codes are used to perform these static calculations. The kinetic criticality code POWDER simulates the power excursion versus time and determines the consequent total energy source term. MCNP4B performs the source term propagation (including neutrons and gamma) used to determine the isodose curves needed to define the emergency evacuation plan. This paper deals with the approach Framatome-ANP has taken to address the Safety Authorities' demands using the most up-to-date calculation tools and methodology. (authors)

  6. Methodology assessment and recommendations for the Mars science laboratory launch safety analysis.

    SciTech Connect

    Sturgis, Beverly Rainwater; Metzinger, Kurt Evan; Powers, Dana Auburn; Atcitty, Christopher B.; Robinson, David B; Hewson, John C.; Bixler, Nathan E.; Dodson, Brian W.; Potter, Donald L.; Kelly, John E.; MacLean, Heather J.; Bergeron, Kenneth Donald; Bessette, Gregory Carl; Lipinski, Ronald J.

    2006-09-01

    The Department of Energy has assigned to Sandia National Laboratories the responsibility of producing a Safety Analysis Report (SAR) for the plutonium-dioxide fueled Multi-Mission Radioisotope Thermoelectric Generator (MMRTG) proposed to be used in the Mars Science Laboratory (MSL) mission. The National Aeronautics and Space Administration (NASA) is anticipating a launch in fall of 2009, and the SAR will play a critical role in the launch approval process. As in past safety evaluations of MMRTG missions, a wide range of potential accident conditions differing widely in probability and severity must be considered, and the resulting risk to the public will be presented in the form of probability distribution functions of health effects in terms of latent cancer fatalities. The basic descriptions of accident cases will be provided by NASA in the MSL SAR Databook for the mission, and on the basis of these descriptions, Sandia will apply a variety of sophisticated computational simulation tools to evaluate the potential release of plutonium dioxide, its transport to human populations, and the consequent health effects. The first step in carrying out this project is to evaluate the existing computational analysis tools (computer codes) for suitability to the analysis and, when appropriate, to identify areas where modifications or improvements are warranted. The overall calculation of health risks can be divided into three levels of analysis. Level A involves detailed simulations of the interactions of the MMRTG or its components with the broad range of insults (e.g., shrapnel, blast waves, fires) posed by the various accident environments. There are a number of candidate codes for this level; they are typically high-resolution computational simulation tools that capture details of each type of interaction and that can predict damage and plutonium dioxide release for a range of choices of controlling parameters. Level B utilizes these detailed results to study many thousands of possible event sequences and to build up a statistical representation of the releases for each accident case. A code to carry out this process will have to be developed or adapted from previous MMRTG missions. Finally, Level C translates the release (or "source term") information from Level B into public risk by applying models for atmospheric transport and the health consequences of exposure to the released plutonium dioxide. A number of candidate codes for this level of analysis are available. This report surveys the range of available codes and tools for each of these levels and makes recommendations for which choices are best for the MSL mission. It also identifies areas where improvements to the codes are needed. In some cases a second tier of codes may be identified to provide supporting or clarifying insight about particular issues. The main focus of the methodology assessment is to identify a suite of computational tools that can produce a high-quality SAR that can be successfully reviewed by external bodies (such as the Interagency Nuclear Safety Review Panel) on the schedule established by NASA and DOE.
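
    The Level B step lends itself to a compact illustration: sample many event sequences, apply a damage/release model derived from the Level A results, and accumulate a release distribution. Everything in this sketch, the environment distributions, the response surface, and the inventory, is an invented placeholder rather than data from the MSL Databook.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000  # number of sampled event sequences

        # Assumed accident-environment parameters per sequence (placeholders).
        blast_overpressure = rng.lognormal(mean=1.0, sigma=0.8, size=N)   # MPa
        fragment_hit = rng.random(N) < 0.15                               # shrapnel strike?

        # Hypothetical Level-A-derived response surface: release fraction grows
        # with overpressure and jumps when a fragment penetrates the module.
        release_fraction = np.clip(1e-4 * blast_overpressure**2, 0.0, 1.0)
        release_fraction = np.where(fragment_hit,
                                    np.minimum(release_fraction * 50, 1.0),
                                    release_fraction)

        inventory_g = 4800.0  # assumed PuO2 inventory in grams (placeholder)
        release_g = release_fraction * inventory_g

        # Statistical representation handed to Level C: release quantiles.
        for q in (0.5, 0.95, 0.99):
            print(f"{q:>4.0%} quantile release: {np.quantile(release_g, q):8.2f} g")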

  7. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.
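
    The sampling-and-correlation machinery described above can be sketched in a few lines. Here a toy consequence function stands in for MACCS, and Spearman rank correlations stand in for the partial correlation and stepwise regression steps; the input names and ranges are invented.

        import numpy as np
        from scipy.stats import qmc, spearmanr

        rng = np.random.default_rng(0)
        names = ["horiz_dispersion", "dry_dep_velocity", "shielding_factor"]
        sampler = qmc.LatinHypercube(d=len(names), seed=1)
        u = sampler.random(n=200)                  # 200 LHS samples on [0,1)^d

        # Scale each column to an assumed input range.
        lo = np.array([0.5, 0.001, 0.2])
        hi = np.array([2.0, 0.01, 1.0])
        x = qmc.scale(u, lo, hi)

        # Toy consequence model: a stand-in for an early-fatality calculation.
        y = x[:, 1] * x[:, 2] / x[:, 0] + 1e-3 * rng.normal(size=len(x))

        # Rank (Spearman) correlation of each input with the consequence.
        for j, name in enumerate(names):
            rho, _ = spearmanr(x[:, j], y)
            print(f"{name:18s} rho = {rho:+.2f}")

    Inputs with the largest absolute rank correlations are the "dominant contributors to uncertainty" in the sense used above.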

  8. Aspects of uncertainty analysis in accident consequence modeling

    SciTech Connect

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine the probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact on humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and the comparison of model predictions to observational data.

  9. Analysis of the FeCrAl Accident Tolerant Fuel Concept Benefits during BWR Station Blackout Accidents

    SciTech Connect

    Robb, Kevin R

    2015-01-01

    Iron-chromium-aluminum (FeCrAl) alloys are being considered for fuel concepts with enhanced accident tolerance. FeCrAl alloys have very slow oxidation kinetics and good strength at high temperatures. FeCrAl could be used for fuel cladding in light water reactors and/or as channel box material in boiling water reactors (BWRs). To estimate the potential safety gains afforded by the FeCrAl concept, the MELCOR code was used to analyze a range of postulated station blackout severe accident scenarios in a BWR/4 reactor employing FeCrAl. The simulations utilize the most recently known thermophysical properties and oxidation kinetics for FeCrAl. Overall, when compared to the traditional Zircaloy-based cladding and channel box, the FeCrAl concept provides a few extra hours of time for operators to take mitigating actions and/or for evacuations to take place. A coolable core geometry is retained longer, enhancing the ability to stabilize an accident. Finally, due to the slower oxidation kinetics, substantially less hydrogen is generated, and the generation is delayed in time. This decreases the amount of non-condensable gases in containment and the potential for deflagrations to inhibit the accident response.

  10. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  11. A first approach to the safety analysis of a tokamak test reactor by a system study methodology

    SciTech Connect

    Boschi, A.; Palma, T.; Sarto, S.; Cambi, G.; Zappellini, G.; Djerassi, H.; Rouillard, J.

    1989-03-01

    The safety analysis and risk assessment of a Tokamak Test Reactor is approached by an iterative, probabilistic, system study methodology, jointly developed by ENEA and CEA. The first part of this methodology consists of a safety-related functional analysis of the plant. This is developed in a systematic and exhaustive way, aiming at the identification of all the process functions and their modes of loss, so as to forecast all the possible initiating events of safety-relevant accident sequences and their subsequent evolution. This aim is achieved by making use of functional interaction and interface matrices, functional fault trees, and event trees. The second part concerns the overall plant risk assessment. This is performed using PRA (Probabilistic Risk Assessment) concepts and methods to work out the probabilistic quantification of the system event trees (and linked fault trees) and the evaluation of the related consequences, as sketched below. The methodology is applied iteratively, following the different stages of the plant design development. The first iteration has been applied to the safety analysis of the Vacuum, Tritium and Fuel Handling, Blanket and First Wall, and Divertor systems of a Tokamak Test Reactor, with particular reference to NET.
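
    The quantification step can be illustrated with a toy event tree: multiply the initiating-event frequency by the success or failure probability of each safety function along a branch. The functions and numbers below are invented placeholders, not values from the ENEA/CEA analysis.

        from itertools import product

        INIT_FREQ = 1.0e-2            # assumed initiating-event frequency (per year)
        branches = {                  # assumed failure probabilities per safety function
            "vacuum_isolation": 1e-3,
            "tritium_cleanup":  5e-3,
            "confinement":      1e-4,
        }

        # Enumerate all success/failure combinations and their sequence frequencies.
        for outcome in product([False, True], repeat=len(branches)):
            freq = INIT_FREQ
            for (name, p_fail), failed in zip(branches.items(), outcome):
                freq *= p_fail if failed else (1.0 - p_fail)
            label = ",".join(n for (n, _), f in zip(branches.items(), outcome) if f)
            print(f"{freq:10.3e} /yr  failed: {label or 'none (all systems succeed)'}")

    Each printed line corresponds to one accident sequence; in a real PRA the branch probabilities would themselves come from linked fault trees.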

  12. Development of an engineering methodology for thermal analysis of protected structural members in fire 

    E-print Network

    Liang, Hong; Welch, Stephen

    In order to overcome the limitations of existing methodologies for thermal analysis of protected structural members in fire, a novel CFD-based methodology has been developed. This is a generalised quasi- 3D approach with ...

  13. Review of accident analysis calculations, 232-Z seismic scenario

    SciTech Connect

    Ballinger, M.Y.

    1993-05-01

    The 232-Z Building houses what was previously the incinerator facility, which is no longer in service. It is constructed of concrete blocks and is approximately 37 ft wide by 57 ft long. The building has a single story over the process areas and two stories over the service areas at the north end of the building. The respective roofs are 15 ft and 19 ft above grade and consist of concrete over a metal decking, with insulation and a built-up asphalt gravel covering. This facility is assumed to collapse in the seismic event evaluated in the safety analyses, resulting in the release of a portion of the residual plutonium inventory remaining in the building. The seismic scenario for 232-Z assumes that the concrete block walls collapse, allowing the roof to fall and crush the contaminated duct and gloveboxes within. This paper is a review of the scenario and methods used to calculate the source term from the seismic event as presented in the Plutonium Finishing Plant Final Safety Analysis Report (WHC 1991), also referred to as the PFP FSAR. Alternate methods of estimating the source term are presented. The calculation of source terms based on the mechanisms of release expected in a worst-case scenario is recommended.
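
    For context, source terms of this kind are commonly estimated in DOE safety analysis with the five-factor formula (source term = MAR x DR x ARF x RF x LPF, as later codified in DOE-HDBK-3010). The sketch below applies that general formula with invented numbers; it does not reproduce the PFP FSAR values or the alternate methods of the paper.

        def source_term_g(mar_g: float, dr: float, arf: float, rf: float, lpf: float) -> float:
            """MAR: material at risk; DR: damage ratio; ARF: airborne release
            fraction; RF: respirable fraction; LPF: leak path factor."""
            return mar_g * dr * arf * rf * lpf

        # Hypothetical seismic-collapse release from crushed gloveboxes and ductwork.
        st = source_term_g(mar_g=1000.0, dr=0.25, arf=1e-3, rf=0.1, lpf=1.0)
        print(f"{st:.4f} g respirable release")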

  14. A New Methodology of Spatial Cross-Correlation Analysis

    PubMed Central

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. Spatial autocorrelation theory is well developed; it is necessary to advance the method of spatial cross-correlation analysis to supplement it. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index, newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
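
    One plausible reading of the global coefficient is a quadratic form z_x^T W z_y over standardized variables with a normalized spatial weight matrix. The sketch below implements that reading with a toy adjacency matrix and invented data; it is not necessarily the paper's exact definition.

        import numpy as np

        def global_cross_corr(x: np.ndarray, y: np.ndarray, w: np.ndarray) -> float:
            zx = (x - x.mean()) / x.std()     # standardize both variables
            zy = (y - y.mean()) / y.std()
            w = w / w.sum()                   # normalize weights to sum to one
            return float(zx @ w @ zy)         # quadratic-form cross-correlation

        # Toy example: four regions on a line, rook-adjacency weights.
        w = np.array([[0, 1, 0, 0],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)
        urbanization = np.array([0.3, 0.5, 0.6, 0.8])   # invented regional data
        gdp_per_cap = np.array([1.0, 2.1, 2.9, 4.2])
        print(global_cross_corr(urbanization, gdp_per_cap, w))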

  15. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  16. Thermodynamic analysis of cesium and iodine behavior in severe light water reactor accidents

    NASA Astrophysics Data System (ADS)

    Minato, Kazuo

    1991-11-01

    In order to understand the release and transport behavior of cesium (Cs) and iodine (I) in severe light water reactor accidents, the chemical forms of Cs and I in steam-hydrogen mixtures were analyzed thermodynamically. In the calculations, reactions of boron (B) with Cs were taken into consideration. The analysis showed that B plays an important role in determining the chemical forms of Cs. The main Cs-containing species are CsBO2(g) and CsBO2(l), depending on temperature. The contribution of CsOH(g) is minor. The main I-containing species are HI(g) and CsI(g) over the wide ranges of the parameters considered. Calculations were also carried out under the conditions of the Three Mile Island Unit 2 accident.

  17. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  18. Preliminary analysis of graphite dust releasing behavior in accident for HTR

    SciTech Connect

    Peng, W.; Yang, X. Y.; Yu, S. Y.; Wang, J.

    2012-07-01

    The behavior of graphite dust is important to the safety of High Temperature Gas-cooled Reactors. This study investigated the flow of graphite dust in the helium main stream. Analysis of the stresses acting on the graphite dust indicated that gas drag plays the dominant role. Based on the importance of gas drag, an experimental system was set up to study dust release behavior in accidents. Air driven by a centrifugal fan is used as the working fluid instead of helium, because helium is expensive and prone to leakage, which makes sealing difficult. Graphite particles with the same size distribution as in an HTR are added to the experiment loop. The graphite dust release behavior during a loss-of-coolant accident will be investigated using a sonic nozzle. (authors)

  19. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.; Grupa, J.B.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  20. Analysis of pedestrian accident costs in Sudan using the willingness-to-pay method.

    PubMed

    Mofadal, Adam I A; Kanitpong, Kunnawee; Jiwattanakulpaisarn, Piyapong

    2015-05-01

    The willingness-to-pay (WTP) with contingent valuation (CV) method has been proven to be a valid tool for the valuation of non-market goods or the socio-economic costs of road traffic accidents among communities in developed and developing countries. Research on accident costing tends to estimate the value of statistical life (VOSL) for all road users, providing a basis for the evaluation of road safety interventions in cost-benefit analysis. As in many other developing countries, the economic loss from traffic accidents in Sudan is considerable; however, analytical research to estimate the magnitude and impact of that loss is lacking. Reports have shown that pedestrians account for more than 40% of the total number of fatalities. In this study, the WTP-CV approach was used to determine the amount of money that pedestrians in Sudan are willing to pay to reduce the risk of their own death. The impact of the socioeconomic factors, risk levels, and walking behaviors of pedestrians on their WTP for fatality risk reduction was also evaluated. Data were collected from two cities, Khartoum and Nyala, using a survey questionnaire that included 1400 respondents. The WTP-CV payment card questionnaire was designed to ensure that Sudanese pedestrians can easily determine the amount of money that would be required to reduce the fatality risk from a pedestrian-related accident. The analysis results show that the estimated VOSL for Sudanese pedestrians ranges from US$0.019 to US$0.101 million. In addition, the willingness-to-pay by Sudanese pedestrians to reduce their fatality risk tends to increase with age, household income, educational level, safety perception, and average time spent on social activities with family and community. PMID:25794921

  1. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11--14)

    SciTech Connect

    Whitehead, D.; Darby, J.; Yakle, J.

    1994-06-01

    This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

  2. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    NASA Astrophysics Data System (ADS)

    Reyes, S.; Latkowski, J. F.; Gomez del Rio, J.; Sanz, J.

    2001-05-01

    Previous studies of the safety and environmental aspects of the HYLIFE-II inertial fusion energy power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work, computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) have been applied to simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here we consider a severe loss of coolant accident (LOCA) in conjunction with simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the confinement) and of the two barriers surrounding the chamber (inner shielding and the confinement building itself). Even though confinement failure would be a very unlikely event, it would be needed in order to produce significant off-site doses. The CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics, and fusion product transport and release. The results of these calculations show that the estimated off-site dose is less than 5 mSv (0.5 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.

  3. Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident

    SciTech Connect

    Aldrich, D.C.; Blond, R.M.

    1980-01-01

    An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.

  4. ADHD and relative risk of accidents in road traffic: a meta-analysis.

    PubMed

    Vaa, Truls

    2014-01-01

    The present meta-analysis is based on 16 studies comprising 32 results. These studies provide sufficient data to estimate the relative accident risks of drivers with ADHD. The overall estimate of relative risk for drivers with ADHD is 1.36 (95% CI: 1.18; 1.57) without control for exposure, 1.29 (1.12; 1.49) when correcting for publication bias, and 1.23 (1.04; 1.46) when controlling for exposure. A relative risk (RR) of 1.23 is exactly the same as found for drivers with cardiovascular diseases. The long-lasting assertion that "ADHD-drivers have an almost fourfold risk of accident compared to non-ADHD-drivers", which originated from Barkley et al.'s study of 1993, is rebutted. That estimate was associated with comorbid Oppositional Defiant Disorder (ODD) and/or Conduct Disorder (CD), not with ADHD, but the assertion has incorrectly been maintained for two decades. The present study provides some support for the hypothesis that the relative accident risk of ADHD-drivers with comorbid ODD, CD, and/or other conduct problems is higher than that of ADHD-drivers without these comorbidities. The estimated RRs were 1.86 (1.27; 2.75) in a sample of ADHD-drivers in which a majority had comorbid ODD and/or CD, compared to 1.31 (0.96; 1.81) in a sample of ADHD-drivers with no comorbidity. Given that ADHD-drivers most often seem to drive more than controls, and the fact that a majority of the present studies lack information about exposure, it seems more probable that the true RR is lower rather than higher than 1.23. The assertion that ADHD-drivers violate traffic laws more often than other drivers should also be modified: ADHD-drivers do have more speeding violations, but no more drunk or reckless driving citations than drivers without ADHD. All accident studies included in the meta-analysis fail to acknowledge the distinction between deliberate violations and driving errors; the former are known to be associated with accidents, the latter are not. A hypothesis that ADHD-drivers speed more frequently than controls because it stimulates attention and reaction time is suggested. PMID:24238842
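
    Pooled estimates such as RR = 1.36 (95% CI: 1.18; 1.57) typically come from inverse-variance weighting of log relative risks. The sketch below shows the fixed-effect version on three invented study results; it is not a reconstruction of the paper's 16 studies.

        import math

        # (relative risk, 95% CI low, 95% CI high) per hypothetical study
        studies = [(1.5, 1.1, 2.0), (1.2, 0.9, 1.6), (1.3, 1.0, 1.7)]

        num = den = 0.0
        for rr, lo, hi in studies:
            log_rr = math.log(rr)
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the CI width
            w = 1.0 / se**2                                  # inverse-variance weight
            num += w * log_rr
            den += w

        pooled = math.exp(num / den)
        se_pooled = math.sqrt(1.0 / den)
        ci = (math.exp(num / den - 1.96 * se_pooled),
              math.exp(num / den + 1.96 * se_pooled))
        print(f"pooled RR = {pooled:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")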

  5. Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document

    SciTech Connect

    Johnson, E.W.

    1985-10-01

    The purposes of this volume (AMD) are to: identify all malfunctions, both singular and multiple, which can occur during the complete mission profile and could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  6. Analysis of dose distribution for heavily exposed workers in the first criticality accident in Japan.

    PubMed

    Endo, Akira; Yamaguchi, Yasuhiro

    2003-04-01

    The first criticality accident in Japan occurred in a uranium processing plant in Tokai-mura on September 30, 1999. The accident, which occurred while a large amount of enriched uranyl nitrate solution was being loaded into a tank, led to a chain reaction that continued for 20 h. Two workers who were pouring the uranium solution into the tank at the time were heterogeneously exposed to neutrons and gamma rays produced by nuclear fission. Analysis of dose distributions was essential for the understanding of the clinical course observed in the skin and organs of these workers. We developed a numerical simulation system, which consists of mathematical human models and Monte Carlo radiation transport programs, for analyzing dose distributions in various postures and applied the system to the dose analysis for the two workers. This analysis revealed the extreme heterogeneity of the doses from neutrons and gamma rays in the skin and body, which depended on the positions and postures of the workers. The detailed dose analysis presented here using color maps is indispensable for an understanding of the biological effects of high-dose exposure to a mixed field of neutrons and gamma rays as well as for the development of emergency treatments for victims of radiation exposure. PMID:12643798

  7. The Analysis of PWR SBO Accident with RELAP5 Based on Linux

    NASA Astrophysics Data System (ADS)

    Xia, Zhimin; Zhang, Dafa

    RELAP5 is a relatively advanced transient hydraulic and thermal analysis code for light water reactors, and its safety analysis and operating simulation results bear directly on the safe operation of nuclear reactor systems. This paper presents a RELAP5 operating mode based on the Linux operating system, using Linux's powerful text-processing capabilities to extract valid data directly from RELAP5 output files, and its scripting capabilities to improve the plotting functions of RELAP5. Running under Linux preserves the precision of the calculated results and shortens the computation period. In this work, RELAP5 calculations of a PWR Station Blackout (SBO) accident were performed under both Linux and Windows. Comparison and analysis of the accident response curves of the main parameters, such as reactor power and the average temperature and pressure of the primary loop, show that operating analysis of the nuclear reactor system with RELAP5 based on Linux is safe and reliable.
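
    The post-processing workflow described above amounts to extracting a parameter's time history from text output and plotting it. The sketch below does this in Python under an assumed two-column layout; real RELAP5 output formats vary by version, and the paper used Linux text-processing tools rather than this hypothetical parser.

        import re
        import matplotlib.pyplot as plt

        times, powers = [], []
        with open("relap5_sbo.out") as f:          # hypothetical output file name
            for line in f:
                # Assume data lines look like "  1.2500E+02   2.7310E+09 ..." with
                # time in column 1 and reactor power in column 2.
                m = re.match(r"\s*([\d.E+-]+)\s+([\d.E+-]+)", line)
                if m:
                    try:
                        t, p = float(m.group(1)), float(m.group(2))
                    except ValueError:
                        continue                   # skip separator or header lines
                    times.append(t)
                    powers.append(p)

        plt.plot(times, powers)
        plt.xlabel("time (s)")
        plt.ylabel("reactor power (W)")
        plt.title("PWR SBO response (extracted from RELAP5 output)")
        plt.savefig("sbo_power.png")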

  8. Methods for Detector Placement and Analysis of Criticality Accident Alarm Systems

    SciTech Connect

    Peplow, Douglas E.; Wetzel, Larry

    2012-01-01

    Determining the optimum placement to minimize the number of detectors for a criticality accident alarm system (CAAS) in a large manufacturing facility is a complex problem. There is typically a target for the number of detectors that can be used over a given zone of the facility. A study to optimize detector placement typically begins with some initial guess at the placement of the detectors and is followed by either predictive calculations of accidents at specific locations or adjoint calculations based on preferred detector locations. Within an area of a facility, there may be a large number of potential criticality accident sites. For any given placement of the detectors, the list of accident sites can be reduced to a smaller number of locations at which accidents may be difficult for detectors to detect. Developing the initial detector placement and determining the list of difficult accident locations are both based on the practitioner's experience. Simulations following fission particles released from an accident location are called 'forward calculations.' These calculations can be used to answer the question 'where would an alarm be triggered?' by an accident at a specified location. Conversely, 'adjoint calculations' start at a detector site using the detector response function as a source and essentially run in reverse. These calculations can be used to answer the question 'where would an accident be detected?' by a specified detector location. If the number of accidents, P, is much less than the number of detectors, Q, then forward simulations may be more convenient and less time-consuming. If Q is large or the detectors are not placed yet, then a mesh tally of dose observed by a detector at any location must be computed over the entire zone. If Q is much less than P, then adjoint calculations may be more efficient. Adjoint calculations employing a mesh tally can be even more advantageous because they do not rely on a list of specific difficult-to-detect accident sites, which may not have included every possible accident location. Analog calculations (no biasing) simply follow particles naturally. For sparse buildings and line-of-sight calculations, analog Monte Carlo (MC) may be adequate. For buildings with internal walls or large amounts of heavy equipment (dense geometry), variance reduction may be required. Calculations employing the CADIS method use a deterministic calculation to create an importance map and a matching biased source distribution that optimize the final MC to quickly calculate one specific tally. Calculations employing the FW-CADIS method use two deterministic calculations (one forward and one adjoint) to create an importance map and a matching biased source distribution that are designed to make the MC calculate a mesh tally with more uniform uncertainties in both high-dose and low-dose areas. Depending on the geometry of the problem, the number of detectors, and the number of accident sites, different approaches to CAAS placement studies can be taken. These are summarized in Table I. SCALE 6.1 contains the MAVRIC sequence, which can be used to perform any of the forward-based approaches outlined in Table I. For analog calculations, MAVRIC simply calls the Monaco MC code. For CADIS and FW-CADIS, MAVRIC uses the Denovo discrete ordinates (SN) deterministic code to generate the importance map and biased source used by Monaco. An adjoint capability is currently being added to Monaco and should be available in the next release of SCALE. 
An adjoint-based approach could be performed with Denovo alone, although fine meshes, large amounts of memory, and long computation times may be required to obtain accurate solutions. Coarse-mesh SN simulations could be employed for adjoint-based scoping studies until the adjoint capability in Monaco is complete. CAAS placement studies, especially those dealing with mesh tallies, require some extra utilities to aid in the analysis. Detectors must receive a minimum dose rate in order to alarm; therefore, a simple yes/no plot could be more useful to the analyst.
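
    The forward question, "where would an alarm be triggered?", can be illustrated with a toy coverage check: for each candidate accident site, test whether any detector sees a dose rate above its alarm set point. The bare inverse-square model and every number below are placeholders; a real study would use Monaco/MAVRIC with variance reduction, since walls and equipment dominate the transport.

        import math

        detectors = [(0.0, 0.0), (30.0, 10.0)]               # assumed positions (m)
        accidents = [(5.0, 5.0), (25.0, 0.0), (60.0, 30.0)]  # assumed accident sites
        SOURCE_DOSE_1M = 1.0e3     # assumed dose rate at 1 m from the excursion (mGy/h)
        ALARM_THRESHOLD = 1.0      # assumed detector alarm set point (mGy/h)

        def dose_at(det, acc):
            r2 = (det[0] - acc[0])**2 + (det[1] - acc[1])**2
            return SOURCE_DOSE_1M / max(r2, 1.0)   # bare inverse-square, no shielding

        for acc in accidents:
            doses = [dose_at(d, acc) for d in detectors]
            detected = any(dose >= ALARM_THRESHOLD for dose in doses)
            print(acc, "detected" if detected else "NOT detected",
                  [f"{d:.2f}" for d in doses])

    Running the sweep over a mesh of accident sites instead of a short list yields exactly the kind of yes/no coverage plot mentioned above.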

  9. A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2007-01-01

    In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports and compares the method of analysis proposed therein to the methods used today.

  10. Hypothetical accident condition thermal analysis and testing of a Type B drum package

    SciTech Connect

    Hensel, S.J.; Alstine, M.N. Van; Gromada, R.J.

    1995-07-01

    A thermophysical property model developed to analytically determine the thermal response of cane fiberboard when exposed to the temperatures and heat fluxes associated with the 10 CFR 71 hypothetical accident condition (HAC) has been benchmarked against two Type B drum package fire test results. The model 9973 package was fire tested after a 30-ft top-down drop and puncture, and an undamaged model 9975 package containing a 21 W heater was fire tested to determine content heat source effects. Analysis results using a refined version of a previously developed HAC fiberboard model compared well against the test data from both the 9973 and 9975 packages.

  11. NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S; Glaessgen, Edward H.; Mason, Brian H; Krishnamurthy, Thiagarajan; Davila, Carlos G

    2005-01-01

    A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs; higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure for the 1985-certification test, the 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a cleavage-type failure. For the accident case, the predicted failure load for the right rear lug from the PFA is greater than 1.98 times the limit load of the lugs.

  12. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    NASA Astrophysics Data System (ADS)

    Tabibzadeh, Maryam

    According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Yet the results of a comprehensive study of more than 600 well-documented major failures in offshore structures between 1988 and 2005 show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method of ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and its NPT is discussed. The risk analysis methodology consists of three different approaches whose integration constitutes the overall methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the conceptual framework and to analyze the impact of different decision making biases on negative pressure test results. Along with the corroborating findings of previous studies, the analysis of the conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. Notably, the organizational factors captured in the conceptual framework are not specific to the scope of the NPT: most have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas operations, as well as high-risk operations in other industries. In addition, the proposed rational decision making model introduces a quantitative structure for the analysis of the results of a conducted NPT. This model provides a structure and some parametrically derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test.
Moreover, it enables analysts to assess different decision making biases involved in the process of interpreting a conducted negative pressure test as well as the root organizational factors of those biases. In general, although the proposed integrated research methodology in this dissertation is developed for the risk assessment of human and organizational factors contributions in negative pressure test misinterpretation, it can be generalized and be potentially useful for other well control situations, both offshore and onshore; e.g. fracking. In addition, this methodology can be applied for the analysis
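
    The flavor of such a cut-off-point rule can be conveyed with a two-hypothesis sketch: treat the observed pressure buildup during an NPT as evidence for a sealed versus flowing well, and choose the threshold that minimizes expected cost. The distributions, prior, and costs below are invented for illustration and are not the dissertation's derived formulas.

        import math

        MU_SEALED, MU_LEAK = 0.0, 800.0   # assumed mean buildup (psi) under each state
        SIGMA = 200.0                      # assumed spread of the reading
        P_LEAK = 0.05                      # assumed prior probability of a flowing well
        COST_MISS = 1_000_000.0            # assumed cost of accepting a leaking well
        COST_FALSE_ALARM = 10_000.0        # assumed cost of rejecting a sound test

        def phi(z):  # standard normal CDF
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

        def expected_cost(cutoff):
            # Accept the test (declare sealed) when the reading is below the cutoff.
            p_miss = P_LEAK * phi((cutoff - MU_LEAK) / SIGMA)              # leak accepted
            p_fa = (1 - P_LEAK) * (1 - phi((cutoff - MU_SEALED) / SIGMA))  # sound rejected
            return COST_MISS * p_miss + COST_FALSE_ALARM * p_fa

        best = min(range(0, 801, 5), key=expected_cost)
        print(f"cut-off ~{best} psi, expected cost {expected_cost(best):.0f}")

    Decision biases of the kind the dissertation studies can then be modeled as systematic shifts of the chosen cutoff away from this cost-minimizing value.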

  13. The methodology of multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost effective. In order to build and maintain these systems, it is important that the knowledge in the system be suitably abstracted, structured, and otherwise clustered in a manner which facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems will require the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA provides an ability to discover significant structures by providing an automated mechanism to structure both hierarchically (from detail to abstract) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission-critical systems.

  14. Vehicle-mounted mine detection: test methodology, application, and analysis

    NASA Astrophysics Data System (ADS)

    Hanshaw, Terilee

    1998-09-01

    The Mine/Minefield detection community's maturing technology base has become a developmental resource for worldwide military and humanitarian applications. During the last decade, this community has developed a variety of single- and multi-sensor applications incorporating a diversity of sensor and processor technologies. These diverse developments require appropriate metrics to objectively bound technology and to define applicability to expected military and humanitarian applications. This paper presents a survey of the test methodology, application, and analysis activities conducted by the U.S. Army Communications and Electronics Command's Night Vision and Electronic Sensors Directorate (NVESD) on behalf of the Mine/Minefield detection community. As the needs of worldwide military and humanitarian mine detection activities are met by notable technology base advances, a diverse pool of knowledge has been developed. The maturity of these technology base advances must be evaluated in a more systematic manner. As these technologies mature, metrics have been developed to support the development process and to define the applicability of these advances. The author reviews the diversity of mine detection technology and the related testing strategies. Consideration is given to the impact of history and global realism on the U.S. Army's present mine detection testing program. Further, definitions of testing metrics and analysis are reviewed. Finally, the paper outlines future U.S. Army testing plans, with special consideration given to the Vehicular Mounted Mine Detection/Ground Standoff Mine Detection System (VMMD/GSTAMIDS) Advanced Technology Demonstration and related issues.

  15. Natural phenomena risk analysis - an approach for the tritium facilities 5480.23 SAR natural phenomena hazards accident analysis

    SciTech Connect

    Cappucci, A.J. Jr.; Joshi, J.R.; Long, T.A.; Taylor, R.P.

    1997-07-01

    A Tritium Facilities (TF) Safety Analysis Report (SAR) has been developed which is compliant with DOE Order 5480.23. The 5480.23 SAR upgrades and integrates the safety documentation for the TF into a single SAR for all of the tritium processing buildings. As part of the TF SAR effort, natural phenomena hazards (NPH) were analyzed. A cost-effective strategy was developed using a team approach to make the best use of limited resources and budgets. During development of the Hazard and Accident Analysis for the 5480.23 SAR, a strategy was required that allowed maximum use of existing analysis and provided a cost-effective graded approach for any new analysis in identifying and analyzing the bounding accidents for the TF. This approach was used to effectively identify and analyze NPH for the TF. The first part of the strategy consisted of evaluating the current SAR for the RTF to determine what NPH analysis could be used in the new combined 5480.23 SAR. The second part was to develop a method for identifying and analyzing NPH events for the older facilities which took advantage of engineering judgment, was cost-effective, and followed a graded approach. The second part was especially challenging because of the lack of documented existing analysis considered adequate for the 5480.23 SAR and the limited budget for SAR development and preparation. This paper addresses the strategy for the older facilities.

  16. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on one-dimensional injury variables, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), one at a time in relation to various traffic accident factors. Such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as the injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with severe rather than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. PMID:21094332
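
    Quantification method II is closely related to discriminant analysis on dummy-coded categorical predictors. The toy sketch below reproduces that idea with scikit-learn on invented accident records (not NASS CDS data); the fitted coefficients play the role of category scores.

        from sklearn.preprocessing import OneHotEncoder
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # (collision direction, airbag deployed, seat position) -> injury severity
        X_cat = [
            ["frontal", "yes", "driver"],    ["side", "no",  "driver"],
            ["frontal", "no",  "rear"],      ["side", "yes", "passenger"],
            ["rear",    "no",  "driver"],    ["side", "no",  "rear"],
            ["frontal", "yes", "passenger"], ["rear", "yes", "driver"],
        ]
        y = ["minor", "severe", "minor", "severe", "minor", "severe", "minor", "minor"]

        enc = OneHotEncoder(sparse_output=False)   # dummy-code the categories
        X = enc.fit_transform(X_cat)

        lda = LinearDiscriminantAnalysis()
        lda.fit(X, y)

        # Category scores: the weight each dummy category contributes to the
        # discriminant axis, analogous to quantification-II category scores.
        for name, coef in zip(enc.get_feature_names_out(), lda.coef_[0]):
            print(f"{name:25s} {coef:+.3f}")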

  17. Bayesian data analysis of severe fatal accident risk in the oil chain.

    PubMed

    Eckle, Petrissa; Burgherr, Peter

    2013-01-01

    We analyze the risk of severe fatal accidents causing five or more fatalities for nine different activities covering the entire oil chain. Included are exploration and extraction, transport by different modes, refining, and final end use in power plants, heating, or gas stations. The risks are quantified separately for OECD and non-OECD countries, and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model yielding analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto), as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data, which results in high uncertainties, in particular for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation. In addition, it inherently delivers a measure of uncertainty. This approach provides a framework which comprehensively covers risk throughout the oil chain, allowing the allocation of risk in sustainability assessments. It also permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis. PMID:22642363
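
    The two model components named above, a Poisson frequency and a Generalized Pareto severity, can be sketched compactly. A conjugate Gamma prior gives the Poisson-rate posterior in closed form; the data and prior parameters below are invented, and the paper's hierarchical pooling across data sets is not reproduced.

        import numpy as np
        from scipy import stats

        # Frequency: k accidents observed over t years; Gamma(a0, b0) prior on rate.
        k, t = 12, 30.0
        a0, b0 = 0.5, 1.0                        # assumed weakly informative prior
        post = stats.gamma(a=a0 + k, scale=1.0 / (b0 + t))
        print("posterior mean rate:", post.mean(), "/yr,",
              "90% interval:", post.ppf([0.05, 0.95]))

        # Severity: fatalities per severe accident, threshold 5 (five or more deaths).
        fatalities = np.array([5, 6, 6, 8, 11, 15, 23, 42, 7, 9, 5, 64])
        shape, loc, scale = stats.genpareto.fit(fatalities - 5, floc=0.0)
        print("GPD shape:", round(shape, 2), "scale:", round(scale, 1))

        # Tail probability that a severe accident exceeds 100 fatalities.
        print("P(>100 deaths):", 1 - stats.genpareto.cdf(95, shape, loc=0, scale=scale))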

  18. A methodology for the analysis of medical data

    E-print Network

    Tarrès, Pierre

    A. Tsanas, M.A. Little, P.E. Mc… This paper presents a methodology for the quantitative analysis of certain kinds of medical data. It is mainly aimed at clinical…

  19. Spectrally resolved bioluminescence tomography with adaptive finite element analysis: methodology and simulation

    E-print Network

    Wang, Ge

    Yujie Lv, Jie Tian (iop.org/PMB/52/4497). Abstract: As a molecular imaging technique, bioluminescence tomography (BLT) with its highly…

  20. A Content Analysis of News Media Coverage of the Accident at Three Mile Island.

    ERIC Educational Resources Information Center

    Stephens, Mitchell; Edison, Nadyne G.

    A study was conducted for the President's Commission on the Accident at Three Mile Island to analyze coverage of the accident by ten news organizations: two wire services, three commercial television networks, and five daily newspapers. Copies of all stories and transcripts of news programs during the first week of the accident were examined from…

  1. Analysis of Occupational Accident Fatalities and Injuries Among Male Group in Iran Between 2008 and 2012

    PubMed Central

    Alizadeh, Seyed Shamseddin; Mortazavi, Seyed Bagher; Sepehri, Mohammad Mehdi

    2015-01-01

    Background: Occupational accidents cause permanent disabilities and deaths and lead to economic and workday losses. Objectives: The purpose of the present study was to investigate the factors responsible for occupational accidents that occurred in Iran. Patients and Methods: The current study analyzed 1464 occupational accidents recorded by the Ministry of Labor and Social Affairs’ offices in Iran during 2008 - 2012. At first, a general understanding of the accidents was obtained using descriptive statistics. Afterwards, the chi-square test and Cramer’s V statistic (Vc) were used to determine the association between factors influencing the type of injury as an occupational accident outcome. Results: There was no significant association between marital status or time of day and the type of injury. However, activity sector, cause of accident, victim’s education, age of victim, and victim’s experience were significantly associated with the type of injury. Conclusions: Successful accident prevention relies largely on knowledge about the causes of accidents. In any accident control activity, particularly for occupational accidents, correctly identifying high-risk groups and the factors influencing accidents is the key to successful interventions. Results of this study can increase accident awareness and enable workplace management to select and prioritize problem areas and safety-system weaknesses in workplaces. PMID:26568848
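
    The association testing described above pairs a chi-square test on a contingency table with Cramer's V as an effect-size measure. The sketch below uses an invented sector-by-injury table, not the Iranian accident records.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Rows: activity sector; columns: type of injury (invented counts).
        table = np.array([[80, 30, 10],
                          [40, 60, 20],
                          [25, 15, 45]])

        chi2, p, dof, expected = chi2_contingency(table)
        n = table.sum()
        r, c = table.shape
        cramers_v = np.sqrt(chi2 / (n * (min(r, c) - 1)))   # Vc in [0, 1]
        print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}, Cramer's V = {cramers_v:.2f}")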

  2. Analysis of station blackout accidents for the Bellefonte pressurized water reactor

    SciTech Connect

    Gasser, R D; Bieniarz, P P; Tills, J L

    1986-09-01

    An analysis has been performed for the Bellefonte PWR Unit 1 to determine the containment loading and the radiological releases into the environment from a station blackout accident. A number of issues have been addressed in this analysis, including the effects of direct heating on containment loading and the effects of fission product heating and natural convection on releases from the primary system. The results indicate that direct heating which involves more than about 50% of the core can fail the Bellefonte containment, but natural convection in the RCS may lead to overheating and failure of the primary system piping before core slump, thus eliminating or mitigating direct heating. Releases from the primary system are significantly increased before vessel breach due to natural circulation, and after vessel breach due to reevolution of retained fission products by fission product heating of RCS structures.

  3. Analysis of the SL-1 Accident Using RELAP5-3D

    SciTech Connect

    Francisco, A.D. and Tomlinson, E. T.

    2007-11-08

    On January 3, 1961, at the National Reactor Testing Station in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people and destroying the reactor core. The SL-1 reactor, a 3 MW{sub t} boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to the destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are known only with a high level of uncertainty).

  4. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  5. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  6. The environmental accident at 'Schweizerhalle' and respiratory diseases in children: a time series analysis.

    PubMed

    Helfenstein, U; Ackermann-Liebrich, U; Braun-Fahrländer, C; Wanner, H U

    1991-10-01

    During an investigation concerned with the relationship between air pollution and respiratory diseases in children, the 'Schweizerhalle' accident occurred, in which unknown amounts of pollutants were discharged into the environment. In that investigation, two series of medical data were collected during one year: (a) the daily relative number of preschool children, exhibiting diseases of the respiratory tract, who either came to the outpatients' clinic of the Children's Hospital or were reported by paediatricians in Basle; (b) the daily number of respiratory symptoms per child, observed in a group of randomly selected preschool children. The purpose of the present time series analysis is the assessment of possible changes in these series after the environmental accident. The nature of the change is studied by complementary approaches. First, a forecast arising from models identified in the preaccident period is compared with the actual data. Thereafter, intervention models which adequately and parsimoniously represent the change are identified. Finally, an identification of a change-point is performed. PMID:1947506
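
    The first of the three complementary approaches, comparing a forecast from the pre-accident period with the post-accident observations, can be sketched with a simple autoregressive model on synthetic data (the study used richer intervention and change-point models):

      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic daily symptom counts: stationary before the accident,
      # shifted upward afterwards (illustrative only).
      pre = 5 + rng.normal(0, 1, 200)
      post = 7 + rng.normal(0, 1, 30)

      # Fit AR(1) on the pre-accident period by least squares: x_t = a + b*x_{t-1}.
      X = np.column_stack([np.ones(len(pre) - 1), pre[:-1]])
      a, b = np.linalg.lstsq(X, pre[1:], rcond=None)[0]
      resid_sd = np.std(pre[1:] - X @ np.array([a, b]))

      # Forecast the post-accident period recursively from the last pre value.
      fc, x = [], pre[-1]
      for _ in range(len(post)):
          x = a + b * x
          fc.append(x)
      fc = np.array(fc)

      # Observations far outside the forecast band suggest an intervention effect.
      excess = (post - fc) / resid_sd
      print(f"mean standardized excess after the accident: {excess.mean():.2f} sd")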

  7. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  8. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  9. Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents

    E-print Network

    Wheatley, Spencer; Sornette, Didier

    2015-01-01

    We provide, and perform a risk theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage dollar losses. The annual rate of nuclear accidents, with size above 20 Million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April, 1986). The rate is now roughly stable at 0.002 to 0.003, i.e., around 1 event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March, 1979), where moderate damages were suppressed but the tail became very heavy, being described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the "dragon-king" phenomenon, amplifying the risk of extreme damage. In fact, the damage of the largest event (Fukushima; March, 2011) is equal to 60 percent of the total damag...
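
    The Pareto tail reported above (tail index about 0.55) is the kind of quantity commonly estimated with the Hill estimator. A minimal sketch on synthetic losses; the authors' dataset and exact estimator are not reproduced here:

      import numpy as np

      rng = np.random.default_rng(1)
      # Synthetic damage values drawn from a Pareto-type distribution with
      # tail index 0.55 (an illustrative stand-in for the real loss data).
      alpha_true = 0.55
      losses = 20.0 * rng.pareto(alpha_true, 500)   # millions of US$, say

      # Hill estimator of the tail index over the k largest observations.
      def hill(data, k):
          x = np.sort(data)[::-1]   # descending order
          return 1.0 / np.mean(np.log(x[:k] / x[k]))

      for k in (25, 50, 100):
          print(f"k = {k:3d}: alpha_hat = {hill(losses, k):.2f}")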

  10. DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS

    SciTech Connect

    Wu, T

    2008-04-30

    Large fuel casks present challenges when evaluating their performance in the Hypothetical Accident Conditions (HAC) specified in the Code of Federal Regulations Title 10 part 71 (10CFR71). Testing is often limited by cost, difficulty in preparing test units and the limited availability of facilities which can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the hypothetical accident conditions of a 30-foot lateral drop followed by a 40-inch lateral puncture as specified in 10CFR71. The structural integrity of the containment vessel is justified based on the analytical results in comparison with the stress criteria, specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damages caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture are compared with the package test data. The analytical results are in good agreement with the test results.

  11. Electrical equipment performance under severe accident conditions (BWR/Mark 1 plant analysis): Summary report

    SciTech Connect

    Bennett, P.R.; Kolaczkowski, A.M.; Medford, G.T.

    1986-09-01

    The purpose of the Performance Evaluation of Electrical Equipment during Severe Accident States Program is to determine the performance of electrical equipment, important to safety, under severe accident conditions. In FY85, a method was devised to identify important electrical equipment and the severe accident environments in which the equipment was likely to fail. This method was used to evaluate the equipment and severe accident environments for Browns Ferry Unit 1, a BWR/Mark I. Following this work, a test plan was written in FY86 to experimentally determine the performance of one selected component in two severe accident environments.

  12. Methodologies for analysis of patterning in the mouse RPE sheet

    PubMed Central

    Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

    2015-01-01

    Purpose Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. To validate the software, human- and computer-analyzed results were compared. Whether tallied manually or automatically with software, the resulting cell measurements were in close agreement. We compared normal with diseased RPE cells during aging with quantitative cell size and shape metrics. Subtle differences between the RPE sheet characteristics of young and old mice were identified. The IRBP-/- mouse RPE sheet did not differ from C57BL/6J (wild type, WT), suggesting that IRBP does not play a direct role in maintaining the health of the RPE cell, while the slow loss of photoreceptor (PhR) cells previously established in this knockout does support a role in the maintenance of PhR cells. Rd8 mice exhibited several measurable changes in patterns of RPE cells compared to WT, suggesting a slow degeneration of the RPE sheet that had not been previously noticed in rd8. Conclusions An optimized dissection method and a series of programs were used to establish a rapid and hands-off analysis. The software-aided, high-sampling-size approach performed as well as trained human scorers, but was considerably faster and easier. This method allows tens to hundreds of thousands of cells to be analyzed, each with 23 metrics. With this combination of dissection and image analysis of the RPE sheet, we can now analyze cell-to-cell interactions of immediate neighbors.
In the future, we may be able to observe interactions of second, third, or higher ring neighbors and analyze tension in sheets, which might be expected to deviate from normal near large bumps in the RPE sheet caused by drusen or when large frank holes in the RPE sheet are observed in geographic atrophy. This method and software can be readily applied to other aspects of vision science, neuroscience, and epithelial biology where patterns may exist in a sheet or surface of cells. PMID:25593512
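
    The per-cell shape metrics described above (area, perimeter, form factor) are simple to compute once cell boundaries are available as polygons. A minimal sketch of the kind of morphometry involved, assuming a hypothetical polygon from a segmented ZO-1 image (the study's own pipeline used CellProfiler and custom scripts):

      import numpy as np

      def shape_metrics(vertices):
          # Area (shoelace formula), perimeter, and form factor of a polygon.
          # Form factor = 4*pi*area / perimeter**2: 1.0 for a circle, lower
          # for elongated or irregular cells.
          v = np.asarray(vertices, dtype=float)
          x, y = v[:, 0], v[:, 1]
          area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
          perim = np.sum(np.hypot(np.diff(x, append=x[0]),
                                  np.diff(y, append=y[0])))
          return area, perim, 4 * np.pi * area / perim**2

      # Hypothetical hexagonal RPE cell boundary (pixel coordinates).
      hexagon = [(2, 0), (1, np.sqrt(3)), (-1, np.sqrt(3)),
                 (-2, 0), (-1, -np.sqrt(3)), (1, -np.sqrt(3))]
      a, p, ff = shape_metrics(hexagon)
      print(f"area = {a:.2f}, perimeter = {p:.2f}, form factor = {ff:.3f}")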

  13. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    SciTech Connect

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, so many effective exploits and tools are easily accessible to anyone with an Internet connection and minimal technical skills, and the motivational threshold is so reduced, that the field of potential adversaries can no longer be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation, defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, within both the black hat and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.

  14. Cost-Utility Analysis: Current Methodological Issues and Future Perspectives

    PubMed Central

    Nuijten, Mark J. C.; Dubois, Dominique J.

    2011-01-01

    The use of cost–effectiveness as the final criterion in the reimbursement process for listing of new pharmaceuticals can be questioned from a scientific and policy point of view. There is a lack of consensus on main methodological issues, and consequently we may question the appropriateness of the use of cost–effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost–effectiveness threshold (Cost/QALY). In this review, we focus on some key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally, we consider the relevance of some other important decision criteria, such as social values and equity. PMID:21713127

  15. The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2012-01-01

    In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities, is actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model comprises causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human-related causal factors. The preliminary results from the baseline LOCAF model are also presented.
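
    The Bayesian belief network at the heart of LOCAF combines conditional probabilities of causal factors into an accident likelihood. A toy sketch of the underlying computation, with invented factor names and probabilities (the actual LOCAF structure and numbers are far richer):

      # Toy two-factor Bayesian network: crew error (E) and system failure (F)
      # are independent parents of a loss-of-control event (LOC).
      p_E = 0.02     # assumed P(crew error)
      p_F = 0.005    # assumed P(system component failure)

      # Assumed conditional probability table P(LOC | E, F).
      p_LOC = {(True, True): 0.60, (True, False): 0.05,
               (False, True): 0.08, (False, False): 0.0005}

      # Marginalize over the parents: P(LOC) = sum_{e,f} P(LOC|e,f) P(e) P(f).
      total = 0.0
      for e in (True, False):
          for f in (True, False):
              pe = p_E if e else 1 - p_E
              pf = p_F if f else 1 - p_F
              total += p_LOC[(e, f)] * pe * pf

      print(f"P(LOC) = {total:.5f}")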

  16. Accident safety analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    SciTech Connect

    Johnson, D.J.; Brehm, J.R.

    1994-01-01

    The purpose of the accident safety analysis is to identify and analyze a range of credible events, their causes and consequences, and to provide technical justification for the conclusions that uranium billets, fuel assemblies, uranium scrap, and chips and fines drums can be safely stored in the 300 Area N Reactor Fuel Fabrication and Storage Facility; that the contaminated equipment, High-Efficiency Particulate Air filters, ductwork, stacks, sewers and sumps can be cleaned (decontaminated) and/or removed; that the new concretion process in the 304 Building will be able to operate; and that limited fuel handling and packaging associated with removal of stored uranium is acceptable, all without undue risk to the public, employees, or the environment.

  17. Analysis of Radionuclide Releases from the Fukushima Dai-Ichi Nuclear Power Plant Accident Part I

    NASA Astrophysics Data System (ADS)

    Le Petit, G.; Douysset, G.; Ducros, G.; Gross, P.; Achim, P.; Monfort, M.; Raymond, P.; Pontillon, Y.; Jutier, C.; Blanchard, X.; Taffary, T.; Moulin, C.

    2014-03-01

    Part I of this publication deals with the analysis of fission product releases following the Fukushima Dai-ichi accident. Reactor core damages are assessed relying on radionuclide detections performed by the CTBTO radionuclide network, especially at the particulate station located at Takasaki, 210 km away from the nuclear power plant. On the basis of a comparison between the reactor core inventory at the time of reactor shutdowns and the fission product activities measured in air at Takasaki, especially 95Nb and 103Ru, it was possible to show that the reactor cores were exposed to high temperature for a prolonged time. This diagnosis was confirmed by the presence of 113Sn in air at Takasaki. The assessed 133Xe release at the time of reactor shutdown (8 × 10^18 Bq) turned out to be on the order of 80% of the amount deduced from the reactor core inventories. This strongly suggests a broad meltdown of the reactor cores.

  18. Modeling & analysis of core debris recriticality during hypothetical severe accidents in the Advanced Neutron Source Reactor

    SciTech Connect

    Kim, S.H.; Georgevich, V.; Simpson, D.B.; Slater, C.O.; Taleyarkhan, R.P.

    1992-10-01

    This paper discusses salient aspects of severe-accident-related recriticality modeling and analysis in the Advanced Neutron Source (ANS) reactor. The development of an analytical capability using the KENO5A-SCALE system is described, including evaluation of suitable nuclear cross-section sets to account for the effects of system geometry, mixture temperature, material dispersion and other thermal-hydraulic conditions. Benchmarking and validation efforts conducted with KENO5-SCALE and other neutronic codes against critical experiment data are described. Potential deviations and biases resulting from use of the 16-group Hansen-Roach library are shown. A comprehensive test matrix of calculations to evaluate the threat of a criticality event in the ANS is described. Strong dependencies on geometry, material constituents, and thermal-hydraulic conditions are described. The introduction of designed mitigative features is described.

  19. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    ERIC Educational Resources Information Center

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that although narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  20. IEEE 802.11 AND BLUETOOTH COEXISTENCE ANALYSIS METHODOLOGY

    E-print Network

    Howitt, Ivan

    Bluetooth (BT) wireless personal area networks and IEEE 802.11 wireless local area networks share the same 2.4 GHz band, making coexistence an important issue. Both BT wireless personal area networks (WPANs) [1, 2] and IEEE 802.11 wireless local area networks have been considered by the standards committee [4, 5]. In this paper, a more general analytical approach is presented. A methodology

  1. Analysis of previous research work in Models and Methodologies in

    E-print Network

    Lano, Kevin Charles

    by Loudhouse Research, on behalf of CA, in 2007. PM standards: Project Management Body of Knowledge (PMBOK) by PMI, USA; PRINCE-2 by APMG, UK. PMBOK is the more dominant standard, as it is used in more than 75 ... Development methodologies (... Model, Scrum, Extreme Programming, etc.) and management methodologies such as PMBOK, PRINCE2.

  2. Accident management information needs

    SciTech Connect

    Hanson, D.J.; Ward, L.W.; Nelson, W.R.; Meyer, O.R. )

    1990-04-01

    In support of the US Nuclear Regulatory Commission (NRC) Accident Management Research Program, a methodology has been developed for identifying the plant information needs necessary for personnel involved in the management of an accident to diagnose that an accident is in progress, select and implement strategies to prevent or mitigate the accident, and monitor the effectiveness of these strategies. This report describes the methodology and presents an application of this methodology to a Pressurized Water Reactor (PWR) with a large dry containment. A risk-important severe accident sequence for a PWR is used to examine the capability of the existing measurements to supply the necessary information. The method includes an assessment of the effects of the sequence on the measurement availability including the effects of environmental conditions. The information needs and capabilities identified using this approach are also intended to form the basis for more comprehensive information needs assessment performed during the analyses and development of specific strategies for use in accident management prevention and mitigation. 3 refs., 16 figs., 7 tabs.

  3. Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document

    SciTech Connect

    Not Available

    1988-12-15

    The Accident Model Document (AMD) is the second volume of the three volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.

  4. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Formal expert judgment elicitation, with the experts developing their distributions independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  5. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
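
    The validation step described above, propagating sampled parameter values through the Gaussian plume model, can be illustrated with the standard textbook plume formula (an assumed form; the exact MACCS/COSYMA implementation may differ):

      import numpy as np

      def gaussian_plume(y, z, q, u, sigma_y, sigma_z, h):
          # chi(y, z) = q/(2 pi u sy sz) * exp(-y^2/(2 sy^2))
          #             * [exp(-(z-h)^2/(2 sz^2)) + exp(-(z+h)^2/(2 sz^2))]
          # Downwind distance enters implicitly through sigma_y and sigma_z.
          return (q / (2.0 * np.pi * u * sigma_y * sigma_z)
                  * np.exp(-y**2 / (2.0 * sigma_y**2))
                  * (np.exp(-(z - h)**2 / (2.0 * sigma_z**2))
                     + np.exp(-(z + h)**2 / (2.0 * sigma_z**2))))

      rng = np.random.default_rng(5)
      # Assumed lognormal uncertainty distributions for the dispersion widths,
      # standing in for the elicited distributions discussed above.
      sy = rng.lognormal(np.log(80.0), 0.4, 10_000)   # m
      sz = rng.lognormal(np.log(40.0), 0.4, 10_000)   # m
      chi = gaussian_plume(y=0.0, z=0.0, q=1.0, u=5.0,
                           sigma_y=sy, sigma_z=sz, h=50.0)
      print("ground-level chi percentiles (5/50/95):",
            np.percentile(chi, [5, 50, 95]))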

  6. Defense In-Depth Accident Analysis Evaluation of Tritium Facility Bldgs. 232-H, 233-H, and 234-H

    SciTech Connect

    Blanchard, A.

    1999-05-10

    The primary purpose of this report is to document a Defense-in-Depth (DID) accident analysis evaluation for Department of Energy (DOE) Savannah River Site (SRS) Tritium Facility Buildings 232-H, 233-H, and 234-H. The purpose of a DID evaluation is to provide a more realistic view of facility radiological risks to the offsite public than the bounding deterministic analysis documented in the Safety Analysis Report, which credits only Safety Class items in the offsite dose evaluation.

  7. The ESA/NASA SOHO Mission Interruption: Using the STAMP Accident Analysis Technique for a Software Related `Mishap'

    E-print Network

    Johnson, Chris

    The ESA/NASA SOHO Mission Interruption: Using the STAMP Accident Analysis Technique. (a) University of Glasgow, Scotland, johnson@dcs.gla.ac.uk; (b) NASA Langley Research Center, MS 130, 100 NASA Road, Hampton, VA 23681-2199, USA, c.m.holloway@larc.nasa.gov. Abstract: Mishap investigations provide

  8. Lin et al TRB 14-2181 Causal Analysis of Passenger Train Accident on Shared-Use Rail Corridors1

    E-print Network

    Barkan, Christopher P.L.

    Lin et al., TRB 14-2181. Causal Analysis of Passenger Train Accidents on Shared-Use Rail Corridors. Rapik Saat and Christopher P. L. Barkan, Rail Transportation and Engineering Center, Department ... Passenger rail service in the U.S. will involve use of existing railroad infrastructure or rights of way

  9. Analysis of the crush environment for lightweight air-transportable accident-resistant containers

    SciTech Connect

    McClure, J.D.; Hartman, W.F.

    1981-12-01

    This report describes the longitudinal dynamic crush environment for a Lightweight Air-Transportable Accident-Resistant Container (LAARC, now called PAT-2) that can be used to transport small quantities of radioactive material. The analysis of the crush environment involves evaluation of the forces imposed upon the LAARC package during the crash of a large, heavily loaded, cargo aircraft. To perform the analysis, a cargo load column was defined which consisted of a longitudinal prism of cargo of cross-sectional area equal to the projected area of the radioactive-material package and length equal to the longitudinal extent of the cargo compartment in a commercial cargo jet aircraft. To bound the problem, two analyses of the cargo load column were performed, a static stability analysis and a dynamic analysis. The results of these analyses can be applied to other packaging designs and suggest that the physical limits or magnitude of the longitudinal crush forces, which are controlled in part by the yield strength of the cargo and the package size, are much smaller than previously estimated.
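
    The bounding role played by cargo crush strength and package size can be shown with one line of arithmetic: the load column cannot transmit more force than its crush strength times the package's projected area. Illustrative numbers only, not the report's values:

      # Upper bound on the longitudinal crush force transmitted to the package.
      sigma_crush = 0.3e6    # Pa, assumed crush (yield) strength of the cargo
      area = 0.25 * 0.25     # m^2, assumed projected area of the package
      force = sigma_crush * area
      print(f"bounding crush force ~ {force / 1e3:.1f} kN")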

  10. Analysis of traffic accident size for Korean highway using structural equation models.

    PubMed

    Lee, Ju-Yeon; Chung, Jin-Hyuk; Son, Bongsoo

    2008-11-01

    Accident size can be expressed as the number of involved vehicles, the number of damaged vehicles, the number of deaths and/or the number of injured. Accident size is one of the important indices to measure the level of safety of transportation facilities. Factors such as road geometric condition, driver characteristics and vehicle type may be related to traffic accident size. However, all these factors interact in complicated ways, so that the interrelationships among the variables are not easily identified. A structural equation model is adopted to capture the complex relationships among variables, because the model can handle complex relationships among endogenous and exogenous variables simultaneously and furthermore it can include latent variables in the model. In this study, we use data on 2649 accidents that occurred on highways in Korea and estimate the relationships between exogenous factors and traffic accident size. The model suggests that road factors, driver factors and environment factors are strongly related to the accident size. PMID:19068300
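
    A full structural equation model needs dedicated software, but the flavor of the path analysis can be conveyed with two chained least-squares regressions on synthetic data. A rough sketch only: the variable names and data are invented, and latent variables are omitted:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 2649
      # Invented exogenous factors: road curvature and driver age (standardized).
      road = rng.normal(size=n)
      driver = rng.normal(size=n)

      # Synthetic mediator (speed) and outcome (accident size) with known paths.
      speed = 0.5 * road - 0.3 * driver + rng.normal(scale=0.8, size=n)
      size = 0.6 * speed + 0.2 * road + rng.normal(scale=0.8, size=n)

      def ols(y, *xs):
          X = np.column_stack([np.ones(len(y)), *xs])
          return np.linalg.lstsq(X, y, rcond=None)[0][1:]   # drop intercept

      # Structural equations estimated one at a time (a crude stand-in for SEM).
      b_speed = ols(speed, road, driver)
      b_size = ols(size, speed, road)
      print("paths road,driver -> speed:", np.round(b_speed, 2))
      print("paths speed,road -> size:  ", np.round(b_size, 2))
      print("indirect effect road -> size via speed:",
            round(b_speed[0] * b_size[0], 2))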

  11. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  12. A comparative analysis of accident risks in fossil, hydro, and nuclear energy chains

    SciTech Connect

    Burgherr, P.; Hirschberg, S.

    2008-07-01

    This study presents a comparative assessment of severe accident risks in the energy sector, based on the historical experience of fossil (coal, oil, natural gas, and LPG (Liquefied Petroleum Gas)) and hydro chains contained in the comprehensive Energy-related Severe Accident Database (ENSAD), as well as Probabilistic Safety Assessment (PSA) for the nuclear chain. Full energy chains were considered because accidents can take place at every stage of the chain. Comparative analyses for the years 1969-2000 included a total of 1870 severe (≥ 5 fatalities) accidents, amounting to 81,258 fatalities. Although 79.1% of all accidents and 88.9% of associated fatalities occurred in less developed, non-OECD countries, industrialized OECD countries dominated insured losses (78.0%), reflecting their substantially higher insurance density and stricter safety regulations. Aggregated indicators and frequency-consequence (F-N) curves showed that energy-related accident risks in non-OECD countries are distinctly higher than in OECD countries. Hydropower in non-OECD countries and upstream stages within fossil energy chains are most accident-prone. Expected fatality rates are lowest for Western hydropower and nuclear power plants; however, the maximum credible consequences can be very large. Total economic damages due to severe accidents are substantial, but small when compared with natural disasters. Similarly, external costs associated with severe accidents are generally much smaller than monetized damages caused by air pollution.
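
    The frequency-consequence (F-N) curves used in this comparison plot the annual frequency of accidents with N or more fatalities against N. A minimal construction from a list of severe-accident records, with invented counts (ENSAD itself is not reproduced here):

      import numpy as np

      # Invented fatality counts for severe (>= 5 fatalities) accidents in one
      # energy chain, observed over a 32-year period (1969-2000).
      fatalities = np.array([5, 7, 9, 12, 15, 21, 30, 44, 60, 110, 250, 2500])
      years = 32.0

      # F-N curve: for each N, the annual frequency of accidents with >= N deaths.
      N = np.sort(fatalities)
      F = np.arange(len(N), 0, -1) / years   # exceedance counts / observation years

      for n_val, f_val in zip(N, F):
          print(f"N >= {n_val:5d}: frequency = {f_val:.3f} per year")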

  13. Full-Envelope Launch Abort System Performance Analysis Methodology

    NASA Technical Reports Server (NTRS)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
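
    The essential difference between the two approaches is whether the abort initiation condition is sampled or held fixed. A schematic Monte Carlo sketch with an invented performance model (no relation to the actual LAS simulation):

      import numpy as np

      rng = np.random.default_rng(3)

      def abort_margin(altitude_kft, other):
          # Invented stand-in for an LAS performance metric (> 0 means success).
          # A contrived dip near 30 kft plays the role of a performance
          # pinch-point that discrete initiation conditions can miss.
          return 1.0 - 0.9 * np.exp(-((altitude_kft - 30.0) / 3.0) ** 2) + other

      n = 10_000
      other = rng.normal(0.0, 0.05, n)   # all other dispersed LAS parameters

      # Standard method: abort initiation fixed at a few discrete conditions.
      for alt in (10.0, 25.0, 50.0):
          fails = np.mean(abort_margin(alt, other) < 0)
          print(f"fixed abort at {alt:4.0f} kft: failure fraction = {fails:.4f}")

      # Full-envelope method: initiation altitude dispersed over the whole ascent.
      alts = rng.uniform(0.0, 100.0, n)
      fails = np.mean(abort_margin(alts, other) < 0)
      print(f"dispersed abort altitude:    failure fraction = {fails:.4f}")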

  14. Analysis of fission product revaporization in a BWR Reactor Coolant System during a station blackout accident

    SciTech Connect

    Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

    1988-01-01

    This paper presents an analysis of fission product revaporization from the Reactor Coolant System (RCS) following the Reactor Pressure Vessel (RPV) failure. The station blackout accident in a BWR Mark I Power Plant was considered. The TRAPMELT3 models for vaporization, chemisorption, and the decay heating of RCS structures and gases were used and extended beyond the RPV failure in the analysis. The RCS flow models based on the density-difference or pressure-difference between the RCS and containment pedestal region were developed to estimate the RCS outflow which carries the revaporized fission product to the containment. A computer code called REVAP was developed for the analysis. The REVAP code was incorporated with the MARCH, TRAPMELT3 and NAUA codes from the Source Term Code Package (STCP) to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors in determining the magnitude of revaporization and subsequent release of the volatile fission product into the environment. 6 refs., 8 figs.

  15. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest, a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  16. 78 FR 29353 - Federal Need Analysis Methodology for the 2014-15 Award Year-Federal Pell Grant, Federal Perkins...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ...DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2014-15...Federal Perkins Loan, Federal Work-Study, Federal Supplemental...the statutory Federal Need Analysis Methodology that determines...Department uses in the Federal Need Analysis Methodology to determine...

  17. Development and validation of a generalised engineering methodology for thermal analysis of structural members in fire 

    E-print Network

    Liang, Hong; Welch, Stephen; Stratford, Tim J; Kinsella, Emmett V

    A novel methodology for generalising CFD-based approaches for thermal analysis of protected steelwork in fire has been developed, known as GeniSTELA. This is a quasi-3D approach with computation of a "steel temperature ...

  18. Building Energy Performance Analysis of an Academic Building Using IFC BIM-Based Methodology 

    E-print Network

    Aziz, Z.; Arayici, Y.; Shivachev, D.

    2012-01-01

    This paper discusses the potential to use an Industry Foundation Classes (IFC)/Building Information Modelling (BIM) based method to undertake Building Energy Performance analysis of an academic building. BIM/IFC based methodology provides a...

  19. A systems-based methodology for structural analysis of health care operations.

    PubMed

    Keating, C B

    2000-01-01

    This paper introduces a systems-based methodology for conducting analysis of organizational structure for health care operations. Increasingly, health care organizations must operate in turbulent environments characterized by rapid change, high levels of uncertainty, and increasing levels of complexity. A fundamental issue for effective performance in these environments is the development and maintenance of organizational structures that simultaneously provide both operational stability and agile response to environmental turbulence. Drawing from systems science, a systems-based methodology for structural analysis of health care operations is developed. This methodology identifies operational deficiencies stemming from inadequate organizational structure and suggests focal areas for structural modification. The results from an application of the methodology in a health care organization are examined. Implications and limitations for use of the methodology by health care professionals are provided. PMID:11142060

  20. Assessment of ISLOCA risk-methodology and application to a combustion engineering plant

    SciTech Connect

    Kelly, D.L.; Auflick, J.L.; Haney, L.N.

    1992-04-01

    Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Combustion Engineering plant.

  1. Integrated analysis methodology for reassessment and maintenance of offshore structures

    SciTech Connect

    Faber, M.H.; Dharmavasan, S.; Dijkstra, O.D.

    1994-12-31

    In recent years, the application of modern reliability methods within the framework of classical decision theory has been investigated for applications in inspection and maintenance planning of engineering systems subject to uncertain deterioration processes. This has led to a consistent framework for evaluation of the consequences (expected costs) of different inspection and repair actions, thus allowing for an optimization of the overall inspection and maintenance plan for a given structure within its anticipated lifetime. This paper describes the development of such a reliability-based inspection, repair and maintenance (IRM) planning methodology for fixed offshore platforms.

  2. Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

  3. Overview of the Aerothermodynamics Analysis Conducted in Support of the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.

    2004-01-01

    A graphic presentation of the aerothermodynamics analysis conducted in support of the STS-107 accident investigation. Investigation efforts were conducted as part of an integrated AATS team (Aero, Aerothermal, Thermal, Stress) directed by OVEWG. Graphics presented are: STS-107 Entry trajectory and timeline (1st off-nominal event to Post-LOS); Indications from OI telemetry data; Aero/aerothermo/thermal analysis process; Selected STS-107 side fuselage/OMS pod off-nominal temperatures; Leading edge structural subsystem; Relevant forensics evidence; External aerothermal environments; STS-107 Pre-entry EOM3 heating profile; Surface heating and temperatures; Orbiter wing leading edge damage survey; Internal aerothermal environments; Orbiter wing CAD model; Aerodynamic flight reconstruction; Chronology of aerodynamic/aerothermodynamic contributions; Acreage TPS tile damage; Larger OML perturbations; Missing RCC panel(s); Localized damage to RCC panel/missing T-seal; RCC breach with flow ingestion; and Aero-aerothermal closure. NAIT served as the interface between the CAIB and NASA investigation teams; and CAIB requests for study were addressed.

  4. The role of mitochondrial proteomic analysis in radiological accidents and terrorism.

    PubMed

    Maguire, David; Zhang, Bingrong; Zhang, Amy; Zhang, Lurong; Okunieff, Paul

    2013-01-01

    In the wake of the 9/11 terrorist attacks and the recent Level 7 nuclear event at the Fukushima Daiichi plant, there has been heightened awareness of the possibility of radiological terrorism and accidents and the need for techniques to estimate radiation levels after such events. A number of approaches to monitoring radiation using biological markers have been published, including physical techniques, cytogenetic approaches, and direct, DNA-analysis approaches. Each approach has the potential to provide information that may be applied to the triage of an exposed population, but problems with development and application of devices or lengthy analyses limit their potential for widespread application. We present a post-irradiation observation with the potential for development into a rapid point-of-care device. Using simple mitochondrial proteomic analysis, we investigated irradiated and nonirradiated murine mitochondria and identified a protein mobility shift occurring at 2-3 Gy. We discuss the implications of this finding both in terms of possible mechanisms and potential applications in bio-radiation monitoring. PMID:22879026

  5. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
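
    Aggregating several experts' judgments into a single distribution can be as simple as an equal-weight linear opinion pool over elicited quantiles. A minimal sketch with made-up elicitation results (the report's calibration-based weighting is more sophisticated):

      import numpy as np

      # Made-up 5th/50th/95th percentile judgments from three experts for one
      # uncertain input parameter.
      q_levels = np.array([0.05, 0.50, 0.95])
      experts = np.array([[0.8, 2.0, 6.0],
                          [0.5, 1.5, 4.0],
                          [1.0, 3.0, 9.0]])

      # Equal-weight linear opinion pool: draw each sample from a randomly
      # chosen expert's quantile function (log-linear interpolation; uniform
      # draws outside [0.05, 0.95] are clamped to the elicited range).
      rng = np.random.default_rng(4)
      u = rng.uniform(size=10_000)
      pick = rng.integers(0, len(experts), size=u.size)
      samples = np.array([np.exp(np.interp(ui, q_levels, np.log(experts[pi])))
                          for ui, pi in zip(u, pick)])

      print("aggregated 5/50/95 percentiles:",
            np.round(np.percentile(samples, [5, 50, 95]), 2))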

  6. Transient analysis for thermal margin with COASISO during a severe accident

    SciTech Connect

    Kim, Chan S.; Chu, Ho S.; Suh, Kune Y.; Park, Goon C.; Lee, Un C.; Yoon, Ho J.

    2002-07-01

    As an IVR-EVC (in-vessel retention through external vessel cooling) design concept, external cooling of the reactor vessel was suggested to protect the lower head from being overheated due to relocated material from the core during a severe accident. The COASISO (Corium Attack Syndrome Immunization Structure Outside the vessel) adopts an external vessel cooling strategy of flooding the reactor vessel inside the thermal insulator. Its advantage is the quick response time, so that the initial heat removal mechanism of the EVC is nucleate boiling from the downward-facing lower head. The efficiency of the COASISO may be estimated by the thermal margin, defined as the ratio of the actual heat flux from the reactor vessel to the critical heat flux (CHF). In this study the thermal margin for a large power reactor such as the APR1400 (Advanced Power Reactor 1400 MWe) was determined by means of transient analysis for the local condition of the coolant and temperature distributions within the reactor vessel. The heat split fraction in the oxide pool and the metal layer focusing effect were considered during calculation of the angular thermal load at the inner wall of the lower head. The temperature distributions in the reactor vessel resulted in the actual heat flux on the outer wall. The local quality was obtained by solving the simplified transient energy equation. The unheated section of the reactor vessel decreases the thermal margin by means of two-dimensional conduction heat transfer. The peak temperature of the reactor vessel was estimated in the film boiling region, where the thermal margin equals unity. Sensitivity analyses were performed for the time of corium relocation after the reactor trip, the coolant flow rate, and the initial subcooled condition of the coolant. There is no vessel failure predicted at the worst EVC condition when stratification between the metal layer and the oxidic pool is not taken into account. The present predictive tool may be implemented in a severe accident analysis code like MAAP4 for the external vessel cooling with the COASISO. (authors)
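
    The thermal margin defined above is just the ratio of local wall heat flux to the local CHF; the vessel is predicted to stay in nucleate boiling while the ratio stays below one everywhere. A schematic evaluation over the lower-head angle, with invented flux and CHF shapes (not the APR1400 data):

      import numpy as np

      # Angle from the bottom of the lower head (0 deg) to the equator (90 deg).
      theta = np.linspace(0.0, 90.0, 10)

      # Invented angular profiles, kW/m^2: heat flux from the corium pool peaks
      # near the equator (metal-layer focusing), and CHF also rises with angle.
      q_wall = 300.0 + 1500.0 * (theta / 90.0) ** 2
      q_chf = 1400.0 + 600.0 * (theta / 90.0)

      margin = q_wall / q_chf   # thermal margin; film boiling expected where >= 1
      for t, m in zip(theta, margin):
          flag = " <-- approaching CHF" if m > 0.8 else ""
          print(f"theta = {t:5.1f} deg: q/q_CHF = {m:.2f}{flag}")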

  7. Interpretation methodology and analysis of in-flight lightning data

    NASA Technical Reports Server (NTRS)

    Rudolph, T.; Perala, R. A.

    1982-01-01

    A methodology is presented whereby electromagnetic measurements of inflight lightning stroke data can be understood and extended to other aircraft. Recent measurements made on the NASA F106B aircraft indicate that sophisticated numerical techniques and new developments in corona modeling are required to fully understand the data. Thus the problem is nontrivial and successful interpretation can lead to a significant understanding of the lightning/aircraft interaction event. This is of particular importance because of the problem of lightning induced transient upset of new technology low level microcircuitry which is being used in increasing quantities in modern and future avionics. Inflight lightning data is analyzed and lightning environments incident upon the F106B are determined.

  8. The XMM Cluster Survey: X-ray analysis methodology

    NASA Astrophysics Data System (ADS)

    Lloyd-Davies, E. J.; Romer, A. Kathy; Mehrtens, Nicola; Hosmer, Mark; Davidson, Michael; Sabirli, Kivanc; Mann, Robert G.; Hilton, Matt; Liddle, Andrew R.; Viana, Pedro T. P.; Campbell, Heather C.; Collins, Chris A.; Dubois, E. Naomi; Freeman, Peter; Harrison, Craig D.; Hoyle, Ben; Kay, Scott T.; Kuwertz, Emma; Miller, Christopher J.; Nichol, Robert C.; Sahlén, Martin; Stanford, S. A.; Stott, John P.

    2011-11-01

    The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5776 XMM observations used to construct the current XCS source catalogue. A total of 3675 >4σ cluster candidates with >50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg^2. Of these, 993 candidates are detected with >300 background-subtracted X-ray photon counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface brightness fitting for these candidates, as well as to estimate redshifts from the X-ray data alone. A total of 587 (122) X-ray temperatures to a typical accuracy of <40 (<10) per cent have been measured to date. We also present the methodology adopted for determining the selection function of the survey, and show that the extended source detection algorithm is robust to a range of cluster morphologies by inserting mock clusters derived from hydrodynamical simulations into real XMM images. These tests show that a simple isothermal β-profile is sufficient to capture the essential details of the cluster population detected in the archival XMM observations. The redshift follow-up of the XCS cluster sample is presented in a companion paper, together with a first data release of 503 optically confirmed clusters.
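
    The isothermal β-profile mentioned above is a standard parameterization of cluster surface brightness, S(r) = S0 [1 + (r/rc)^2]^(0.5 - 3β). A small sketch evaluating it for typical parameter values (illustrative, not fitted XCS values):

      import numpy as np

      def beta_profile(r, s0, r_core, beta):
          # Isothermal beta-model surface brightness S(r).
          return s0 * (1.0 + (r / r_core) ** 2) ** (0.5 - 3.0 * beta)

      # Illustrative parameters: central brightness 1 (arbitrary units),
      # core radius 150 kpc, beta = 2/3 (a common value for clusters).
      r = np.array([0.0, 75.0, 150.0, 300.0, 600.0])   # kpc
      s = beta_profile(r, 1.0, 150.0, 2.0 / 3.0)

      for ri, si in zip(r, s):
          print(f"r = {ri:5.0f} kpc: S/S0 = {si:.3f}")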

  9. Behavior of a heterogeneous annular FBR core during an unprotected loss of flow accident: Analysis of the primary phase with SAS-SFR

    SciTech Connect

    Massara, S.; Schmitt, D.; Bretault, A.; Lemasson, D.; Darmet, G.; Verwaerde, D.; Struwe, D.; Pfrang, W.; Ponomarev, A.

    2012-07-01

    In the framework of a substantial improvement of FBR core safety connected with the development of a new Gen IV reactor type, heterogeneous cores with innovative features have been carefully analyzed in France since 2009. At EDF R and D, the main goal is to understand whether a strong reduction of the Na-void worth - possibly attaining a negative value - allows a significant improvement of the core behavior during an unprotected loss of flow accident. The physical behavior of such a core is also of interest, before and beyond the (possible) onset of Na boiling. Hence, a cutting-edge heterogeneous design, featuring an annular shape, Na-plena with a B₄C plate and a stepwise modulation of fissile core heights, was developed at EDF by means of the SDDS methodology, with a total Na-void worth of -1 $. The behavior of such a core during the primary phase of a severe accident, initiated by an unprotected loss of flow, is analyzed by means of the SAS-SFR code. This study is carried out at KIT and EDF, in the framework of a scientific collaboration on innovative FBR severe accident analyses. The results show that the reduction of the Na-void worth is very effective, but is not sufficient alone to avoid Na boiling and, hence, to prevent the core from entering the primary phase of a severe accident. Nevertheless, the grace time up to boiling onset is greatly increased in comparison with a more traditional homogeneous core design, and only an extremely low fraction of the fuel (<0.1%) melts by the end of this phase. A sensitivity analysis shows that, due to the inherent neutronic characteristics of such a core, the gagging scheme plays a major role in the core behavior: indeed, an improved 4-zone gagging scheme, associated with an enhanced control rod drive line expansion feedback effect, finally prevents the core from entering sodium boiling. This major conclusion highlights both the progress already accomplished and the need for more detailed future analyses, particularly concerning: the neutronic burn-up scheme, the modeling of the diagrid effect and the control rod drive line expansion feedbacks, as well as the primary/secondary systems thermal-hydraulic behavior. (authors)

  10. Fault Tree Analysis: An Emerging Methodology for Instructional Science.

    ERIC Educational Resources Information Center

    Wood, R. Kent; And Others

    1979-01-01

    Describes Fault Tree Analysis, a tool for systems analysis which attempts to identify possible modes of failure in systems to increase the probability of success. The article defines the technique and presents the steps of FTA construction, focusing on its application to education. (RAO)
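
    For readers unfamiliar with FTA quantification, a toy sketch follows: with independent basic events, an AND gate multiplies probabilities and an OR gate combines complements. The tree and the numbers are invented for illustration.

        # Minimal fault-tree gate quantification with independent basic events:
        # AND gate -> product of probabilities; OR gate -> 1 - prod(1 - p).
        def p_and(probs):
            out = 1.0
            for p in probs:
                out *= p
            return out

        def p_or(probs):
            out = 1.0
            for p in probs:
                out *= (1.0 - p)
            return 1.0 - out

        # Made-up classroom example: top event occurs if
        # (projector fails OR bulb fails) AND no spare is available.
        p_top = p_and([p_or([0.02, 0.05]), 0.10])
        print(f"P(top event) = {p_top:.4f}")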

  11. Personnel protection means. Part 2: Methodology for safety analysis

    NASA Astrophysics Data System (ADS)

    A method for the analysis of safety risks and for the choice of adequate safety means in an industrial environment is proposed. An analysis worksheet is presented in which parts of the human body and the risk factors are cross-related. Another worksheet for the evaluation of the efficiency of the proposed safety means is also described.

  12. Adapting Job Analysis Methodology to Improve Evaluation Practice

    ERIC Educational Resources Information Center

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  13. The first steps towards a standardized methodology for CSP electricity yield analysis.

    SciTech Connect

    Wagner, Michael; Hirsch, Tobias (Institute of Technical Thermodynamics, Stuttgart, Germany); Benitez, Daniel; Eck, Markus (Institute of Technical Thermodynamics, Stuttgart, Germany); Ho, Clifford Kuofei

    2010-08-01

    The authors have founded a temporary international core team to prepare a SolarPACES activity aimed at the standardization of a methodology for electricity yield analysis of CSP plants. This core team has drafted a structural framework for a standardized methodology and the standardization process itself. The structural framework has to assure that the standardized methodology is applicable to all conceivable CSP systems, can be used on all levels of the project development process and covers all aspects affecting the electricity yield of CSP plants. Since the development of the standardized methodology is a complex task, the standardization process has been structured in work packages, and numerous international experts covering all aspects of CSP yield analysis have been asked to contribute to this process. These experts have teamed up in an international working group with the objective to develop, document and publish standardized methodologies for CSP yield analysis. This paper summarizes the intended standardization process and presents the structural framework of the methodology for CSP yield analysis.

  14. Development of methodology for horizontal axis wind turbine dynamic analysis

    NASA Technical Reports Server (NTRS)

    Dugundji, J.

    1982-01-01

    Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbines; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.

  15. Severe accident analysis of TMI-1 seal LOCA scenario using MELCOR 1.8.2

    SciTech Connect

    Alammar, M.A.

    1994-12-31

    The pump seal LOCA scenario for the Three Mile Island Unit 1 Nuclear Power Plant is analyzed using the NRC's MELCOR 1.8.2 code. This scenario was a major contributor to containment failure for the LOCA group in the IPE Level 2 analysis, which was done using EPRI's MAAP3B code. The main purpose of this paper is to determine whether conclusions regarding the impact of this scenario on containment performance would have been different had MELCOR been used instead of MAAP3B. The major areas addressed were in-vessel and ex-vessel phenomena. For the in-vessel part, three major stages of a severe accident were investigated, namely (1) thermal-hydraulic behavior before core uncovery; (2) core heatup, relocation, and hydrogen generation; and (3) lower head failure. For the ex-vessel part, the following were addressed: (1) corium-concrete interaction; (2) containment failure; and (3) source term release. It is shown that the same conclusions are reached with regard to containment performance and its impact on Level 2 results.

  16. Sensitivity analysis of a ship accident at a deep-ocean site in the northwest Atlantic

    SciTech Connect

    Kaplan, M.F.

    1985-04-01

    This report presents the results of a sensitivity analysis for an HLW ship accident occurring in the Nares Abyssal Plain in the northwestern Atlantic. Waste form release rate, canister lifetime and sorption in the water column (partition coefficients) were varied. Also investigated were the relative importance of the dose from the food chain and from seaweed in the diet. Peak individual doses and integrated collective doses for populations were the units of comparison. In accordance with international guidelines on radiological protection, the comparisons of different options were carried out over "all time"; the study uses a million-year time frame. Partition coefficients have the most pronounced effect on collective dose of the parameters studied. Variations in partition coefficients affect the shape of the collective dose curve over the entire time frame. Peak individual doses decrease markedly when the value for the sorption of americium is increased, but show no increase when less sorption is assumed. Waste form release rates and canister lifetimes affect collective doses only in periods prior to 20,000 years. Hence, comparisons of these options need not be carried out beyond 20,000 years. Waste form release rates below 10⁻³/yr (nominal value) affect individual doses in a linear manner, i.e., an order-of-magnitude reduction in release rate leads to an order-of-magnitude reduction in peak individual dose. Little reduction in peak individual doses is seen with canister lifetimes extended beyond the nominal 100 years. 32 refs., 14 figs., 16 tabs.

  17. Methodology for statistical analysis of SENCAR mouse skin assay data.

    PubMed Central

    Stober, J A

    1986-01-01

    Various response measures and statistical methods appropriate for the analysis of data collected in the SENCAR mouse skin assay are examined. The characteristics of the tumor response data do not readily lend themselves to the classical methods for hypothesis testing. The advantages and limitations of conventional methods of analysis and methods recommended in the literature are discussed. Several alternative response measures that were developed specifically to answer the problems inherent in the data collected in the SENCAR bioassay system are described. These measures take into account animal survival, tumor multiplicity, and tumor regression. Statistical methods for the analysis of these measures to test for a positive dose response and a dose-response relationship are discussed. Sample data from representative initiation/promotion studies are used to compare the response measures and methods of analysis. PMID:3780632

  18. Methodologies and techniques for analysis of network flow data

    SciTech Connect

    Bobyshev, A.; Grigoriev, M.; /Fermilab

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.
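
    As a concrete instance of the kind of statistical traffic-pattern analysis described, the sketch below aggregates per-flow byte counts into "top talkers". The records and field names are hypothetical, not Fermilab's schema or tools.

        from collections import defaultdict

        # Illustrative flow-analysis step: aggregate per-flow byte counts by
        # source address. Real tools parse NetFlow/sFlow exports from the
        # border routers and core switches; these records are invented.
        flows = [
            {"src": "10.0.0.5", "dst": "10.0.1.9", "bytes": 120_000},
            {"src": "10.0.0.5", "dst": "10.0.2.2", "bytes": 80_000},
            {"src": "10.0.3.7", "dst": "10.0.1.9", "bytes": 30_000},
        ]
        by_src = defaultdict(int)
        for f in flows:
            by_src[f["src"]] += f["bytes"]
        for src, nbytes in sorted(by_src.items(), key=lambda kv: -kv[1]):
            print(f"{src:12s} {nbytes:>10d} bytes")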

  19. Analysis of labour accidents in tunnel construction and introduction of prevention measures.

    PubMed

    Kikkawa, Naotaka; Itoh, Kazuya; Hori, Tomohito; Toyosawa, Yasuo; Orense, Rolando P

    2015-12-01

    At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a subsequent decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction when compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents that have the characteristics of a rock fall event at a work site. We also introduced accident prevention measures against rock fall events. PMID:26027707

  20. Traffic Analysis and Road Accidents: A Case Study of Hyderabad using GIS

    NASA Astrophysics Data System (ADS)

    Bhagyaiah, M.; Shrinagesh, B.

    2014-06-01

    Globalization has impacted many developing countries across the world. India is one such country, which benefited the most. Increased economic activity raised the consumption levels of people across the country. This created scope for an increase in travel and transportation. The increase in vehicles over the last 10 years has put a lot of pressure on the existing roads, ultimately resulting in road accidents. It is estimated that since 2001 there has been an increase of 202 percent in two-wheeler and 286 percent in four-wheeler vehicles, with no road expansion. Motor vehicle crashes are a common cause of death, disability and demand for emergency medical care. Globally, more than 1 million people die each year from traffic crashes and about 20-50 million are injured or permanently disabled. There has been an increasing trend in road accidents in Hyderabad over the last few years. GIS helps in locating the accident hotspots and in analyzing the trend of road accidents in Hyderabad.

  2. Source terms for analysis of accidents at a high level waste repository

    SciTech Connect

    Mubayi, V.; Davis, R.E.; Youngblood, R.

    1989-01-01

    This paper describes an approach to identifying source terms from possible accidents during the preclosure phase of a high-level nuclear waste repository. A review of the literature on repository safety analyses indicated that source term estimation is in a preliminary stage, largely based on judgement-based scoping analyses. The approach developed here was to partition the accident space into domains defined by certain threshold values of temperature and impact energy density which may arise in potential accidents and specify release fractions of various radionuclides, present in the waste form, in each domain. Along with a more quantitative understanding of accident phenomenology, this approach should help in achieving a clearer perspective on scenarios important to preclosure safety assessments of geologic repositories. 18 refs., 3 tabs.
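
    The partitioning idea lends itself to a small sketch: bin each postulated accident by temperature and impact-energy-density thresholds and look up an assumed release fraction for the resulting domain. All thresholds and fractions below are hypothetical placeholders, not values from the study.

        # Sketch of domain partitioning: each accident is binned by whether it
        # exceeds temperature and impact-energy-density thresholds, and each
        # domain carries an assumed release fraction. All numbers hypothetical.
        T_THRESH = 800.0  # deg C, illustrative
        E_THRESH = 50.0   # J/cm^3, illustrative
        RELEASE_FRACTIONS = {  # domain -> assumed volatile release fraction
            ("low-T", "low-E"): 1e-6,
            ("low-T", "high-E"): 1e-4,
            ("high-T", "low-E"): 1e-3,
            ("high-T", "high-E"): 1e-2,
        }

        def domain(temp_c, energy_density):
            t = "high-T" if temp_c >= T_THRESH else "low-T"
            e = "high-E" if energy_density >= E_THRESH else "low-E"
            return (t, e)

        print(RELEASE_FRACTIONS[domain(950.0, 12.0)])  # -> 1e-3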

  3. Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications

    PubMed Central

    Lourenço, Célia; Turner, Claire

    2014-01-01

    Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. The concentrations of volatile organic compounds (VOCs) in breath are assessed with acceptable accuracy by means of analytical techniques with high sensitivity, accuracy, precision, low response time, and low detection limit, which are desirable characteristics for the detection of VOCs in human breath. “Breath fingerprinting”, indicative of a specific clinical status, relies on the use of multivariate statistical methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037

  4. Preclosure radiological safety analysis for accident conditions of the potential Yucca Mountain Repository: Underground facilities; Yucca Mountain Site Characterization Project

    SciTech Connect

    Ma, C.W.; Sit, R.C.; Zavoshy, S.J.; Jardine, L.J.; Laub, T.W.

    1992-06-01

    This preliminary preclosure radiological safety analysis assesses the scenarios, probabilities, and potential radiological consequences associated with postulated accidents in the underground facility of the potential Yucca Mountain repository. The analysis follows a probabilistic-risk-assessment approach. Twenty-one event trees resulting in 129 accident scenarios are developed. Most of the scenarios have estimated annual probabilities ranging from 10⁻¹¹/yr to 10⁻⁵/yr. The study identifies 33 scenarios that could result in offsite doses over 50 mrem and that have annual probabilities greater than 10⁻⁹/yr. The largest offsite dose is calculated to be 220 mrem, which is less than the 500 mrem value used to define items important to safety in 10 CFR 60. The study does not address an estimate of uncertainties; therefore conclusions or decisions made as a result of this report should be made with caution.

  5. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

  6. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting residual strength of fuselage shell-type structures; and development of accurate, efficient analysis, design and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  7. [Comparative analysis of traffic accident injuries and injuries from land mines].

    PubMed

    Hasaničević, E

    1998-01-01

    The observation period was from 1 January 1996 to 31 December 1997. The purpose of this research is to determine the frequency of injuries (from traffic accidents and contact mines) treated in the Emergency Medical Service at Tesanj, and their mutual relation. We kept a protocol in which we registered the personal data, cause, time and type of injuries, treatment and the outcome. We found that 0.68% of injuries treated in the Emergency Department were caused by traffic accidents and contact mines. Traffic injuries were the dominant ones (90%). Males were injured more often than females. Children accounted for 18% of the injured. The critical month for traffic accidents was May, and for injuries by contact mines it was April. Traffic injuries showed a tendency to increase, and injuries by contact mines a tendency to decrease, in comparison with cases in 1996. In the Emergency Medical Service, 92% of victims arrived in the first thirty minutes after injury. The accidents mostly happened between 08-12 a.m. and 08-12 p.m.; they rarely happened between 04-08 a.m. The head was the most injured body part in traffic accidents (35%) and the lower limbs in contact mine injuries (around 30%). Among contact mine injuries, 62.5% of amputations were at the level of the lower leg. Mortality was 17.6% for contact mine injuries and 2.7% for traffic injuries. PMID:9769641

  8. Biomechanical analysis of occupant kinematics in rollover motor vehicle accidents: dynamic spit test.

    PubMed

    Sances, Anthony; Kumaresan, Srirangam; Clarke, Richard; Herbst, Brian; Meyer, Steve

    2005-01-01

    A better understanding of occupant kinematics in rollover accidents helps to advance biomechanical knowledge and to enhance the safety features of motor vehicles. While many rollover accident simulation studies have adopted the static approach to delineate the occupant kinematics in rollover accidents, very few studies have attempted the dynamic approach. The present work was designed to study the biomechanics of restrained occupants during rollover accidents using the steady-state dynamic spit test and to address the importance of keeping the lap belt fastened. Experimental tests were conducted using an anthropometric 50% Hybrid III dummy in a vehicle. The vehicle was rotated at 180 degrees/second and the dummy was restrained using a standard three-point restraint system. The lap belt of the dummy was fastened either by using the cinching latch plate or by locking the retractor. Three configurations of shoulder belt harness were simulated: shoulder belt loose on chest with cinch plate, shoulder belt under the left arm and shoulder belt behind the chest. In all tests, the dummy stayed within the confinement of the vehicle indicating that the securely fastened lap belt holds the dummy with dynamic movement of 3 1/2" to 4". The results show that occupant movement in rollover accidents is least affected by various shoulder harness positions with a securely fastened lap belt. The present study forms a first step in delineating the biomechanics of occupants in rollover accidents. PMID:15850090

  9. Gene set analysis of genome-wide association studies: Methodological issues and perspectives

    E-print Network

    Blakely, Randy

    Review of gene set analysis of genome-wide association studies, covering methodological issues and perspectives. Gene set analysis tests disease association with genetic variants in a group of functionally related genes. (15 April 2011; available online 30 April 2011. Keywords: genome-wide association study; gene set; pathway.)

  10. Statistical shape analysis of tuber roots: a methodological case study on laser scanned

    E-print Network

    Rumpf, Martin

    A methodological case study on statistical shape analysis of laser-scanned tuber roots, applied to sugar beets of different cultivars. The motivation comes from breeding and precision agriculture, which require shape analysis of larger samples of sugar beet of different cultivars.

  11. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    ERIC Educational Resources Information Center

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  12. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    ERIC Educational Resources Information Center

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to the identification of important literature in endocrinology reveal that the association between rankings of cited items using the two methods is not statistically significant, and that use of citation or historical analysis alone will not result in the same set of literature. Forty-two sources are appended. (EJS)

  13. APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

  14. APT Blanket System Loss-of-Flow Accident (LOFA) Analysis Based on Initial Conceptual Design - Case 1: with Beam Shutdown and Active RHR

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report.

  15. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
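
    The incremental ('delta') form can be illustrated on any linear system: the residual is computed with the exact operator, while the correction comes from an approximate one, so the converged answer is unaffected by the approximation. The sketch below uses a small random matrix as a stand-in for the factored flow operator.

        import numpy as np

        # Incremental ("delta"/"correction") form: instead of solving A x = b
        # directly, iterate M dx = b - A x_n, x_{n+1} = x_n + dx, where M is an
        # approximate operator (standing in for the spatially split approximate
        # factorization). The matrices are small random stand-ins.
        rng = np.random.default_rng(0)
        A = np.eye(5) * 4.0 + rng.normal(scale=0.3, size=(5, 5))
        b = rng.normal(size=5)
        M = np.diag(np.diag(A))  # crude approximate operator
        x = np.zeros(5)
        for n in range(50):
            r = b - A @ x                   # residual of the exact equations
            x = x + np.linalg.solve(M, r)   # correction from approximate operator
        print(np.linalg.norm(b - A @ x))    # converges toward the exact solution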

  16. The epidemiology and cost analysis of patients presented to Emergency Department following traffic accidents

    PubMed Central

    Karadana, Gökçe Akgül; Aksu, Nalan Metin; Akkaş, Meltem; Akman, Canan; Üzümcügil, Akın; Özmen, M. Mahir

    2013-01-01

    Background Traffic accidents are ranked first as the cause of personal injury throughout the world. The high number of traffic accidents yielding injuries and fatalities makes them of great importance to Emergency Departments. Material/Methods Patients admitted to Hacettepe University Faculty of Medicine Adult Emergency Department due to traffic accidents were investigated epidemiologically. Differences between groups were evaluated by Kruskal-Wallis, Mann-Whitney, and Wilcoxon tests. A value of p<0.05 was accepted as statistically significant. Results We included 2003 patients over 16 years of age. The mean age was 39.6±16.1 and 55% were males. Admissions by ambulance and due to motor vehicle accidents were the most common. In 2004 the rate of traffic accidents (15.3%) was higher than in the other years, the most common month was May (10.8%), and the most common time period was 6 pm to 12 am (midnight). About half of the patients (51.5%) were admitted in the first 30 minutes. A life-threatening condition was present in 9.6% of the patients. Head trauma was the most common type of trauma, with a rate of 18.3%. Mortality rate was 81.8%. The average length of hospital stay was 403 minutes (6.7 hours) and the average cost per patient was 983±4364 TL. Conclusions Further studies are needed to compare the cost found in this study with the mean cost for Turkey. However, the most important step to reduce the direct and indirect costs due to traffic accidents is the prevention of these accidents. PMID:24316815

  17. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

  18. Statistical theory and methodology for remote sensing data analysis

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1974-01-01

    A model is developed for the evaluation of acreages (proportions) of different crop types over a geographical area using a classification approach, and methods for estimating the crop acreages are given. In estimating the acreage of a specific crop type such as wheat, it is suggested to treat the problem as a two-crop problem: wheat vs. non-wheat, since this simplifies the estimation problem considerably. The error analysis and the sample size problem are investigated for the two-crop approach. Certain numerical results for sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large-area crop acreage inventory a sampling scheme is suggested for acquiring sample data, and the problem of crop acreage estimation and the error analysis is discussed.
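
    For the two-crop case, a standard correction for classifier error illustrates the proportion estimate: if q is the fraction of pixels labelled wheat, Pd the probability a wheat pixel is so labelled, and Pfa the false-alarm probability, then q = p·Pd + (1 - p)·Pfa, so p = (q - Pfa)/(Pd - Pfa). The sketch below applies this with illustrative numbers; it is not necessarily the report's exact estimator.

        # Two-crop (wheat vs. non-wheat) proportion estimate corrected for
        # classifier error: invert q = p*Pd + (1 - p)*Pfa.
        def corrected_proportion(q, pd, pfa):
            return (q - pfa) / (pd - pfa)

        # Illustrative numbers, not from the JSC-ERTS-1 example:
        print(corrected_proportion(q=0.35, pd=0.90, pfa=0.08))  # ~0.329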

  19. Accident investigation

    NASA Technical Reports Server (NTRS)

    Laynor, William G. Bud

    1987-01-01

    The National Transportation Safety Board (NTSB) has attributed wind shear as a cause or contributing factor in 15 accidents involving transport-category airplanes since 1970. Nine of these were nonfatal, but the other six accounted for 440 lives. Five of the fatal accidents and seven of the nonfatal accidents involved encounters with convective downbursts or microbursts. Of the other accidents, two nonfatal ones were encounters with a frontal system shear, and one fatal accident was the result of a terrain-induced wind shear. These accidents are discussed with reference to helping pilots avoid wind shear or, when avoidance is impossible, fly through it.

  20. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    SciTech Connect

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
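
    The core OBEST notion, that probabilistic branching is integral to the object model so every generated scenario carries a likelihood, can be shown in miniature. The objects, branches, and probabilities below are invented for illustration, not taken from the runway-incursion application.

        import itertools

        # Toy scenario-tree sketch: each object's behaviour branches
        # probabilistically, and each scenario's likelihood is the product of
        # its branch probabilities.
        BRANCHES = {
            "tower_instruction": [("correct", 0.995), ("erroneous", 0.005)],
            "pilot_response":    [("complies", 0.99), ("misreads", 0.01)],
        }

        for outcome in itertools.product(*BRANCHES.values()):
            names = [name for name, _ in outcome]
            likelihood = 1.0
            for _, p in outcome:
                likelihood *= p
            print(" / ".join(names), f"-> {likelihood:.3e}")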

  1. Retrospection of Chernobyl nuclear accident for decision analysis concerning remedial actions in Ukraine

    SciTech Connect

    Georgievskiy, Vladimir

    2007-07-01

    The efficacy of decisions concerning remedial actions is considered for cases where off-site radiological monitoring in the early and (or) intermediate phases was absent or uninformative. There are examples of such situations in the former Soviet Union where many people have been exposed: releases of radioactive materials from 'Krasnoyarsk-26' into the Enisey River, releases of radioactive materials from 'Chelabinsk-65' (the Kishtim accident), nuclear tests at the Semipalatinsk Test Site, the Chernobyl nuclear accident, etc. If monitoring in the early and (or) intermediate phases is absent, decisions concerning remedial actions are usually developed on the basis of permanent monitoring. However, decisions of this kind may be essentially erroneous. For these cases it is proposed to reconstruct retrospectively the radiological data of the early and intermediate phases of the nuclear accident and to design decisions concerning remedial actions on the basis of both the retrospective data and permanent monitoring data. In this report the problem is considered using the example of the Chernobyl accident in Ukraine, where off-site radiological monitoring in the early and intermediate phases was unsatisfactory. In particular, pasture-cow-milk monitoring had not been performed. All official decisions concerning dose estimations had been made on the basis of measurements of ¹³⁷Cs in the body (40 measurements within 135 days and 55 measurements within 229 days after the Chernobyl accident). For the retrospection of radiological data of the Chernobyl accident a dynamic model has been developed, with a structure similar to those of the Pathway and Farmland models. Parameters of the developed model have been identified for the agricultural conditions of Russia and Ukraine. By means of this model, the dynamics of 20 radionuclides in pathways and the dynamics of doses have been estimated for the early, intermediate and late phases of the Chernobyl accident. The main results are the following: - During the first year after the Chernobyl accident, 75-93% of the committed effective dose was formed; - During the first year after the Chernobyl accident, 85-90% of the damage from radiation exposure was formed; during the next 50 years (the late phase of the accident) only 10-15% of the damage will be formed; - Remedial actions in Ukraine (agricultural remedial actions being the most effective) are intended to reduce the damage from consumption of production contaminated in the late phase of the accident, i.e., agricultural remedial actions address only about 10% of the total damage from radiation exposure; - Medical countermeasures can reduce the radiation exposure damage by an order of magnitude more than agricultural countermeasures; - Thus, retrospection of the nuclear accident has essentially changed the type of remedial actions and given a chance to increase the effectiveness of spending by an order of magnitude. This example illustrates that, in order to optimize remedial actions, data from the retrospection of nuclear accidents should be used in all cases where monitoring in the early and (or) intermediate phases is unsatisfactory. (author)
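
    The pathway modelling the abstract describes can be suggested in miniature with a first-order pasture-to-milk transfer chain integrated by forward Euler; the rate constants below are illustrative stand-ins, not the model's fitted Russian/Ukrainian parameters.

        # Minimal pasture-cow-milk compartment sketch: first-order transfer and
        # effective loss, forward-Euler integration. Rate constants invented.
        k_loss_pasture = 0.05  # 1/day, weathering + decay off pasture
        k_to_milk = 0.01       # 1/day, lumped intake-to-milk transfer
        k_loss_milk = 0.10     # 1/day, turnover of the milk compartment
        dt, days = 0.1, 120
        pasture, milk = 1.0, 0.0  # normalised initial deposition on pasture
        history = []
        for step in range(int(days / dt)):
            d_pasture = -k_loss_pasture * pasture
            d_milk = k_to_milk * pasture - k_loss_milk * milk
            pasture += dt * d_pasture
            milk += dt * d_milk
            history.append(milk)
        print(f"peak milk level = {max(history):.4f} (normalised)")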

  2. Analysis of 121 fatal passenger car-adult pedestrian accidents in China.

    PubMed

    Zhao, Hui; Yin, Zhiyong; Yang, Guangyu; Che, Xingping; Xie, Jingru; Huang, Wei; Wang, Zhengguo

    2014-10-01

    To study the characteristics of fatal vehicle-pedestrian accidents in China, a team was established and passenger car-pedestrian crash cases occurring between 2006 and 2011 in Beijing and Chongqing, China, were collected. A total of 121 fatal passenger car-adult pedestrian collisions were sampled and analyzed. The pedestrian injuries were scored according to the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). The demographic distributions of fatal pedestrian accidents differed from those of other pedestrian accidents. Among the victims, no significant discrepancy in the distribution of ISS and AIS in head, thorax, abdomen, and extremities by pedestrian age was found, while pedestrian behaviors prior to the crashes may have affected the ISS. The distributions of AIS in head, thorax, and abdomen among the fatalities did not show any association with impact speeds or vehicle types, whereas there was a strong relationship between the ISS and impact speeds. Whether pedestrians died at the accident scene or not was not associated with the ISS or AIS. The present results may be useful not only for forensic experts but also for vehicle safety researchers. More investigations regarding fatal pedestrian accidents need to be conducted in greater detail. PMID:25287805

  3. Accident analysis of large-scale technological disasters applied to an anaesthetic complication.

    PubMed

    Eagle, C J; Davies, J M; Reason, J

    1992-02-01

    The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of failures, both of which must be sought. The first group, active failures, consists of mistakes made by practitioners in the provision of care. The second group, latent failures, represents flaws in the administrative and productive system. The model emphasizes the search for latent failures and shows that prevention of active failures alone is insufficient to avoid further accidents if latent failures persist unchanged. These key features and the utility of this model are illustrated by application to a case of aspiration of gastric contents. While four active failures were recognized, an equal number of latent failures also became apparent. The identification of both types of failures permitted the formulation of recommendations to avoid further occurrences. Thus this model of accident causation can provide a useful mechanism to investigate and possibly prevent anaesthetic accidents. PMID:1544192

  4. Methodology for Establishment of Integrated Flood Analysis System

    NASA Astrophysics Data System (ADS)

    Kim, B.; Sanders, B. F.; Kim, K.; Han, K.; Famiglietti, J. S.

    2012-12-01

    Flood risk management efforts face considerable uncertainty in flood hazard delineation as a consequence of changing climatic conditions, including shifts in precipitation, soil moisture, and land use. These changes can confound efforts to characterize flood impacts over decadal time scales and thus raise questions about the true benefits and drawbacks of alternative flood management projects, both structural and non-structural. Here we report an integrated flood analysis system designed to bring climate change information into a flood risk context and to characterize flood hazards in both rural and urban areas. A distributed rainfall-runoff model, the one-dimensional (1D) NWS-FLDWAV model, the 1D Storm Water Management Model (SWMM) and the two-dimensional (2D) BreZo model are coupled. The distributed model, using multi-directional flow allocation and real-time updating, is used for rainfall-runoff analysis in ungauged watersheds, and its outputs are taken as boundary conditions to the FLDWAV model, which is employed for 1D river hydraulic routing and for predicting the overflow discharge at levees that are overtopped. In addition, SWMM is chosen to analyze storm sewer flow in urban areas, and BreZo is used to estimate the inundation zones, depths and velocities due to surcharge flow from the sewer system or overflow at levees on the land surface. The overflow from FLDWAV and the surcharged flow from SWMM become point sources in BreZo. Applications in Korea and California are presented.

  5. Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models

    SciTech Connect

    S. A. Eide; D. M. Rasmuson; C. L. Atwood

    2008-09-01

    The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways NUREG-1829 information can be used to estimate LOCA frequencies were investigated and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.
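
    One common way such frequencies are estimated from operating experience is a Bayesian update with a Jeffreys prior, under which n events in T reactor-years give a posterior mean of (n + 0.5)/T. The sketch below assumes that approach, with invented counts rather than the NUREG/CR-6928 data.

        # Jeffreys-prior frequency update for an initiating event: with n
        # events in T exposure-years, the posterior is Gamma(n + 0.5, T)
        # with mean (n + 0.5) / T. Counts below are illustrative.
        def jeffreys_mean(n_events, exposure_years):
            return (n_events + 0.5) / exposure_years

        print(f"SLOCA frequency ~ {jeffreys_mean(1, 2500.0):.2e} /yr")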

  6. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    NASA Technical Reports Server (NTRS)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  7. Analysis of main steam isolation valve leakage in design basis accidents using MELCOR 1.8.6 and RADTRAD.

    SciTech Connect

    Salay, Michael; Kalinich, Donald A.; Gauntt, Randall O.; Radel, Tracy E.

    2008-10-01

    Analyses were performed using MELCOR and RADTRAD to investigate main steam isolation valve (MSIV) leakage behavior under design basis accident (DBA) loss-of-coolant (LOCA) conditions that are presumed to have led to a significant core melt accident. Doses to the control room, site boundary and LPZ are examined using both the approaches described in current regulatory guidelines and analyses based on a best-estimate source term and system response. At issue is the current practice of using containment airborne aerosol concentrations as a surrogate for the in-vessel aerosol concentration that exists in the near vicinity of the MSIVs. This study finds that the current practice of using AST-based containment aerosol concentrations for assessing MSIV leakage is non-conservative and conceptually in error. A methodology is proposed that scales the containment aerosol concentration to the expected vessel concentration in order to preserve the simplified use of the AST in assessing containment performance under assumed DBA conditions. This correction is required during the first two hours of the accident, while the gap and early in-vessel source terms are present. It is general practice to assume that at ~2 hours, recovery actions to reflood the core will have been successful and that further core damage can be avoided. The analyses performed in this study determine that after two hours, assuming vessel reflooding has taken place, the containment aerosol concentration can conservatively be used as the effective source to the leaking MSIVs. Recommendations are provided concerning typical aerosol removal coefficients that can be used in the RADTRAD code to predict source attenuation in the steam lines, and concerning robust methods of predicting MSIV leakage flows based on measured MSIV leakage performance.

  8. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    SciTech Connect

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution and several researchers are making headway in this problem. However, the inability to easily determine the magnitude of the building’s effective thermal mass, and how the heating ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
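
    A minimal sketch of the modal idea on a lumped thermal network follows: for C dT/dt = -K T, the generalized eigenproblem K v = λ C v yields the energy-transfer modes and their time constants 1/λ. The three-node capacitances and conductances are illustrative, not values from the Boise building.

        import numpy as np
        from scipy.linalg import eigh

        # Lumped thermal network C dT/dt = -K T: generalized eigenvectors of
        # (K, C) are the energy-transfer modes; 1/eigenvalue is each mode's
        # time constant. Node values (air, structure, contents) are invented.
        C = np.diag([5.0e6, 2.0e7, 1.0e6])      # heat capacities, J/K
        K = np.array([[ 60.0, -50.0, -10.0],    # conductances, W/K
                      [-50.0,  55.0,  -5.0],
                      [-10.0,  -5.0,  15.0]])
        vals, vecs = eigh(K, C)                 # K v = lam C v, lam = 1/tau
        for lam, v in zip(vals, vecs.T):
            tau_h = (1.0 / lam) / 3600.0 if lam > 1e-12 else float("inf")
            shape = np.round(v / np.max(np.abs(v)), 2)
            print(f"tau = {tau_h:8.1f} h, mode shape = {shape}")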

  9. Methodology for analysis and simulation of large multidisciplinary problems

    NASA Technical Reports Server (NTRS)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  10. The failure analysis of composite material flight helmets as an aid in aircraft accident investigation.

    PubMed

    Caine, Y G; Bain-Ungerson, O; Schochat, I; Marom, G

    1991-06-01

    Understanding why a flying helmet fails to maintain its integrity during an accident can contribute to an understanding of the mechanism of injury and even of the accident itself. We performed a post-accident evaluation of failure modes in glass and aramid fibre-reinforced composite helmets. Optical and microscopic (SEM) techniques were employed to identify specific fracture mechanisms. They were correlated with the failure mode. Stress and energy levels were estimated from the damage extent. Damage could be resolved into distinct impact, flexure and compression components. Delamination was identified as a specific mode, dependent upon the matrix material and bonding between the layers. From the energy dissipated in specific fracture mechanisms we calculated the minimum total energy imparted to the helmet-head combination and the major injury vector (MIV) direction and magnitude. The level of protection provided by the helmet can also be estimated. PMID:1859350

  11. Analysis of hospitalization occurred due to motorcycles accidents in São Paulo city

    PubMed Central

    Gorios, Carlos; Armond, Jane de Eston; Rodrigues, Cintia Leci; Pernambuco, Henrique; Iporre, Ramiro Ortiz; Colombo-Souza, Patrícia

    2015-01-01

    OBJECTIVE: To characterize the motorcycle accidents that occurred in the city of São Paulo, SP, Brazil in the year 2013, with emphasis on information about hospital admissions from SIH/SUS. METHODS: This is a retrospective cross-sectional study covering 5,597 motorcyclists injured in traffic accidents during 2013 in the city of São Paulo. A survey was conducted using secondary data from the Hospital Information System of the Unified Health System (SIH/SUS). RESULTS: In 2013, in the city of São Paulo, there were 5,597 admissions of motorcyclists injured in traffic accidents, of whom 89.8% were male. The admission diagnoses were leg fracture, femur fracture, and intracranial injury. CONCLUSION: This study confirms other preliminary studies on several points, among which stands out the higher prevalence of young male adults. Level of Evidence II, Retrospective Study. PMID:26327804

  12. A Comprehensive Analysis of the X-15 Flight 3-65 Accident

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Orr, Jeb S.; Barshi, Immanuel; Statler, Irving C.

    2014-01-01

    The November 15, 1967, loss of X-15 Flight 3-65-97 (hereafter referred to as Flight 3-65) was a unique incident in that it was the first and only aerospace flight accident involving loss of crew on a vehicle with an adaptive flight control system (AFCS). In addition, Flight 3-65 remains the only instance of a single-pilot departure from controlled flight of a manned entry vehicle in a hypersonic flight regime. To mitigate risk to emerging aerospace systems, the NASA Engineering and Safety Center (NESC) proposed a comprehensive review of this accident. The goal of the assessment was to resolve lingering questions regarding the failure modes of the aircraft systems (including the AFCS) and to thoroughly analyze the interactions among the human agents and autonomous systems that contributed to the loss of the pilot and aircraft. This document contains the outcome of the accident review.

  13. Launch Vehicle Fire Accident Preliminary Analysis of a Liquid-Metal Cooled Thermionic Nuclear Reactor: TOPAZ-II

    NASA Astrophysics Data System (ADS)

    Hu, G.; Zhao, S.; Ruan, K.

    2012-01-01

    In this paper, a launch vehicle propellant fire accident analysis of the TOPAZ-II reactor has been performed with TATRHG(A), a thermionic reactor core analysis code developed by the author. When a rocket explodes on a launch pad, its payload - TOPAZ-II - can be subjected to a severe thermal environment from the resulting fireball. The extreme temperatures associated with propellant fires can create a destructive environment in or near the fireball. Different kinds of propellant - liquid and solid - which lead to different fire temperatures are considered. Preliminary analysis shows that solid propellant fires can melt the whole toxic beryllium radial reflector.

  14. Rough set approach for accident chains exploration.

    PubMed

    Wong, Jinn-Tsai; Chung, Yi-Shih

    2007-05-01

    This paper presents a novel non-parametric methodology--rough set theory--for accident occurrence exploration. The rough set theory allows researchers to analyze accidents in multiple dimensions and to model accident occurrence as factor chains. Factor chains are composed of driver characteristics, trip characteristics, driver behavior and environment factors that imply typical accident occurrence. A real-world database (2003 Taiwan single auto-vehicle accidents) is used as an example to demonstrate the proposed approach. The results show that although most accident patterns are unique, some accident patterns are significant and worth noting. Student drivers who are young and less experienced exhibit a relatively high possibility of being involved in off-road accidents on roads with a speed limit between 51 and 79 km/h under normal driving circumstances. Notably, for bump-into-facility accidents, wet surface is a distinctive environmental factor. PMID:17166475
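
    A miniature rough-set sketch follows: records are partitioned into indiscernibility classes by their condition attributes, and the lower/upper approximations of a target outcome collect the classes that certainly/possibly imply it. The toy records are invented, not the 2003 Taiwan data.

        from collections import defaultdict

        # Rough-set basics: group records by condition attributes
        # (indiscernibility), then approximate the target set "off-road".
        records = [
            ("student", "young", "off-road"),
            ("student", "young", "off-road"),
            ("student", "young", "other"),
            ("licensed", "older", "other"),
        ]
        classes = defaultdict(list)
        for driver, age, outcome in records:
            classes[(driver, age)].append(outcome)

        lower = [k for k, v in classes.items() if all(o == "off-road" for o in v)]
        upper = [k for k, v in classes.items() if any(o == "off-road" for o in v)]
        print("lower approximation:", lower)  # certainly off-road patterns
        print("upper approximation:", upper)  # possibly off-road patterns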

  15. The effects of aircraft certification rules on general aviation accidents

    NASA Astrophysics Data System (ADS)

    Anderson, Carolina Lenz

    The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification, to determine whether differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether the qualitative data differed among the certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from an initial set of simpler design standards into a comprehensive and strict set of rules addressing the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least government oversight and are the fastest-growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules, but modifying these airplanes has become lengthy and expensive. The study used a mixed-methods design involving both quantitative and qualitative elements. A Chi-Square test was used for quantitative analysis of accident frequency among certification categories, and an ANCOVA test was used for analysis of accident rates. The qualitative component applied text mining techniques to the narrative cause descriptions contained within the accident reports. The Chi-Square test indicated no significant difference in the number of accidents among the certification categories when either Controlled Flight into Terrain or Structural Failure was listed as the cause; there was, however, a significant difference in the frequency of Loss of Control and Engine Failure accidents. The ANCOVA test indicated no significant difference in the accident rate with regard to Loss of Control, Controlled Flight into Terrain, or Structural Failure accidents, but a significant difference in Engine Failure accidents between Experimental-Amateur Built and the other categories. The text mining analysis of the narrative causes of Loss of Control accidents indicated that only the Civil Air Regulations 3 airplanes had clusters of words associated with visual flight into instrument meteorological conditions. Civil Air Regulations 3 airplanes were designed and manufactured prior to the 1960s and in most cases have not been retrofitted with newer technologies that could help prevent Loss of Control accidents. The study indicated that general aviation aircraft certification rules do not have a statistically significant effect on accidents except for Loss of Control and Engine Failure. According to the literature, government oversight could have become an obstacle to the adoption of safety-enhancing equipment that could reduce Loss of Control accidents. Oversight should instead focus on ensuring that Experimental-Amateur Built aircraft owners perform a functional test that could prevent some of the Engine Failure accidents.
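
    The frequency comparison described above can be reproduced with a standard contingency-table test. A minimal sketch, assuming hypothetical accident counts by cause and certification category rather than the study's data:

```python
# Minimal sketch of a chi-square test of accident frequency across
# certification categories; all counts are hypothetical, not the study's data.
from scipy.stats import chi2_contingency

# Rows: accident cause; columns: Part 23, CAR 3, LSA, E-AB
counts = [
    [120, 95, 30, 160],  # Loss of Control
    [40, 35, 10, 45],    # Engine Failure
]
chi2, p, dof, _expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.05:
    print("Reject H0: accident frequency is not independent of certification category.")
```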

  16. Methodology for object-oriented real-time systems analysis and design: Software engineering

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is not object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the logical models of object-oriented real-time systems analysis through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
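
    A minimal sketch of the second approach's basic building block: a concurrent entity whose time behavior is a set of states plus state-transition rules, which later maps one-to-one onto a software object. The class and events are illustrative, not the paper's notation:

```python
# A "real-time systems-analysis object" sketch: behavior defined entirely by
# a state set and a state-transition table, later realizable as a software object.
class ValveController:
    # state-transition table: (state, event) -> new state
    TRANSITIONS = {
        ("closed", "open_cmd"): "opening",
        ("opening", "limit_hit"): "open",
        ("open", "close_cmd"): "closing",
        ("closing", "limit_hit"): "closed",
    }

    def __init__(self):
        self.state = "closed"

    def handle(self, event: str) -> str:
        # Events with no defined transition are ignored, a common choice
        # for reactive objects.
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state

v = ValveController()
for ev in ("open_cmd", "limit_hit", "close_cmd", "limit_hit"):
    print(ev, "->", v.handle(ev))
```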

  17. Accident analysis for high-level waste management alternatives in the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement

    SciTech Connect

    Folga, S.; Mueller, C.; Roglans-Ribas, J.

    1994-02-01

    A comparative generic accident analysis was performed for the programmatic alternatives for high-level waste (HLW) management in the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement (EM PEIS). The key facilities and operations of the five major HLW management phases were considered: current storage, retrieval, pretreatment, treatment, and interim canister storage. A spectrum of accidents covering the risk-dominant accidents was analyzed. Preliminary results are presented for HLW management at the Hanford site. A comparison of these results with those previously advanced shows fair agreement.

  18. A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2007-01-01

    Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings of ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure, and other high-level classifications in longitudinal studies of accident reports. Our results suggest that the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In consequence, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.
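
    The underlying bookkeeping of such a longitudinal study is a tally of high-level classifications across many reports, converted to proportions; a minimal sketch with illustrative categories and data rather than NTSB/TSB findings:

```python
# Tally causal/contributory factor classifications across a set of reports
# and compute the proportion each represents. Data here are illustrative.
from collections import Counter

report_factors = [
    ["human_error", "equipment_failure"],
    ["management", "regulatory"],
    ["equipment_failure"],
    ["human_error", "management"],
]
tally = Counter(f for report in report_factors for f in report)
total = sum(tally.values())
for factor, n in tally.most_common():
    print(f"{factor:20s} {n:3d}  ({n / total:.1%} of cited factors)")
```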

  19. Risk Analysis for Public Consumption: Media Coverage of the Ginna Nuclear Reactor Accident.

    ERIC Educational Resources Information Center

    Dunwoody, Sharon; And Others

    Researchers have determined that the lay public makes risk judgments in ways that are very different from those advocated by scientists. Noting that these differences have caused considerable concern among those who promote and regulate health and safety, a study examined media coverage of the accident at the Robert E. Ginna nuclear power plant…

  20. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.

    PubMed

    Goldhaber, M K; Staub, S L; Tokuhata, G K

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357
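
    The life table method referenced here accumulates week-by-week conditional probabilities of loss while adjusting for withdrawals; a minimal actuarial sketch with hypothetical counts, not the TMI registry data:

```python
# Actuarial life-table estimate of cumulative incidence of fetal loss.
# Withdrawals (censored pregnancies) are counted as half-exposed per interval.
def life_table_incidence(intervals):
    """intervals: list of (n_entering, losses, withdrawals) per gestational week."""
    surviving = 1.0
    for n, d, w in intervals:
        at_risk = n - w / 2.0            # actuarial adjustment for censoring
        surviving *= (1.0 - d / at_risk)
    return 1.0 - surviving               # cumulative incidence of loss

# Hypothetical weekly counts for weeks of gestation up to 16
weeks = [(1000, 20, 10), (970, 15, 8), (947, 10, 12), (925, 8, 5)]
print(f"Estimated incidence of loss: {life_table_incidence(weeks):.1%}")
```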

  2. MELCOR Accident Analysis for ARIES-ACT

    E-print Network

    Humrickhouse, Paul W.; Merrill, Brad J. (University of California at San Diego)

    Describes a MELCOR analysis of a loss-of-flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with Li... The heat transfer system is designed to remove heat by natural circulation during a LOFA; the MELCOR model includes the confinement building and is thus able to transfer heat from the circulating water to ambient air.

  3. A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.

    ERIC Educational Resources Information Center

    Cantor, Jeffrey A.

    1991-01-01

    A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…

  4. Applications of Tutoring Systems in Specialized Subject Areas: An Analysis of Skills, Methodologies, and Results.

    ERIC Educational Resources Information Center

    Heron, Timothy E.; Welsch, Richard G.; Goddard, Yvonne L.

    2003-01-01

    This article reviews how tutoring systems have been applied across specialized subject areas (e.g., music, horticulture, health and safety, social interactions). It summarizes findings, provides an analysis of skills learned within each tutoring system, identifies the respective methodologies, and reports relevant findings, implications, and…

  5. Analysis of Folk Tales for Prosocial Behavior among Slaves: Exploration of a Methodology.

    ERIC Educational Resources Information Center

    Harrison, Algea Othella

    The purpose of this paper is to discuss some of the methodological problems in using a content analysis of slave folk tales as a source for incidences of prosocial behaviors. Forty tales taken from four collections of black folktales will be included in this study. The categories for scoring prosocial behavior include cooperativeness, altruism,…

  6. Using Exergy Analysis Methodology to Assess the Heating Efficiency of an Electric Heat Pump 

    E-print Network

    Ao, Y.; Duanmu, L.; Shen, S.

    2006-01-01

    The authors, using exergy analysis methodology, propose that the assessment of heating efficiency should consider not only the COP (Coefficient of Performance) of the electric heat pump set (EPHPS, or HP set), but also the exergy loss at the heat exchanger of the HP...
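
    A generic second-law (exergy) efficiency for heating, in the spirit of the truncated abstract but not necessarily the authors' exact formulation, treats electrical input as pure exergy and credits heat delivered at T_h with the Carnot factor 1 - T0/T_h:

```python
# Generic exergy (second-law) heating efficiency sketch: electricity is pure
# exergy, while heat delivered at T_h has exergy Q * (1 - T0/T_h) relative to
# ambient T0, so psi = COP * (1 - T0/T_h). Values below are illustrative.
def heating_exergy_efficiency(cop: float, t_supply_k: float, t_ambient_k: float) -> float:
    carnot_factor = 1.0 - t_ambient_k / t_supply_k
    return cop * carnot_factor

# Example: COP = 3.5, supply at 45 C (318.15 K), ambient at 0 C (273.15 K)
print(f"psi = {heating_exergy_efficiency(3.5, 318.15, 273.15):.2f}")
```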

  7. SNP-based pathway enrichment analysis for genome-wide association studies

    E-print Network

    Yu, Zhaoxia

    SNP-based pathway enrichment analysis for genome-wide association studies. ...multiple SNPs within a gene and multiple genes within a pathway. Most current methods choose the most significant SNP... a SNP-based pathway enrichment method for GWAS studies. The method consists of the following two main...
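
    A common building block of pathway-enrichment methods of this kind is a one-sided hypergeometric test on the overlap between significant genes (e.g., the best SNP per gene passing a threshold) and a pathway's gene set; a minimal sketch with illustrative counts, not this paper's specific method:

```python
# One-sided hypergeometric enrichment test: is the overlap between the
# significant gene list and the pathway larger than chance? Counts are illustrative.
from scipy.stats import hypergeom

M = 20000   # genes in the background
K = 150     # genes annotated to the pathway
n = 500     # significant genes (e.g., best SNP per gene passing a threshold)
k = 12      # overlap: significant genes that are also in the pathway

# P(X >= k) under sampling without replacement
p_enrich = hypergeom.sf(k - 1, M, K, n)
print(f"enrichment p-value = {p_enrich:.3g}")
```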

  8. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    PubMed Central

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  9. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    SciTech Connect

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  10. A Methodology for Empirical Analysis of Brain Connectivity through Graph Mining

    E-print Network

    Xie, Mengjun

    A Methodology for Empirical Analysis of Brain Connectivity through Graph Mining. Jiang Bian, Josh M. ...; Brain Imaging Research Center, Psychiatric Research Institute, University of Arkansas for Medical Sciences. ...functional brain connectivity networks, and has helped researchers conceive the effects of neurological...

  11. Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys

    Cancer.gov

    Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys. Cristine Delnevo, PhD, MPH, UMDNJ-School of Public Health. Why is methods research in tobacco surveillance important? Measuring individual behavior over time is crucial...

  12. Occupational accidents aboard merchant ships

    PubMed Central

    Hansen, H; Nielsen, D; Frydenberg, M

    2002-01-01

    Objectives: To investigate the frequency, circumstances, and causes of occupational accidents aboard merchant ships in international trade, and to identify risk factors for the occurrence of occupational accidents as well as dangerous working situations where possible preventive measures may be initiated. Methods: The study is a historical follow up on occupational accidents among crew aboard Danish merchant ships in the period 1993–7. Data were extracted from the Danish Maritime Authority and insurance data. Exact data on time at risk were available. Results: A total of 1993 accidents were identified during a total of 31 140 years at sea. Among these, 209 accidents resulted in permanent disability of 5% or more, and 27 were fatal. The mean risk of having an occupational accident was 6.4/100 years at sea and the risk of an accident causing a permanent disability of 5% or more was 0.67/100 years aboard. Relative risks for notified accidents and accidents causing permanent disability of 5% or more were calculated in a multivariate analysis including ship type, occupation, age, time on board, change of ship since last employment period, and nationality. Foreigners had a considerably lower recorded rate of accidents than Danish citizens. Age was a major risk factor for accidents causing permanent disability. Change of ship and the first period aboard a particular ship were identified as risk factors. Walking from one place to another aboard the ship caused serious accidents. The most serious accidents happened on deck. Conclusions: It was possible to clearly identify work situations and specific risk factors for accidents aboard merchant ships. Most accidents happened while performing daily routine duties. Preventive measures should focus on workplace instructions for all important functions aboard and also on the prevention of accidents caused by walking around aboard the ship. PMID:11850550
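
    The quoted rates follow directly from events divided by person-time at risk; this sketch reproduces the arithmetic using the numbers given in the abstract:

```python
# Incidence-rate arithmetic behind the abstract's figures:
# events per 100 years at sea = 100 * events / person-years at risk.
accidents = 1993
severe = 209             # accidents causing permanent disability of 5% or more
years_at_sea = 31_140

rate = 100 * accidents / years_at_sea
severe_rate = 100 * severe / years_at_sea
print(f"all notified accidents: {rate:.1f} per 100 years at sea")   # ~6.4
print(f"disability >= 5%:       {severe_rate:.2f} per 100 years")   # ~0.67
```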

  13. Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

    2014-05-01

    Among the various radioactive nuclides emitted from the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident, Iodine-131 displayed high radioactivity just after the accident. Moreover, if taken into the human body, Iodine-131 concentrates in the thyroid and may cause thyroid cancer. Recognition of the risk posed by the Iodine-131 dose originated from the experience of the Chernobyl accident, based on epidemiological study [1]. It is thus important to investigate the detailed deposition distribution of I-131 in order to evaluate the radiation dose due to I-131 and watch for effects on human health. However, I-131 decays so rapidly (half-life = 8.02 d) that it cannot be detected several months after the accident. By the time the risk of I-131 was recognized following Chernobyl, several years had passed since the accident. Reconstruction of the I-131 distribution from the Cs-137 distribution was not successful, because iodine and cesium behave differently owing to their different chemical properties. The long-lived radioactive isotope I-129 (half-life = 1.57E+7 yr), which like I-131 is a fission product, is an ideal proxy for I-131 because the two are chemically identical. Several studies tried to quantify I-129 in the 1990s, but the analytical techniques, especially AMS (Accelerator Mass Spectrometry), were not yet well developed and the available AMS facilities were limited. Moreover, because of the lack of sufficient data on I-131 just after the accident, the isotopic ratio I-129/I-131 of the Chernobyl-derived iodine could not be estimated precisely [2]; calculated estimates of the isotopic ratio showed scattered results. For the FDNPP accident, by contrast, the detailed I-131 distribution is being successfully reconstructed through systematic I-129 measurements by our group. We measured soil samples selected from a collection taken from every 2 km (or 5 km, in more distant areas) meshed region around FDNPP, conducted by the Japanese Ministry of Science and Education in June 2011. So far more than 500 samples have been measured and their I-129 deposition determined by AMS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. The measurement error from AMS is less than 5%, typically 3%. The overall uncertainty is estimated at less than 30%, including the uncertainty of the nominal value of the standard reference material used, that of the I-129/I-131 ratio estimation, that of the "representativeness" of the analyzed sample for its region, etc. The isotopic ratio I-129/I-131 from the reactor was estimated to be 22.3 +- 6.3 as of March 11, 2011 [3], from a series of samples collected by a group of The University of Tokyo on April 20, 2011, for which I-131 was determined by gamma-ray spectrometry with good precision. Complementarily, we investigated the depth profile of the accident-derived I-129 in soil and its migration speed after deposition, and found that more than 90% of the I-129 was concentrated within the top 5 cm layer and that the downward migration speed was less than 1 cm/yr [4]. From the set of I-129 data, the corresponding I-131 depositions were calculated, and the distribution map is being constructed; various fine structures of the distribution have come into sight. [1] Y. Nikiforov and D. R. Gnepp, 1994, Cancer, Vol. 47, pp748-766. [2] T. Straume, et al., 1996, Health Physics, Vol. 71, pp733-740. [3] Y. Miyake, H. Matsuzaki et al., 2012, Geochem. J., Vol. 46, pp327-333. [4] M. Honda, H. Matsuzaki et al., under submission.
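
    The back-calculation at the heart of the method converts a measured I-129 deposition into an I-131 deposition via the atom ratio at shutdown, then applies radioactive decay; a minimal sketch using the ratio quoted above and an illustrative deposition value:

```python
# Reconstruct I-131 activity from measured I-129 deposition:
# N131(t0) = N129 / (I-129/I-131 atom ratio at shutdown), then decay to date.
import math

HALF_LIFE_I131_D = 8.02      # days
RATIO_129_131 = 22.3         # atom ratio at reactor shutdown (March 11, 2011)

def i131_activity_bq_per_m2(n129_atoms_per_m2: float, days_after_shutdown: float) -> float:
    n131_t0 = n129_atoms_per_m2 / RATIO_129_131            # I-131 atoms at shutdown
    lam = math.log(2) / (HALF_LIFE_I131_D * 86400.0)       # decay constant, 1/s
    n131 = n131_t0 * math.exp(-math.log(2) * days_after_shutdown / HALF_LIFE_I131_D)
    return lam * n131                                      # activity A = lambda * N

# Illustrative value only: 1e13 I-129 atoms/m2, evaluated 10 days after shutdown
print(f"{i131_activity_bq_per_m2(1e13, 10.0):.3g} Bq/m2")
```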

  14. An analysis of thermionic space nuclear reactor power system: I. Effect of disassembling radial reflector, following a reactivity initiated accident

    SciTech Connect

    El-Genk, M.S.; Paramonov, D.

    1993-01-10

    An analysis is performed to determine the effect of disassembling the radial reflector of the TOPAZ-II reactor following a hypothetical severe Reactivity Initiated Accident (RIA). Such an RIA is assumed to occur during system start-up in orbit due to a malfunction of the drive mechanism of the control drums, causing the drums to rotate the full 180° outward at their maximum speed of 1.4°/s. Results indicate that disassembling only three of the twelve radial reflector panels would successfully shut down the reactor, with little overheating of the fuel and the moderator.
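
    The generic framework behind such control-drum RIA studies is point reactor kinetics driven by a reactivity ramp; a minimal one-delayed-group sketch with illustrative parameters, not TOPAZ-II data:

```python
# One-delayed-group point kinetics under a slow reactivity ramp (explicit Euler).
# All parameters are illustrative, not TOPAZ-II data.
beta, lam, Lam = 0.0065, 0.08, 1e-4   # delayed fraction, precursor decay (1/s), generation time (s)
ramp_rate = 6.5e-5                    # reactivity insertion rate, dk/k per second
n, C = 1.0, beta / (lam * Lam)        # equilibrium initial conditions (power normalized to 1)
dt, t_end, t = 1e-4, 20.0, 0.0

while t < t_end:
    rho = min(ramp_rate * t, 0.5 * beta)        # ramp, capped well below prompt critical
    dn = ((rho - beta) / Lam) * n + lam * C     # dn/dt
    dC = (beta / Lam) * n - lam * C             # dC/dt
    n, C, t = n + dn * dt, C + dC * dt, t + dt

print(f"relative power after {t_end:.0f} s of ramp: {n:.2f}")
```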

  15. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1993-01-01

    In this study involving advanced fluid flow codes, an incremental iterative formulation (also known as the delta or correction form) together with the well-known spatially-split approximate factorization algorithm, is presented for solving the very large sparse systems of linear equations which are associated with aerodynamic sensitivity analysis. For smaller 2D problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods are needed for larger 2D and future 3D applications, however, because direct methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioning of the coefficient matrix; this problem can be overcome when these equations are cast in the incremental form. These and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two sample airfoil problems: (1) subsonic low Reynolds number laminar flow; and (2) transonic high Reynolds number turbulent flow.
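
    A minimal sketch of the incremental (correction) form on a toy sparse system, with an incomplete LU factorization standing in for the paper's spatially split approximate factorization of the true operator:

```python
# Incremental (delta/correction) form: rather than solving A x = b directly,
# repeatedly solve M dx = r with r = b - A x, where M only approximates A.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
A = sp.diags([-1.0, 2.1, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

M = spla.spilu(A, drop_tol=1e-2)     # approximate factorization of A
x = np.zeros(n)
for it in range(100):
    r = b - A @ x                    # residual of the *exact* equations
    if np.linalg.norm(r) < 1e-10:
        break
    x += M.solve(r)                  # correction step: M dx = r
print(f"converged in {it} iterations, ||r|| = {np.linalg.norm(b - A @ x):.2e}")
```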

  16. CORMLT code for the analysis of degraded core accidents. Computer code manual. [PWR]

    SciTech Connect

    Denny, V.E.

    1984-12-01

    A computer code (CORMLT) has been developed to predict the effects of buoyancy-driven convection on the progression of core-degrading accidents in PWR vessels. Thermal/hydraulics modeling includes the downcomer/bottom-head regions, as well as the upper vessel and adjacent hot-leg portions of the primary coolant system, for which gas communication is limited to the intervening discharge nozzles (so-called dead-end volumes). CORMLT requires the flow rates and temperatures of any water feed (to the downcomer) versus time. CORMLT provides the composition, enthalpy, temperature, and flow rate of steam/hydrogen mixtures within the vessel above the (receding) water surface, as well as estimates of these quantities for interaction between the plenum and the rest of the PCS. CORMLT also provides graphical representations of the morphological behavior of the progression of core meltdown accidents.

  17. Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents.

    SciTech Connect

    Majumdar, S.; Diercks, D. R.; Shack, W. J.; Energy Technology

    2002-05-01

    This report summarizes analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes and erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests on tubes with machined flaws. A three-dimensional computational fluid dynamics code was used to calculate the jet velocity impinging on neighboring tubes as a function of tube spacing and crack-opening area. Erosion tests were conducted in a high-temperature, high-velocity erosion rig at the University of Cincinnati, using micrometer-sized nickel particles mixed in with high-temperature gas from a burner. The erosion results, together with analytical models, were used to estimate the erosive effects of superheated steam with entrained aerosols from the core during severe accidents.

  18. A new analysis methodology for the motion of self-propelled particles and its application

    NASA Astrophysics Data System (ADS)

    Byun, Young-Moo; Lammert, Paul; Crespi, Vincent

    2011-03-01

    Microscale self-propelled particles (SPPs) in solution are a growing field of study, with potential applications in nanomedicine and nanorobotics. However, little detailed quantitative analysis of SPP motion has been performed so far, because the self-propelled motion is strongly coupled to Brownian motion; this makes extraction of the intrinsic propulsion mechanisms problematic and has led to inconsistent conclusions. Here, we present a novel way to decompose the motion of the SPP into self-propelled and Brownian components; accurate values for the self-propulsion speed and diffusion coefficients of the SPP are obtained for the first time. We then apply our analysis methodology to ostensible chemotaxis of SPPs and reveal the actual (non-chemotactic) mechanism of the phenomenon, demonstrating that our analysis methodology is a powerful and reliable tool.
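
    One standard way to perform such a decomposition (not necessarily the authors' exact procedure) is to fit the short-time two-dimensional mean-squared displacement to MSD(t) = 4Dt + v^2 t^2, separating the diffusive and ballistic parts:

```python
# Separate self-propulsion from Brownian motion via an MSD fit on a
# synthetic 2-D trajectory: MSD(t) = 4*D*t + v^2*t^2 at short times.
import numpy as np

rng = np.random.default_rng(0)
dt, steps, v_true, d_true = 0.05, 4000, 3.0, 0.5
# Synthetic trajectory: constant drift along x plus Brownian noise in x and y
disp = np.column_stack([
    v_true * dt + np.sqrt(2 * d_true * dt) * rng.standard_normal(steps),
    np.sqrt(2 * d_true * dt) * rng.standard_normal(steps),
])
pos = np.cumsum(disp, axis=0)

lags = np.arange(1, 21)
msd = np.array([np.mean(np.sum((pos[k:] - pos[:-k]) ** 2, axis=1)) for k in lags])
t = lags * dt
# Linear least squares in the basis (t, t^2): MSD = (4D)*t + (v^2)*t^2
coef, *_ = np.linalg.lstsq(np.column_stack([t, t ** 2]), msd, rcond=None)
print(f"D ~ {coef[0] / 4:.2f} (true {d_true}), v ~ {np.sqrt(coef[1]):.2f} (true {v_true})")
```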

  19. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  1. Thermodynamic analysis of spent pyrochemical salts in the stored condition and in viable accident scenarios

    SciTech Connect

    Axler, K.M.

    1994-03-01

    This study involves examining "spent" electrorefining (ER) salts in the form present after usage (as stored), and then after exposure to water in a proposed accident scenario. Additionally, the equilibrium composition of the salt after extended exposure to air was calculated by computer modeling, and those results are also presented herein. It should be noted that these salts are extremely similar to spent MSE salts from the Rocky Flats MSE campaigns using NaCl-KCl-MgCl2.

  2. Analysis method for severe-accident stability of lead-cooled fast reactors

    SciTech Connect

    Delpech, M.; Alekseev, P.N.; Ilyin, D.A.

    1993-12-31

    This paper describes improvement of the safety potential of the lead-cooled fast breeder reactor projects in progress in the Russian Federation. Calculations of accidents and of the responses to perturbations of the steady state are performed. Improvement of the core behavior is achieved by optimization of the feedback coefficients and by insertion of passive systems, as described in the safety-potential assessment for these new projects.

  3. Third annual Warren K. Sinclair keynote address: retrospective analysis of impacts of the Chernobyl accident.

    PubMed

    Balonov, Mikhail

    2007-11-01

    The accident at the Chernobyl Nuclear Power Plant in 1986 was the most severe in the history of the nuclear industry, causing a huge release of radionuclides over large areas of Europe. The recently completed Chernobyl Forum concluded that after a number of years, along with reduction of radiation levels and accumulation of humanitarian consequences, severe social and economic depression of the affected regions and associated psychological problems of the general public and the workers had become the most significant problem to be addressed by the authorities. The majority of the >600,000 emergency and recovery operation workers and five million residents of the contaminated areas in Belarus, Russia, and Ukraine received relatively minor radiation doses which are comparable with the natural background levels. An exception is a cohort of several hundred emergency workers who received high radiation doses and of whom 28 persons died in 1986 due to acute radiation sickness. Apart from the dramatic increase in thyroid cancer incidence among those exposed to radioiodine at a young age and some increase of leukemia in the most exposed workers, there is no clearly demonstrated increase in the somatic diseases due to radiation. There was, however, an increase in psychological problems among the affected population, compounded by the social disruption that followed the break-up of the Soviet Union. Despite the unprecedented scale of the Chernobyl accident, its consequences on the health of people are far less severe than those of the atomic bombings of the cities of Hiroshima and Nagasaki. Studying the consequences of the Chernobyl accident has made an invaluable scientific contribution to the development of nuclear safety, radioecology, radiation medicine and protection, and also the social sciences. The Chernobyl accident initiated the global nuclear and radiation safety regime. PMID:18049216

  4. Accident simulation and consequence analysis in support of MHTGR safety evaluations

    SciTech Connect

    Ball, S.J.; Wichner, R.P.; Smith, O.L.; Conklin, J.C.; Barthold, W.P.

    1991-01-01

    This paper summarizes research performed at Oak Ridge National Laboratory (ORNL) to assist the Nuclear Regulatory Commission (NRC) in preliminary determinations of licensability of the US Department of Energy (DOE) reference design of a standard modular high-temperature gas-cooled reactor (MHTGR). The work described includes independent analyses of core heatup and steam ingress accidents, and the reviews and analyses of fuel performance and fission product transport technology.

  5. SiC MODIFICATIONS TO MELCOR FOR SEVERE ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    Brad J. Merrill; Shannon M Bragg-Sitton

    2013-09-01

    The Department of Energy (DOE) Office of Nuclear Energy (NE) Light Water Reactor (LWR) Sustainability Program encompasses strategic research focused on improving reactor core economics and safety margins through the development of an advanced fuel cladding system. The Fuels Pathway within this program focuses on fuel system components outside of the fuel pellet, allowing for alteration of the existing zirconium-based cladding system through coatings, addition of ceramic sleeves, or complete replacement (e.g., fully ceramic cladding). The DOE-NE Fuel Cycle Research & Development (FCRD) Advanced Fuels Campaign (AFC) is also conducting research on materials for advanced, accident-tolerant fuels and cladding for application in operating LWRs. To aid in this assessment, a silicon carbide (SiC) version of the MELCOR code was developed by substituting SiC in place of Zircaloy in MELCOR's reactor core oxidation and material property routines. The purpose of this development effort is to provide a numerical capability for estimating the safety advantages of replacing Zr-alloy components in LWRs with SiC components. This modified version of the MELCOR code was applied to the Three Mile Island (TMI-2) plant accident. While the results are considered preliminary, SiC cladding showed a dramatic safety advantage over Zircaloy cladding during this accident.

  6. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  7. Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II

    NASA Astrophysics Data System (ADS)

    Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

    2014-03-01

    The present part of the publication (Part II) deals with long-range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear-Test-Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131, and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq emitted during the explosions of units 1, 2, and 3, against an estimated core inventory of about 8 × 10^18 Bq at the time of reactor shutdown. This result suggests that at least 80% of the core inventory was released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. By neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) can be estimated at 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10% of the Chernobyl accident releases for I-131 and Cs-137.

  8. Hazards and accident analyses, an integrated approach, for the Plutonium Facility at Los Alamos National Laboratory

    SciTech Connect

    Pan, P.Y.; Goen, L.K.; Letellier, B.C.; Sasser, M.K.

    1995-07-01

    This paper describes an integrated approach to performing hazards and accident analyses for the Plutonium Facility at Los Alamos National Laboratory. A comprehensive hazards analysis methodology was developed that extends the scope of the preliminary/process hazard analysis methods described in the AIChE Guidelines for Hazard Evaluations. Results from the semi-quantitative approach constitute a full spectrum of hazards. Each accident scenario identified is assigned a bin for event likelihood and for consequence severity, and each is analyzed for four possible sectors (workers, on-site personnel, the public, and the environment). A screening process was developed to link the hazard analysis to the accident analysis: the 840 accident scenarios were screened down to about 15 for a more thorough deterministic analysis to define the operational safety envelope. The mechanics of the screening process in the selection of final scenarios for each representative accident category, i.e., fire, explosion, criticality, and spill, are described.
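
    A minimal sketch of the semi-quantitative binning and screening step; the bin labels, ranking rule, and scenarios are illustrative, not the LANL scheme:

```python
# Assign each scenario a likelihood bin and a consequence-severity bin, then
# keep only high-ranking combinations for detailed deterministic analysis.
LIKELIHOOD = ["beyond-extremely-unlikely", "extremely-unlikely", "unlikely", "anticipated"]
SEVERITY = ["negligible", "low", "moderate", "high"]

def screen(scenarios, keep_threshold=4):
    """Keep scenarios whose combined rank (likelihood index + severity index) is high."""
    kept = []
    for name, like, sev in scenarios:
        rank = LIKELIHOOD.index(like) + SEVERITY.index(sev)
        if rank >= keep_threshold:
            kept.append((name, rank))
    return sorted(kept, key=lambda s: -s[1])

scenarios = [
    ("glovebox fire", "unlikely", "high"),
    ("small solution spill", "anticipated", "negligible"),
    ("criticality", "extremely-unlikely", "high"),
]
print(screen(scenarios))
```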

  9. 77 FR 31600 - Federal Need Analysis Methodology for the 2013-2014 Award Year: Federal Pell Grant, Federal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-29

    Department of Education notice announcing the statutory "Federal Need Analysis Methodology" used to determine student aid eligibility for the 2013-2014 award year under the Federal Pell Grant, Federal Perkins Loan, Federal Work-Study, and Federal Supplemental Educational Opportunity Grant programs.

  10. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    SciTech Connect

    D. A. Brownson

    2002-09-26

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) waste form/waste package degradation; (2) waste package isotopic inventory; (3) criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) probability of criticality (for each potential critical configuration as well as the total event); and (5) criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) open items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002).

  11. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    SciTech Connect

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE-contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.
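
    Crash frequency in DOE-style assessments is commonly built up from a four-factor product summed over aircraft categories and flight phases, F = N × P × f(x, y) × A; a minimal sketch of that general form with illustrative numbers, not data from the ACRAM standard:

```python
# Four-factor aircraft crash frequency sketch:
# F = N (operations/yr) * P (crash probability per operation)
#   * f(x, y) (crash location probability per unit area) * A (effective target area).
def crash_frequency(n_ops_per_yr, p_crash_per_op, f_xy_per_mi2, area_mi2):
    return n_ops_per_yr * p_crash_per_op * f_xy_per_mi2 * area_mi2

freq = 0.0
# Sum over aircraft categories and flight phases (two hypothetical entries)
for n, p, f, a in [
    (50_000, 4e-7, 1e-3, 0.01),   # general aviation overflights
    (5_000, 1e-6, 2e-3, 0.01),    # nearby airport operations
]:
    freq += crash_frequency(n, p, f, a)
print(f"estimated crash frequency: {freq:.2e} per year")
```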

  12. Methodological considerations for the harmonization of non-cholesterol sterol bio-analysis.

    PubMed

    Mackay, Dylan S; Jones, Peter J H; Myrie, Semone B; Plat, Jogchum; Lütjohann, Dieter

    2014-04-15

    Non-cholesterol sterols (NCS) are used as surrogate markers of cholesterol metabolism which can be measured from a single blood sample. Cholesterol precursors are used as markers of endogenous cholesterol synthesis, and plant sterols are used as markers of cholesterol absorption. However, most aspects of NCS analysis show wide variability among researchers within the area of biomedical research. This variability in methodology is a significant contributor to variation between reported NCS values; it undermines confidence in comparing NCS values across different research groups and hampers the ability to conduct meta-analyses. This paper summarizes the considerations and conclusions of a workshop where academic and industrial experts met to discuss NCS measurement. It highlights why each step in the analysis of NCS merits critical consideration, with the hope of moving toward more standardized and comparable NCS analysis methodologies. Alkaline hydrolysis and liquid-liquid extraction of NCS followed by parallel detection on GC-FID and GC-MS is proposed as an ideal methodology for the bio-analysis of NCS. Furthermore, cross-comparison or round-robin testing among the various groups that measure NCS is critical to the standardization of NCS measurement. PMID:24674990

  13. Improved methodology for integral analysis of advanced reactors employing passive safety

    NASA Astrophysics Data System (ADS)

    Muftuoglu, A. Kursad

    After four decades of experience with pressurized water reactors, a new generation of nuclear plants is emerging. These advanced designs employ passive safety, which relies on natural forces such as gravity and natural circulation. The new concept of passive safety also necessitates improvement of the computational tools available for best-estimate analyses. System codes originally designed for high-pressure conditions in the presence of strong momentum sources, such as pumps, are challenged in many ways; in particular, increased interaction of the primary system with the containment necessitates a tool for integral analysis. This study addresses some of these concerns and presents an improved tool for integral analysis coupling the primary system with the containment calculation. The code package is based on the RELAP5 and CONTAIN programs, a best-estimate thermal-hydraulics code for primary-system analysis and a containment code for containment analysis, respectively. Its suitability is demonstrated with a postulated small-break loss-of-coolant accident analysis of the Westinghouse AP600 plant. The thesis explains the details of the analysis, including the coupling model.

  14. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

    2010-01-01

    Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
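
    Two generic approaches to the propagation of errors, of the kind the paper compares, can be contrasted on a derived quantity q = m/V; the inputs and their uncertainties below are illustrative, not the authors' formulas:

```python
# Compare first-order (Taylor) error propagation with Monte Carlo propagation
# for a derived quantity q = m / V, with independent Gaussian inputs.
import numpy as np

m, sigma_m = 10.0, 0.2   # mass and its standard uncertainty
V, sigma_V = 2.0, 0.1    # volume and its standard uncertainty

# 1) First-order propagation: (sigma_q/q)^2 = (sigma_m/m)^2 + (sigma_V/V)^2
q = m / V
sigma_q = q * np.sqrt((sigma_m / m) ** 2 + (sigma_V / V) ** 2)

# 2) Monte Carlo propagation
rng = np.random.default_rng(1)
samples = rng.normal(m, sigma_m, 100_000) / rng.normal(V, sigma_V, 100_000)
print(f"analytic:    q = {q:.3f} +/- {sigma_q:.3f}")
print(f"monte carlo: q = {samples.mean():.3f} +/- {samples.std():.3f}")
```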

  15. Analysis of Japanese radionuclide monitoring data of food before and after the Fukushima nuclear accident.

    PubMed

    Merz, Stefan; Shozugawa, Katsumi; Steinhauser, Georg

    2015-03-01

    In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contaminations in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, accumulating mushrooms and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, which was primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima (137)Cs and (90)Sr levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, whereby meat was typically higher in (137)Cs and vegetarian produce was usually higher in (90)Sr. The correlation of background radiostrontium and radiocesium indicated that the regulatory assumption after the Fukushima accident of a maximum activity of (90)Sr being 10% of the respective (137)Cs concentrations may soon be at risk, as the (90)Sr/(137)Cs ratio increases with time. This should be taken into account for the current Japanese food policy as the current regulation will soon underestimate the (90)Sr content of Japanese foods. PMID:25621976

  17. Methodology for social accountability: multiple methods and feminist, poststructural, psychoanalytic discourse analysis.

    PubMed

    Phillips, D A

    2001-06-01

    Bridging the gap between the individual and social context, methodology that aims to surface and explore the regulatory function of discourse on subjectivity production moves nursing research beyond the individual level in order to theorize social context and its influence on health and well-being. This article describes the feminist, poststructural, psychoanalytic discourse analysis and multiple methods used in a recent study exploring links between cultural discourses of masculinity, performativity of masculinity, and practices of male violence. PMID:11393249

  18. Resolve! Version 2.5: Flammable Gas Accident Analysis Tool Acceptance Test Plan and Test Results

    SciTech Connect

    LAVENDER, J.C.

    2000-10-17

    RESOLVE! Version 2.5 is designed to quantify the risk and uncertainty of combustion accidents in double-shell tanks (DSTs) and single-shell tanks (SSTs). The purpose of the acceptance testing is to ensure that all of the options and features of the computer code run; to verify that the calculated results are consistent with each other; and to evaluate the effects of changes to the parameter values on the frequency and consequence trends associated with flammable gas deflagrations or detonations.

  19. SACO-1: a fast-running LMFBR accident-analysis code

    SciTech Connect

    Mueller, C.J.; Cahalan, J.E.; Vaurio, J.K.

    1980-01-01

    SACO is a fast-running computer code that simulates hypothetical accidents in liquid-metal fast breeder reactors to the point of permanent subcriticality or to the initiation of a prompt-critical excursion. In the tradition of the SAS codes, each subassembly is modeled by a representative fuel pin with three distinct axial regions to simulate the blanket and core regions. However, analytic and integral models are used wherever possible to cut down computing time and storage requirements. The physical models and basic equations are described in detail. Comparisons of SACO results with analogous SAS3D results serve as the qualification of SACO and are illustrated and discussed.

  20. Transient Analysis for Evaluating the Potential Boiling in the High Elevation Emergency Cooling Units of PWR Following a Hypothetical Loss of Coolant Accident (LOCA) and Subsequent Water Hammer Due to Pump Restart

    SciTech Connect

    Husaini, S. Mahmood; Qashu, Riyad K.

    2004-07-01

    The Generic Letter GL-96-06 issued by the U.S. Nuclear Regulatory Commission (NRC) required utilities to evaluate the potential for voiding in their Containment Emergency Cooling Units (ECUs) due to a hypothetical Loss Of Coolant Accident (LOCA) or a Main Steam Line Break (MSLB) accompanied by a Loss Of Offsite Power (LOOP). When offsite power is restored, the Component Cooling Water (CCW) pumps restart, causing water hammer due to cavity closure. EPRI (Electric Power Research Institute) recently performed a research study that recommended a methodology to mitigate the water hammer due to cavity closure. The EPRI methodology credits the cushioning effects of hot steam and released air, which are not considered in conventional water-column-separation analysis. The EPRI study was limited in scope to the evaluation of the water hammer and did not provide guidance for evaluating the occurrence of boiling or the extent of voiding in the ECU piping. This paper presents a complete methodology, based on first principles, for evaluating the onset of boiling. Also presented is a methodology for evaluating the extent of voiding and the water hammer resulting from cavity closure, using an existing generalized computer program based on the Method of Characteristics; the EPRI methodology is then used to mitigate the predicted water hammer. This approach overcomes the inherent complications and difficulties involved in performing hand calculations for water hammer, and the heat transfer analysis provides an alternative to very cumbersome modeling with CFD (computational fluid dynamics) based computer programs. (authors)
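
    A back-of-envelope companion to the described analysis is the Joukowsky relation dP = rho * c * dv, which bounds the surge pressure from a column-rejoining velocity before any steam/air cushioning credit is taken; the values below are illustrative:

```python
# Joukowsky estimate of the uncushioned water-hammer surge pressure.
def joukowsky_surge_pa(rho_kg_m3: float, wave_speed_m_s: float, dv_m_s: float) -> float:
    return rho_kg_m3 * wave_speed_m_s * dv_m_s

# Illustrative: water at ~90 C (rho ~965 kg/m3), acoustic speed ~1200 m/s,
# 3 m/s column-rejoining velocity at cavity closure
dp = joukowsky_surge_pa(965.0, 1200.0, 3.0)
print(f"uncushioned surge: {dp / 1e5:.1f} bar")
```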

  1. Methodology for the analysis of pollutant emissions from a city bus

    NASA Astrophysics Data System (ADS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during typical operation under urban driving conditions. A passenger transportation line in a Spanish city was used as the test circuit. Different ways of processing and representing the data were studied and, from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used, with data recorded at 1 Hz. The proposed methodology allows comparison of results (as mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is shown to be a robust and helpful tool for isolating the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. Acceleration sequences were shown to make the highest contribution to total emissions, whereas deceleration sequences made the least.
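
    A minimal sketch of the category/sequence analysis: label each 1 Hz sample of the speed trace as idle, acceleration, deceleration, or cruise, then average an emission signal per category. Thresholds and data are illustrative:

```python
# Categorize a 1 Hz speed trace into driving sequences and average an
# emission signal per category. Thresholds and signals are illustrative.
import numpy as np

def categorize(speed_kmh: np.ndarray, a_thresh: float = 0.4) -> np.ndarray:
    accel = np.gradient(speed_kmh / 3.6)          # m/s^2 at 1 Hz sampling
    cats = np.full(speed_kmh.shape, "cruise", dtype=object)
    cats[speed_kmh < 2.0] = "idle"
    cats[(accel > a_thresh) & (speed_kmh >= 2.0)] = "acceleration"
    cats[(accel < -a_thresh) & (speed_kmh >= 2.0)] = "deceleration"
    return cats

speed = np.array([0, 0, 5, 12, 20, 26, 30, 30, 29, 20, 10, 3, 0], dtype=float)
nox = np.linspace(0.1, 0.5, speed.size)           # placeholder emission signal
cats = categorize(speed)
for cat in ("idle", "acceleration", "cruise", "deceleration"):
    mask = cats == cat
    if mask.any():
        print(f"{cat:13s} mean NOx = {nox[mask].mean():.2f} g/s (n={mask.sum()})")
```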

  2. Source term and radiological consequences of the Chernobyl accident

    SciTech Connect

    Mourad, R.; Snell, V.

    1987-01-01

    The objective of this work is to assess the source term and to evaluate the maximum hypothetical individual doses in European countries (including the Soviet Union) from the Chernobyl accident through the analyses of measurements of meteorological data, radiation fields, and airborne and deposited activity in these countries. Applying this information to deduce the source term involves a reversal of the techniques of nuclear accident analysis, which estimate the off-site consequences of postulated accidents. In this study the authors predict the quantities of radionuclides that, if released at Chernobyl and following the calculated trajectories, would explain and unify the observed radiation levels and radionuclide concentrations as measured by European countries and the Soviet Union. The simulation uses the PEAR microcomputer program following the methodology described in Canadian Standards Association standard N288.2. The study was performed before the Soviets published their estimate of the source term and the two results are compared.

  3. A systematic review and analysis of factors associated with methodological quality in laparoscopic randomized controlled trials.

    PubMed

    Antoniou, Stavros Athanasios; Andreou, Alexandros; Antoniou, George Athanasios; Bertsias, Antonios; Köhler, Gernot; Koch, Oliver Owen; Pointner, Rudolph; Granderath, Frank-Alexander

    2015-01-01

    Several methods for the assessment of methodological quality in randomized controlled trials (RCTs) have been developed during the past few years. Factors associated with quality in laparoscopic surgery have not been defined to date. The aim of this study was to investigate the relationship between bibliometric characteristics and the methodological quality of laparoscopic RCTs. The PubMed search engine was queried to identify RCTs on minimally invasive surgery published in 2012 in the 10 highest impact factor surgery journals and the 5 highest impact factor laparoscopic journals. Eligible studies were blindly assessed by two independent investigators using the Scottish Intercollegiate Guidelines Network (SIGN) tool for RCTs. Univariate and multivariate analyses were performed to identify potential associations with methodological quality. A total of 114 relevant RCTs were identified. More than half of the trials were of high or acceptable quality. Half of the reports provided information on comparative demographic data and only 21% performed intention-to-treat analysis. RCTs with a sample size of at least 60 patients presented higher methodological quality (p = 0.025). Upon multiple regression, reporting on preoperative care and the experience level of surgeons were independent factors of quality. PMID:25896540

  4. Analysis of an AP600 intermediate-size loss-of-coolant accident

    SciTech Connect

    Boyack, B.E.; Lime, J.F.

    1995-04-01

    A postulated double-ended guillotine break of an AP600 direct-vessel-injection line has been analyzed. This event is characterized as an intermediate-break loss-of-coolant accident. Most of the insights regarding the response of the AP600 safety systems to the postulated accident are derived from calculations performed with the TRAC-PF1/MOD2 code. However, complementary insights derived from a scaled experiment conducted in the ROSA facility, as well as insights based upon calculations by other codes, are also presented. Based upon the calculated and experimental results, the AP600 will not experience a core heat up and will reach a safe shutdown state using only safety-class equipment. Only the early part of the long-term cooling period initiated by In-containment Refueling Water Storage Tank injection was evaluated. Thus, the observation that the core is continuously cooled should be verified for the later phase of the long-term cooling period when sump injection and containment cooling processes are important.

  5. MossWinn—methodological advances in the field of Mössbauer data analysis

    NASA Astrophysics Data System (ADS)

    Klencsár, Zoltán

    2013-04-01

    The methodology of Mössbauer data analysis has been advanced via the development of a novel scientific database system concept and its realization in the field of Mössbauer spectroscopy, as well as by the application of parallel computing techniques for the enhancement of the efficiency of various processes encountered in the practice of Mössbauer data handling and analysis. The present article describes the new database system concept along with details of its realization in the form of the MossWinn Internet Database (MIDB), and illustrates the performance advantage that may be realized on multi-core processor systems by the application of parallel algorithms for the implementation of database system functions.

  6. Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2

    SciTech Connect

    Evans, J.S.; Abrahmson, S.; Bender, M.A.; Boecker, B.B.; Scott, B.R.; Gilbert, E.S.

    1993-10-01

    This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, x-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
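    As a worked illustration of the recommended functional form (with made-up parameter values, not the NUREG/CR-4214 values), a Weibull dose-response can be written as risk = 1 - exp(-ln 2 (D/D50)^V), where D50 is the dose producing the effect in half of those exposed and V controls the steepness:

```python
import math

def weibull_risk(dose_gy: float, d50_gy: float, shape: float) -> float:
    """Weibull dose-response: risk = 1 - exp(-ln(2) * (dose/d50)**shape)."""
    if dose_gy <= 0.0:
        return 0.0
    hazard = math.log(2.0) * (dose_gy / d50_gy) ** shape
    return 1.0 - math.exp(-hazard)

# Risk of an early effect at 4 Gy, assuming D50 = 3 Gy and shape V = 5
print(weibull_risk(4.0, d50_gy=3.0, shape=5.0))  # ~0.95; by construction, risk(3 Gy) = 0.5
```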

  7. Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies

    NASA Astrophysics Data System (ADS)

    Zio, Enrico; Apostolakis, George E.

    1999-03-01

    This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and with the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was to test the robustness of the assessments and to point out possible sources of disagreement among the participating stakeholders, thus providing insights for the subsequent deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling and a two-dimensional policy region analysis, proved sufficient for the task.
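    A minimal sketch of the Monte Carlo part of such an exercise: sample uncertain stakeholder weights, propagate them through an expected-utility ranking of alternatives, and record how often each alternative ranks first. The scores, the Dirichlet weight model, and the number of samples are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical performance scores: 3 restoration options x 3 decision criteria
scores = np.array([[0.8, 0.4, 0.6],
                   [0.5, 0.9, 0.3],
                   [0.6, 0.6, 0.7]])

rank_first = np.zeros(scores.shape[0])
for _ in range(10_000):
    w = rng.dirichlet([2.0, 2.0, 2.0])   # uncertain weights: positive, summing to 1
    utilities = scores @ w               # expected multi-attribute utility per option
    rank_first[np.argmax(utilities)] += 1

# Fraction of samples in which each option ranks first: a simple robustness measure
print(rank_first / rank_first.sum())
```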

  8. Analysis of 303 Road Traffic Accident Victims Seen Dead on Arrival at Emergency Room-Assir Central Hospital

    PubMed Central

    Batouk, Abdul N.; Abu-Eisheh, Nader; Abu-Eshy, Saeed; Al-Shehri, Mohammad; AI-Naami, Mohammad; Jastaniah, Suleiman

    1996-01-01

    Background: Although Road Traffic Accident (RTA) is a noticeably common cause of death in Saudi Arabia, there are no published data showing the relative frequency of RTA as a cause of death. Aim of the study: This study attempted to find out the relative frequency of RTA as a cause of death, to identify the age groups at risk, and to make some inferences from the different types of injuries seen. Methodology: Over a period of four and a half years, 574 patients were seen dead on arrival at the Emergency Department of Assir Central Hospital, Abha, Saudi Arabia. Of these, 303 (52.8%) were victims of RTA. Results: The 303 victims showed a male-to-female ratio of 14:1, 69% Saudi nationals, and an age range of 3 months to 85 years (mean = 34.25 years). The peak age group was between 21 and 49 years, and the peak period of presentation at the Emergency Department was between 12:00 noon and 18:00 hours. The tenth month of the Hegira calendar represented the peak period; a significant (P<0.05) seasonal variation was also seen, with summer the highest. Clinical assessment of the victims revealed that head and neck injuries were the commonest, followed by chest injuries. Conclusion: RTA is the primary cause of death among dead-on-arrival cases, affecting the most active and productive age group. The study recommended the implementation of a pre-hospital emergency medical system. PMID:23008545

  9. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    PubMed Central

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
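    To make the best-performing design concrete, the snippet below implements the referent-selection rule of a time-stratified case-crossover analysis: for each event day, the comparison days are all other days in the same month and year that fall on the same day of the week. This is a generic sketch of the rule, not code from any of the reviewed studies.

```python
from datetime import date, timedelta

def time_stratified_referents(event_day: date) -> list:
    """Referent days: same year, month, and weekday as the event day (excluded)."""
    refs = []
    d = date(event_day.year, event_day.month, 1)
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            refs.append(d)
        d += timedelta(days=1)
    return refs

# A Tuesday event day yields the other four Tuesdays of June 2010 as referents
print(time_stratified_referents(date(2010, 6, 15)))
# -> [2010-06-01, 2010-06-08, 2010-06-22, 2010-06-29]
```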

  10. A Gap Analysis Methodology for Collecting Crop Genepools: A Case Study with Phaseolus Beans

    PubMed Central

    Ramírez-Villegas, Julián; Khoury, Colin; Jarvis, Andy; Debouck, Daniel Gabriel; Guarino, Luigi

    2010-01-01

    Background The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. Methodology/Principal Findings The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to their absence from, or under-representation in, genebanks, 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap “hotspots”, representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to spatial collecting priorities. Conclusions/Significance Results of the gap analysis method mostly align very well with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including in the analysis predictive threat factors, such as climate change or habitat destruction, or by adding additional prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources. PMID:20976009
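    In the spirit of the prioritization described above, the sketch below averages three gap scores (sampling, geographic, environmental) into a final score and maps it to a priority class. The equal weighting and the class thresholds are assumptions for illustration, not the published calibration.

```python
def collecting_priority(sampling_gap: float, geographic_gap: float,
                        environmental_gap: float) -> str:
    """Each gap score lies in [0, 1]; 1 means the taxon is entirely missing ex situ."""
    final_score = (sampling_gap + geographic_gap + environmental_gap) / 3.0
    if final_score >= 0.7:
        return "high priority for collecting"
    if final_score >= 0.5:
        return "medium priority"
    if final_score >= 0.3:
        return "low priority"
    return "adequately represented"

print(collecting_priority(0.9, 0.8, 0.6))  # -> high priority for collecting
```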

  11. Soil moisture retrieval from multi-instrument observations: Information content analysis and retrieval methodology

    NASA Astrophysics Data System (ADS)

    Kolassa, J.; Aires, F.; Polcher, J.; Prigent, C.; Jimenez, C.; Pereira, J. M.

    2013-05-01

    An algorithm has been developed that employs neural network technology to retrieve soil moisture from multi-wavelength satellite observations (active/passive microwave, infrared, and visible). This represents the first step in the development of a methodology aiming to combine beneficial aspects of existing retrieval schemes. Several quality metrics have been developed to assess the performance of a retrieval product on different spatial and temporal scales. Additionally, an innovative approach to estimating the retrieval uncertainty has been proposed. An information content analysis of different satellite observations showed that active microwave observations are best suited to capture the soil moisture temporal variability, while the amplitude of the surface temperature diurnal cycle is best suited to capture the spatial variability. In a synergy analysis, it was found that through the combination of all observations the retrieval uncertainty could be reduced by 13%. Furthermore, it was found that synergy benefits are significantly larger using a data fusion approach compared to an a posteriori combination of retrieval products, supporting the combination of different retrieval methodology aspects in a single algorithm. In a comparison with model data, it was found that the proposed methodology also shows potential to be used for the evaluation of modeled soil moisture. A comparison with in situ observations showed that the algorithm is well able to capture the spatial variability of soil moisture. It was concluded that the temporal performance can be improved through the incorporation of other existing retrieval approaches.
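    A toy version of the retrieval step, assuming scikit-learn and a synthetic stand-in for the multi-wavelength predictors (active microwave backscatter, passive microwave brightness temperature, surface-temperature diurnal amplitude, visible index); none of the data, network sizes, or coefficients reflect the actual algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 4))     # synthetic multi-wavelength predictors
# Synthetic "reference" soil moisture with noise, standing in for training targets
y = 0.2 + 0.1 * X[:, 0] - 0.05 * X[:, 1] + 0.03 * X[:, 2] + rng.normal(0.0, 0.02, 5000)

scaler = StandardScaler().fit(X)
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(scaler.transform(X), y)    # train the retrieval network

print(net.predict(scaler.transform(X[:3])))  # retrieved soil moisture for 3 samples
print(y[:3])                                 # corresponding reference values
```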

  12. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    PubMed

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper a new methodology for the diagnosis of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using filters such as the classic, inverse and k-law nonlinear. The sample images were obtained by a medical specialist, and a new spectral technique was developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%. PMID:26504638
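    A toy version of the k-law step, assuming only that the spectrum's magnitude is raised to a power k < 1 and then reduced to a scalar index; the normalization and the test images are invented for illustration and do not reproduce the published index or its 95.4% confidence level.

```python
import numpy as np

def klaw_spectral_index(image: np.ndarray, k: float = 0.3) -> float:
    """Apply a k-law nonlinearity |F|^k to the image spectrum and reduce it
    to a scalar index (normalization here is an arbitrary choice)."""
    F = np.fft.fftshift(np.fft.fft2(image))
    mag = np.abs(F) ** k            # k < 1 flattens the spectrum, emphasizing detail
    return float(mag.sum() / (mag.max() * image.size))

rng = np.random.default_rng(0)
smooth = rng.normal(size=(64, 64)).cumsum(axis=0).cumsum(axis=1)  # low-frequency pattern
complex_spot = rng.normal(size=(64, 64))                          # high-frequency pattern
print(klaw_spectral_index(smooth), klaw_spectral_index(complex_spot))
```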

  13. Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report

    SciTech Connect

    Gore, B.F.; Huenefeld, J.C.

    1987-07-01

    This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology, and the rationale on which it is based, is presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein.

  14. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    PubMed Central

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper a new methodology for the diagnosis of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using filters such as the classic, inverse and k-law nonlinear. The sample images were obtained by a medical specialist, and a new spectral technique was developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%. PMID:26504638

  15. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    PubMed

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories result as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition). PMID:24760596

  16. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    SciTech Connect

    Hugo, Jacques

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  17. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    NASA Technical Reports Server (NTRS)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  18. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  19. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  20. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  1. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    SciTech Connect

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  2. Tools for improving safety management in the Norwegian Fishing Fleet: occupational accident analysis, period 1998-2006.

    PubMed

    Aasjord, Halvard L

    2006-01-01

    Reporting of human accidents in the Norwegian fishing fleet has always been very difficult because there has been no tradition of reporting all types of working accidents among fishermen if the accident does not seem very serious or there is no economic incentive to report. Therefore reports are only written when the accidents are serious or if the fisherman is reported sick. Reports about an accident are sent to the insurance company, but another report should also be sent to the Norwegian Maritime Directorate (NMD). A comparison of data from one former insurance company and the NMD shows that the real number of injuries or serious accidents among Norwegian fishermen could be up to two times the number reported to the NMD. Special analyses of 1690 accidents from the so-called PUS database (NMD) for the period 1998-2002 show that the calculated risk was 23.6 accidents per 1000 man-years. This is quite a high risk level, and most of the accidents in the fishing fleet were rather serious. The calculated risks are highest for fishermen on board the deep sea fleet of trawlers (28.6 accidents per 1000 man-years) and the deep sea fleet of purse seiners (28.9 accidents per 1000 man-years). Fatal accidents over a longer period of 51.5 years, from 1955 to 2006, are also roughly analysed. These data from SINTEF's own database show that the number of fatal accidents has been decreasing over this long period, except for the two periods 1980-84 and 1990-94, which saw some casualties with total losses of larger vessels and the loss of most of the crew, but also many other typical work accidents on smaller vessels. The total number of registered Norwegian fishermen and the number of man-years have been drastically reduced over the 51.5 years from 1955 to 2006. The risk of fatal accidents has been very steady over time at a high level, although there has been a marked risk reduction since 1990-94. For the last 8.5-year period, January 1998-July 2006, the numbers of fatal accidents and calculated risks are analysed for three main fleet groups. The highest risk factor, 24.8 fatal accidents per 10,000 man-years, is found in the smaller fleet (vessel length Loa < 13 meters). This is 4.1 times higher than in the medium fleet (13 < Loa < 28 meters) and 11.3 times higher than in the deep sea fleet (Loa > 28 meters). PMID:17312696
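    The risk figures quoted above are plain exposure-normalized rates; a minimal check, with the man-years denominator back-computed from the quoted 23.6 per 1000 man-years rather than taken from the paper:

```python
def rate_per_1000_man_years(accidents: int, man_years: float) -> float:
    return 1000.0 * accidents / man_years

# 1690 accidents over an assumed ~71,600 man-years reproduces the quoted rate
print(rate_per_1000_man_years(1690, 71_600))  # ~23.6 accidents per 1000 man-years
```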

  3. Development of a computer-aided fault tree synthesis methodology for quantitative risk analysis in the chemical process industry 

    E-print Network

    Wang, Yanjun

    2005-02-17

    Development of a Computer-Aided Fault Tree Synthesis Methodology for Quantitative Risk Analysis in the Chemical Process Industry. A dissertation by Yanjun Wang, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of Doctor of Philosophy, December 2004. Major subject: Chemical Engineering.

  4. [Accidents and acts of violence in Brazil: I--Analysis of mortality data].

    PubMed

    Jorge, M H; Gawryszewski, V P; Latorre, M do R

    1997-08-01

    External causes are an important cause of death in almost all countries. They are always second or third in the mortality ranking, but their distribution according to type varies from country to country. Mortality due to external causes by type, gender and age, for Brazil as a whole and for state capitals specifically, is analysed. Mortality rates and the proportional mortality from 1977 to 1994 were calculated. The results showed that the number of deaths due to external causes almost doubled from 1977 to 1994, and nowadays this is the second cause of death in Brazil. The mortality rate in 1991 was 69.8 per 100,000 inhabitants, and the highest increase was in the male rates. The male rates are almost 4.5 times greater than the female ones. External causes are the first cause of death among people from 5 to 39 years old, and the majority of these deaths occur between 15 and 19 years of age (65% of the deaths by external causes). Besides the growth itself, it also seems that a shift of deaths to lower ages is occurring. Both mortality by traffic accidents and that by homicide increased over the period from 1977 to 1994. Suicides have been stable and "other external causes" have increased slowly, especially due to falls and drowning. The mortality rates for external causes in state capitals are higher than the average for Brazil as a whole, except for some northeastern capitals. The rates for the capitals in the northern region are the highest in Brazil. In the northeastern region, only Recife, Maceió and Salvador have high rates. In the southeast, Vitória, Rio de Janeiro and S. Paulo have the highest rates in the country, but Belo Horizonte's rates are declining. In the southern region all the capitals showed growth in the rates, as did the capitals of the West-central region. The growth of mortality due to external causes differs by type of external cause in these capitals. Suicide is not a public health problem in Brazil or in the state capitals. Traffic accidents are a major problem in Vitória, Goiânia, Macapá, the Distrito Federal and Curitiba. Homicides have increased greatly in Porto Velho, Rio Branco, Recife, S. Luís, Vitória, S. Paulo, Curitiba, Porto Alegre, Cuiabá and the Distrito Federal. Mortality due to external causes in Brazil has become a major public health problem, especially because of homicides. It is important to emphasize that the quality of the mortality data on external causes is not the same for all capitals, because it is closely related to the quality of legal information. PMID:9595755

  5. An Analysis Methodology for the Gamma-ray Large Area Space Telescope

    NASA Technical Reports Server (NTRS)

    Morris, Robin D.; Cohen-Tanugi, Johann

    2004-01-01

    The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, an alternative to the standard event analysis inherited from high-energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.

  6. Shipping container response to severe highway and railway accident conditions: Appendices

    SciTech Connect

    Fischer, L.E.; Chou, C.K.; Gerhard, M.A.; Kimura, C.Y.; Martin, R.W.; Mensing, R.W.; Mount, M.E.; Witte, M.C.

    1987-02-01

    Volume 2 contains the following appendices: Severe accident data; truck accident data; railroad accident data; highway survey data and bridge column properties; structural analysis; thermal analysis; probability estimation techniques; and benchmarking for computer codes used in impact analysis. (LN)

  7. Episode analysis of deposition of radiocesium from the Fukushima Daiichi nuclear power plant accident.

    PubMed

    Morino, Yu; Ohara, Toshimasa; Watanabe, Mirai; Hayashi, Seiji; Nishizawa, Masato

    2013-03-01

    Chemical transport models played key roles in understanding the atmospheric behaviors and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. However, model results could not be sufficiently evaluated because of limited observational data. We assess the models' performance in simulating the deposition patterns of radiocesium ((137)Cs) by making use of airborne monitoring survey data for the first time. We conducted ten sensitivity simulations to evaluate the atmospheric model uncertainties associated with key model settings, including emission data and wet deposition modules. We found that simulation using emissions estimated with a regional-scale (~500 km) model better reproduced the observed (137)Cs deposition pattern in eastern Japan than simulation using emissions estimated with local-scale (~50 km) or global-scale models. In addition, simulation using a process-based wet deposition module reproduced the observations well, whereas simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed (137)Cs deposition rates in high-deposition areas (≥10 kBq m(-2)) within 1 order of magnitude and showed that deposition of radiocesium over land occurred predominantly during 15-16, 20-23, and 30-31 March 2011. PMID:23391028
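    The scavenging-coefficient approach whose empirical parameters drove the large uncertainties is typically of the form Lambda = a * R^b, applied as exponential depletion over a time step. A sketch with commonly used but here merely illustrative constants:

```python
import numpy as np

def wet_scavenging(conc: np.ndarray, rain_mm_h: float, dt_s: float,
                   a: float = 8.4e-5, b: float = 0.79) -> np.ndarray:
    """Deplete an airborne tracer by washout: Lambda = a * R**b (1/s)."""
    lam = a * rain_mm_h ** b
    return conc * np.exp(-lam * dt_s)

c = np.ones(5)  # arbitrary tracer concentrations
print(wet_scavenging(c, rain_mm_h=5.0, dt_s=3600.0))  # ~66% removed in one hour
```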

  8. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation

    SciTech Connect

    Evans, J.S. (School of Public Health)

    1990-01-01

    This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". The category of "other" cancers is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III, with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
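    For the linear-quadratic form, the excess risk is alpha*D + beta*D^2; the coefficients below are purely illustrative, not the report's parameters:

```python
def lq_excess_risk(dose_gy: float, alpha: float = 2e-3, beta: float = 5e-4) -> float:
    """Linear-quadratic excess cancer risk: alpha*D + beta*D**2 (illustrative)."""
    return alpha * dose_gy + beta * dose_gy ** 2

for d in (0.1, 1.0, 2.0):
    print(d, lq_excess_risk(d))  # the quadratic term matters only at the higher doses
```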

  9. Methodology issues concerning the accuracy of kinematic data collection and analysis using the ariel performance analysis system

    NASA Technical Reports Server (NTRS)

    Wilmington, R. P.; Klute, Glenn K. (editor); Carroll, Amy E. (editor); Stuart, Mark A. (editor); Poliner, Jeff (editor); Rajulu, Sudhakar (editor); Stanush, Julie (editor)

    1992-01-01

    Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body kinematics as well as mechanical system kinematics using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g. force plate data collection) as well as digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to evaluate the accuracy impact due to a single axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.

  10. Supplemental analysis of accident sequences and source terms for waste treatment and storage operations and related facilities for the US Department of Energy waste management programmatic environmental impact statement

    SciTech Connect

    Folga, S.; Mueller, C.; Nabelssi, B.; Kohout, E.; Mishima, J.

    1996-12-01

    This report presents supplemental information for the document Analysis of Accident Sequences and Source Terms at Waste Treatment, Storage, and Disposal Facilities for Waste Generated by US Department of Energy Waste Management Operations. Additional technical support information is supplied concerning treatment of transuranic waste by incineration and considering the Alternative Organic Treatment option for low-level mixed waste. The latest respirable airborne release fraction values published by the US Department of Energy for use in accident analysis have been used and are included as Appendix D, where respirable airborne release fraction is defined as the fraction of material exposed to accident stresses that could become airborne as a result of the accident. A set of dominant waste treatment processes and accident scenarios was selected for a screening-process analysis. A subset of results (release source terms) from this analysis is presented.

  11. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    NASA Astrophysics Data System (ADS)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slow, long transients (with time scales of hours and days) and fast, short transients (with time scales of minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. The feedback due to the influence of leakage was taken into account by developing and implementing improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations that last several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Investigation of different aspects of the coupled methodology and development of an efficient kinetics treatment for the PBMR were carried out, accounting for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, which resulted in the implementation of an improved mapping methodology based on user-defined criteria. The second aspect that was studied and optimized is the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time step selection algorithms. Coupled code convergence was achieved, supplemented by the application of methods to accelerate it. Finally, the modeling of all feedback phenomena in PBMRs was investigated, and a novel treatment of cross-section dependencies was introduced to improve the representation of cross-section variations. The added benefit was that in the process of studying and improving the coupled multi-physics methodology, more insight was gained into the physics and dynamics of the PBMR, which will also help to optimize the PBMR design and improve its safety.
One unique contribution of the PhD research is the investigation of the importance of the correct representation of the three-dimensional (3-D) effects in the PBMR analysis. The performed studies demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.
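    The neutronics/thermal-hydraulics coupling at the heart of such a methodology is, in its simplest form, a fixed-point (Picard) iteration between the two solvers. The sketch below shows the bare scheme with placeholder physics: a power update with an assumed negative temperature feedback coefficient and a linear heat-balance temperature update. None of the constants or model forms are taken from the thesis or from NEM/THERMIX.

```python
def neutronics(power, temperature):
    """Placeholder power update with an assumed negative temperature feedback."""
    alpha = -0.002                      # reactivity feedback per kelvin (assumed)
    return power * (1.0 + alpha * (temperature - 900.0))

def thermal_hydraulics(power):
    """Placeholder steady heat balance: temperature rises linearly with power."""
    return 800.0 + 0.5 * power

power, temp = 150.0, 900.0              # arbitrary starting point
for it in range(200):
    new_power = neutronics(power, temp)
    new_temp = thermal_hydraulics(new_power)
    if abs(new_power - power) < 1e-6 and abs(new_temp - temp) < 1e-6:
        break                           # fixed point reached: fields are consistent
    power, temp = new_power, new_temp

print(f"converged after {it} iterations: power={power:.3f}, T={temp:.3f}")
```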

  12. Severe Accident Scoping Simulations of Accident Tolerant Fuel Concepts for BWRs

    SciTech Connect

    Robb, Kevin R.

    2015-08-01

    Accident-tolerant fuels (ATFs) are fuels and/or cladding that, in comparison with the standard uranium dioxide Zircaloy system, can tolerate loss of active cooling in the core for a considerably longer time period while maintaining or improving fuel performance during normal operations [1]. It is important to note that the currently used uranium dioxide Zircaloy fuel system tolerates design basis accidents (and anticipated operational occurrences and normal operation) as prescribed by the US Nuclear Regulatory Commission. Previously, preliminary simulations of the plant response were performed under a range of accident scenarios using various ATF cladding concepts and fully ceramic microencapsulated fuel. Design basis loss of coolant accidents (LOCAs) and station blackout (SBO) severe accidents were analyzed at Oak Ridge National Laboratory (ORNL) for boiling water reactors (BWRs) [2]. Researchers have investigated the effects of thermal conductivity on design basis accidents [3], investigated silicon carbide (SiC) cladding [4], and examined the effects of ATF concepts on the late-stage accident progression [5]. These preliminary analyses were performed to provide initial insight into the possible improvements that ATF concepts could provide and to identify issues with respect to modeling ATF concepts. More recently, preliminary analyses for a range of ATF concepts have been evaluated internationally for LOCA and severe accident scenarios for the Chinese CPR1000 [6] and the South Korean OPR-1000 [7] pressurized water reactors (PWRs). In addition to these scoping studies, a common methodology and set of performance metrics were developed to compare and support prioritizing ATF concepts [8]. A proposed ATF concept is based on iron-chromium-aluminum alloys (FeCrAl) [9]. With respect to enhancing accident tolerance, FeCrAl alloys have substantially slower oxidation kinetics compared to the zirconium alloys typically employed. During a severe accident, FeCrAl would tend to generate heat and hydrogen from oxidation at a slower rate compared to the zirconium-based alloys in use today. The previous study [2] of the FeCrAl ATF concept during SBO severe accident scenarios in BWRs was based on simulating short-term SBO (STSBO), long-term SBO (LTSBO), and modified SBO scenarios occurring in a BWR-4 reactor with a MARK-I containment. The analysis indicated that FeCrAl had the potential to delay the onset of fuel failure by a few hours depending on the scenario, and that it could delay lower head failure by several hours. The analysis demonstrated reduced in-vessel hydrogen production. However, the work was preliminary and was based on limited knowledge of the material properties of FeCrAl. Limitations of the MELCOR code were identified for direct use in modeling ATF concepts. This effort used an older version of MELCOR (1.8.5). Since these analyses, the BWR model has been updated for use in MELCOR 1.8.6 [10], and more representative material properties for FeCrAl have been modeled. Sections 2-4 present updated analyses of the FeCrAl ATF concept response during severe accidents in a BWR. The purpose of the study is to estimate the potential gains afforded by the FeCrAl ATF concept during BWR SBO scenarios.

  13. Causality analysis in business performance measurement system using system dynamics methodology

    NASA Astrophysics Data System (ADS)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality assumption have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, only 40% of the causal linkages were supported by the data, indicating the insufficiency of well-established causality models. Expert knowledge was suggested for use where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts over 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Both methods revealed the existence of bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among measures. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, an area where very limited work has been done.
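    For readers unfamiliar with the test used here, the snippet below runs a Granger causality test on two short synthetic series with statsmodels; the series, lag order, and sample size (comparable to the study's 45 data points) are illustrative, not the study's data.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
# Two illustrative monthly indices; x is built to lead y by one period,
# so the test has a genuine signal to detect.
x = rng.normal(size=48)
y = 0.8 * np.roll(x, 1) + rng.normal(scale=0.3, size=48)

# Column order matters: the test asks whether column 2 Granger-causes column 1.
data = np.column_stack([y, x])
# With sample sizes near 45 points, keep the lag order small.
results = grangercausalitytests(data, maxlag=2)
```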

  14. Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008

    SciTech Connect

    Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.

    2008-09-30

    The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. Additionally, PNNL also supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model that was built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS, and also as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort, (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.

  15. Nonlinear Structural Analysis Methodology and Dynamics Scaling of Inflatable Parabolic Reflector Antenna Concepts

    NASA Technical Reports Server (NTRS)

    Sreekantamurthy, Tham; Gaspar, James L.; Mann, Troy; Behun, Vaughn; Pearson, James C., Jr.; Scarborough, Stephen

    2007-01-01

    Ultra-lightweight and ultra-thin membrane inflatable antenna concepts are fast evolving to become the state-of-the-art antenna concepts for deep-space applications. NASA Langley Research Center has been involved in structural dynamics research on antenna structures. One of the goals of the research is to develop a structural analysis methodology for predicting the static and dynamic response characteristics of inflatable antenna concepts. This research is focused on computational studies using nonlinear large-deformation finite element analysis to characterize the ultra-thin membrane responses of the antennas. Recently, structural analyses have been performed on a few parabolic reflector antennas of varying size and shape, which are referred to in the paper as the 0.3-meter subscale, 2-meter half-scale, and 4-meter full-scale antennas. The various aspects studied included nonlinear analysis methodology and solution techniques, ways to speed convergence in iterative methods, the sensitivities of responses with respect to structural loads, such as inflation pressure, gravity, and pretension loads in ground and in-space conditions, and the ultra-thin membrane wrinkling characteristics. Several of these intrinsic aspects provided valuable insight into the evaluation of the structural characteristics of such antennas. While analyzing these structural characteristics, a quick study was also made to assess the applicability of dynamic scaling of the half-scale antenna. This paper presents the details of the nonlinear structural analysis results and discusses the insight gained from the studies on the various intrinsic aspects of the analysis methodology. The predicted reflector surface characteristics of the three inflatable ultra-thin membrane parabolic reflector antenna concepts are presented as easily observable displacement fringe patterns with associated maximum values, and as normal mode shapes with associated frequencies. Wrinkling patterns are presented to show how surface wrinkles progress with increasing tension loads. Antenna reflector surface accuracies were found to be very much dependent on the type and size of the antenna, the reflector surface curvature, the reflector membrane supports in terms of the spacing of catenaries, and the amount of applied load.

  16. Modeling and sensitivity analysis of transport and deposition of radionuclides from the Fukushima Daiichi accident

    NASA Astrophysics Data System (ADS)

    Hu, X.; Li, D.; Huang, H.; Shen, S.; Bou-Zeid, E.

    2014-01-01

    The atmospheric transport and ground deposition of radioactive isotopes 131I and 137Cs during and after the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident (March 2011) are investigated using the Weather Research and Forecasting/Chemistry (WRF/Chem) model. The aim is to assess the skill of WRF in simulating these processes and the sensitivity of the model's performance to various parameterizations of unresolved physics. The WRF/Chem model is first upgraded by implementing a radioactive decay term into the advection-diffusion solver and adding three parameterizations for dry deposition and two parameterizations for wet deposition. Different microphysics and horizontal turbulent diffusion schemes are then tested for their ability to reproduce observed meteorological conditions. Subsequently, the influence on the simulated transport and deposition of the characteristics of the emission source, including the emission rate, the gas partitioning of 131I and the size distribution of 137Cs, is examined. The results show that the model can predict the wind fields and rainfall realistically. The ground deposition of the radionuclides can also potentially be captured well but it is very sensitive to the emission characterization. It is found that the total deposition is most influenced by the emission rate for both 131I and 137Cs; while it is less sensitive to the dry deposition parameterizations. Moreover, for 131I, the deposition is also sensitive to the microphysics schemes, the horizontal diffusion schemes, gas partitioning and wet deposition parameterizations; while for 137Cs, the deposition is very sensitive to the microphysics schemes and wet deposition parameterizations, and it is also sensitive to the horizontal diffusion schemes and the size distribution.
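    The radioactive decay term added to the advection-diffusion solver amounts, in an operator-split scheme, to an exact exponential depletion of each tracer per time step. A sketch using the standard half-lives of the two nuclides (the time step and field values are arbitrary):

```python
import numpy as np

T_HALF_I131 = 8.02 * 24 * 3600             # 131I half-life (s)
T_HALF_CS137 = 30.17 * 365.25 * 24 * 3600  # 137Cs half-life (s)

def decay_step(conc: np.ndarray, t_half: float, dt: float) -> np.ndarray:
    """Exact decay over one model time step: dC/dt = -lambda * C."""
    lam = np.log(2.0) / t_half
    return conc * np.exp(-lam * dt)

c = np.ones(10)                            # arbitrary tracer field
print(decay_step(c, T_HALF_I131, dt=3600.0)[0])   # ~0.9964 after one hour
print(decay_step(c, T_HALF_CS137, dt=3600.0)[0])  # essentially unchanged
```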

  17. ADAPT (Analysis of Dynamic Accident Progression Trees) Beta Version 0.9

    Energy Science and Technology Software Center (ESTSC)

    2010-01-07

    The purpose of the ADAPT code is to generate Dynamic Event Trees (DETs) using a user-specified simulator. ADAPT can utilize any simulation tool that meets a minimal set of requirements. ADAPT is based on the concept of DETs, which use explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system evolution along with stochastic modeling. When DETs are used to model different aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. DET branching occurs at user-specified times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at different times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination), and user-friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g., biasing towards the worst-case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.
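
    The branch bookkeeping that ADAPT describes can be pictured with the toy scheduler below; the stand-in simulator, branching probabilities, and end-state handling are invented for illustration, whereas ADAPT itself drives external simulation codes in a distributed environment.

```python
# Conceptual sketch of dynamic-event-tree bookkeeping: branches are tracked
# independently, and each new branching can occur at a different time and on
# different criteria. The "simulator" here is a trivial stand-in.
from collections import deque

def simulate(state, t_end):
    """Stand-in simulator: advance state until a branch condition or t_end."""
    t, level = state
    while t < t_end:
        t += 1.0
        level += 0.1
        if level >= 1.0:                 # e.g., a setpoint demanding an action
            return (t, level), "valve_demand"
    return (t, level), None

branches = deque([((0.0, 0.0), 1.0, [])])   # (state, probability, history)
finished = []
while branches:
    state, prob, hist = branches.popleft()
    state, condition = simulate(state, t_end=30.0)
    t, level = state
    if condition is None:
        finished.append((prob, hist))
    else:
        # Success resets the process and continues; failure is a terminal state.
        branches.append(((t, 0.0), prob * 0.99, hist + [(t, condition, "success")]))
        finished.append((prob * 0.01, hist + [(t, condition, "failure")]))

for prob, hist in finished:
    print(round(prob, 6), hist)
```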

  18. The definitive analysis of the Bendandi's methodology performed with a specific software

    NASA Astrophysics Data System (ADS)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the "Bendandi method", supposed in the past to be able to forecast earthquakes, which the geophysicist from Faenza never explicitly explained to posterity. The geoethics implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, arose in previous years from social alarm over predicted earthquakes that never occurred but were widely publicized by media following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed using specially developed software, called the "Bendandiano Dashboard", which can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify definitively the basis of Bendandi's calculations, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  19. Systems Approaches to Animal Disease Surveillance and Resource Allocation: Methodological Frameworks for Behavioral Analysis

    PubMed Central

    Rich, Karl M.; Denwood, Matthew J.; Stott, Alistair W.; Mellor, Dominic J.; Reid, Stuart W. J.; Gunn, George J.

    2013-01-01

    While demands for animal disease surveillance systems are growing, there has been little applied research that has examined the interactions between resource allocation, cost-effectiveness, and behavioral considerations of actors throughout the livestock supply chain in a surveillance system context. These interactions are important as feedbacks between surveillance decisions and disease evolution may be modulated by their contextual drivers, influencing the cost-effectiveness of a given surveillance system. This paper identifies a number of key behavioral aspects involved in animal health surveillance systems and reviews some novel methodologies for their analysis. A generic framework for analysis is discussed, with exemplar results provided to demonstrate the utility of such an approach in guiding better disease control and surveillance decisions. PMID:24348922

  20. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    NASA Astrophysics Data System (ADS)

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; di Guardo, Giovanni; Fernández, Antonio

    2011-12-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

  1. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    PubMed Central

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

  2. Methodology for cost analysis of film-based and filmless portable chest systems

    NASA Astrophysics Data System (ADS)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
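
    A minimal sketch of the cost-per-exam roll-up and sensitivity analysis described above follows; all dollar figures, exam volumes, and equipment lifetimes are hypothetical placeholders, not the study's measured inputs.

```python
# Illustrative cost-per-exam roll-up using the cost components named above
# (labor, equipment, materials, storage). All figures are hypothetical.
ANNUAL_EXAMS = 8000

def annualized_equipment(purchase_price, useful_life_yr, annual_maintenance):
    # Straight-line depreciation plus maintenance, per year.
    return purchase_price / useful_life_yr + annual_maintenance

costs = {
    "labor":     15.0 * 0.25,                                  # $/hr FTE rate x hr/exam
    "equipment": annualized_equipment(250_000, 7, 12_000) / ANNUAL_EXAMS,
    "materials": 1.10,                                         # plates/film/chemistry per exam
    "storage":   0.35,                                         # archive cost per exam
}

print({k: round(v, 2) for k, v in costs.items()}, "->", round(sum(costs.values()), 2))

# Sensitivity analysis in the spirit of the paper: vary one component +/-20%.
for k in costs:
    for f in (0.8, 1.2):
        perturbed = dict(costs, **{k: costs[k] * f})
        print(k, f, round(sum(perturbed.values()), 2))
```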

  3. Assessment of intestinal absorption: A methodology based on stable isotope administration and proton activation analysis

    SciTech Connect

    Cantone, M.C.

    1994-12-31

    The interest in biokinetic studies is driven by problems related to the physiopathology of oligoelements, chemical elemental pollution, and radioactive releases in case of nuclear accidents. The application of stable isotopes as tracers in studies of trace elements in nutritional and food science is particularly attractive, especially for investigations of the most radiosensitive age groups of the population and for repeated studies on healthy people to assess the bioavailability of different compounds. A tracer method based on stable isotope administration, which combines the simultaneous use of two tracers with proton activation analysis, is presented. A study aimed at obtaining molybdenum biokinetic data in humans was performed. One tracer (96Mo) was orally administered and another (95Mo) was intravenously injected into two fasting volunteer subjects. Venous blood samples were withdrawn at different postinjection times. The plasma concentration of both isotopes was determined by measuring the intensities of the gamma lines from the technetium radioisotopes produced via (p,n) reactions. Under the adopted experimental conditions, a minimum detectable concentration of 2 ng isotope/ml plasma was attained. The parameters describing molybdenum kinetics were obtained for the two individuals. Moreover, the investigation was repeated with different tracer amounts for one of the two subjects, in both fasting and non-fasting conditions.

  4. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    NASA Technical Reports Server (NTRS)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.

  5. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Building on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized: (1) subjective judgment in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence, and use of the same numerical grids for both scaled experiments and real plant applications; and (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, these issues remain. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgment and improve efficiency: treat numerical errors as special sensitivities against other physical uncertainties, and consider only parameters whose uncertainties have large effects on design criteria; (3) greatly reducing the computational cost of uncertainty quantification by (a) choosing optimized time steps and spatial sizes and (b) using gradient information (sensitivity results) to reduce the number of samples; (4) allowing grid independence for scaled integral effect test (IET) simulations and real plant applications: (a) eliminate numerical uncertainty in scaling; (b) reduce experimental cost by allowing smaller scaled IETs; (c) eliminate user effects. This paper reviews the issues with the current CSAU, introduces FSA, discusses a potential Q-PIRT process, and shows simple examples of performing FSA. Finally, the general research direction and the requirements for using FSA in a system analysis code are discussed.
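
    A minimal FSA example, assuming a single ODE as a stand-in for the governing equations: the sensitivity s = dy/dk is obtained by integrating its own evolution equation alongside the physics, and checked against the analytic derivative.

```python
# Minimal forward sensitivity analysis (FSA) sketch: augment dy/dt = -k*y with
# the sensitivity equation ds/dt = d/dk(-k*y) = -y - k*s, where s = dy/dk.
# A system code would do the same for its governing PDEs; values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

k = 0.5

def rhs(t, z):
    y, s = z
    return [-k * y, -y - k * s]        # physics equation + its k-sensitivity

sol = solve_ivp(rhs, (0.0, 4.0), [1.0, 0.0], rtol=1e-10, atol=1e-12)
y_end, s_end = sol.y[:, -1]

# Analytic check: y = exp(-k t) gives dy/dk = -t*exp(-k t).
t_end = 4.0
print(s_end, -t_end * np.exp(-k * t_end))   # the two agree to solver tolerance
```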

  6. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    SciTech Connect

    Evans, J.S.; Moeller, D.W.; Cooper, D.W.

    1985-07-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.
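
    The two-parameter Weibull dose-response form for early effects can be sketched as below; the D50 and shape values are illustrative placeholders rather than the report's fitted parameters.

```python
# Two-parameter Weibull dose-response sketch: risk = 1 - exp(-ln(2)*(D/D50)**V),
# so risk = 0.5 at D = D50 and the shape parameter V controls the steepness.
# D50 and V below are illustrative placeholders, not the report's fitted values.
import math

def weibull_risk(dose_gy, d50_gy, shape_v):
    if dose_gy <= 0.0:
        return 0.0
    hazard = math.log(2.0) * (dose_gy / d50_gy) ** shape_v
    return 1.0 - math.exp(-hazard)

for d in (1.0, 3.0, 4.5, 6.0):
    print(d, round(weibull_risk(d, d50_gy=4.5, shape_v=6.0), 3))
```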

  7. Theoretical and experimental study on heat transfer involving solidification and meltdown, for application in the analysis of reactor meltdown accidents

    SciTech Connect

    Pessanha, J.A.O.

    1988-01-01

    This work is devoted to the analysis of two phenomena involving heat transfer with change of phase, both related to nuclear reactor core meltdown accident analysis: the solidification of small droplets as they relocate over solid surfaces, and the melting of solid walls due to interaction with inclined impinging jets. The model describing the solidification of the droplets was obtained by combining a refined integral solution for the energy equation with the moving contact line solution (used to evaluate the droplet's velocity). The results of calculations performed using the model have been compared against experimental data. This comparison showed that the model was able to accurately describe the freezing process of the small droplets. The model has been implemented in the APRIL.MOD2 computer code. The model describing the melting of solid walls due to the heat transferred from inclined impinging jets was obtained by solving the momentum and energy equations for the jet as well as the molten layer. The small-thickness approximation was used in order to obtain an analytical solution. Results from this model have also been compared against experimental data, confirming the correctness of the present modeling concept.

  8. A Methodology for the Analysis and Selection of Alternatives for the Disposition of Surplus Plutonium

    SciTech Connect

    1999-08-31

    The Department of Energy (DOE) Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for the disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports [DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively]. At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the plutonium disposition alternatives that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling the results of detailed technical, economic, schedule, environmental, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been endorsed for use in similar situations by the National Research Council, an agency of the National Academy of Sciences. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures; (2) estimation of the performance of the alternatives with respect to the objectives; (3) development of value functions and weights for the objectives; and (4) evaluation of the alternatives and sensitivity analysis. These steps are described below.
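
    The aggregation at the core of steps (3) and (4) can be sketched as a weighted sum of single-attribute values; the alternatives, weights, and scores below are invented for illustration and do not reproduce the ANRCP evaluation.

```python
# MAU aggregation sketch: each alternative is scored by a weighted sum of
# normalized single-attribute value functions. All inputs are invented.
import numpy as np

objectives = ["cost", "schedule", "nonproliferation", "environment"]
weights = np.array([0.3, 0.2, 0.3, 0.2])            # must sum to 1

# Normalized single-attribute values in [0, 1] (1 = best) per alternative.
alternatives = {
    "immobilization": np.array([0.7, 0.8, 0.6, 0.7]),
    "MOX fuel":       np.array([0.5, 0.6, 0.9, 0.6]),
    "hybrid":         np.array([0.6, 0.7, 0.8, 0.65]),
}

for name, values in alternatives.items():
    print(name, round(float(weights @ values), 3))

# Sensitivity analysis: re-rank after perturbing the nonproliferation weight.
w2 = np.array([0.25, 0.2, 0.4, 0.15])
for name, values in alternatives.items():
    print("perturbed", name, round(float(w2 @ values), 3))
```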

  9. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  10. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Astrophysics Data System (ADS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1995-05-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modelling strategy. The structural response for each cracked configuration is obtained using a geometrically non-linear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology, and its applicability to performing practical analyses of realistic structures, is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.
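
    The maximum tangential stress criterion mentioned above has a closed-form kink angle, sketched below from the classical near-tip stress field; this is the textbook result, not an excerpt of the authors' code.

```python
# Maximum tangential stress (MTS) criterion sketch: the kink angle follows from
# setting the shear term of the near-tip tangential stress to zero,
# K_I*sin(theta) + K_II*(3*cos(theta) - 1) = 0.
import math

def mts_kink_angle(k1, k2):
    """Kink angle (radians) from mixed-mode stress intensity factors."""
    if k2 == 0.0:
        return 0.0                      # pure mode I grows straight ahead
    return 2.0 * math.atan((k1 - math.sqrt(k1**2 + 8.0 * k2**2)) / (4.0 * k2))

print(math.degrees(mts_kink_angle(1.0, 0.0)))    #  0.0
print(math.degrees(mts_kink_angle(1.0, 0.5)))    # negative kink for positive K_II
print(math.degrees(mts_kink_angle(0.0, 1.0)))    # ~ -70.5 deg, the pure mode II limit
```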

  11. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Astrophysics Data System (ADS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1994-09-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.

  12. Accuracy of ionospheric models used in GNSS and SBAS: methodology and analysis

    NASA Astrophysics Data System (ADS)

    Rovira-Garcia, A.; Juan, J. M.; Sanz, J.; González-Casado, G.; Ibáñez, D.

    2015-10-01

    The characterization of the accuracy of ionospheric models currently used in global navigation satellite systems (GNSSs) is a long-standing issue. The characterization remains a challenging problem owing to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology based on the comparison of the predictions of any ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated as hardware delays (a receiver constant plus a satellite constant) per day. The present study was conducted for the entire year of 2014, i.e. during the last solar cycle maximum. The ionospheric models assessed are the operational models broadcast by the global positioning system (GPS) and Galileo constellations, the satellite-based augmentation systems (SBAS) (i.e. the European Geostationary Navigation Overlay Service (EGNOS) and the Wide Area Augmentation System (WAAS)), a number of post-process global ionospheric maps (GIMs) from different International GNSS Service (IGS) analysis centres (ACs) and, finally, a more sophisticated GIM computed by the research group of Astronomy and GEomatics (gAGE). Ionospheric models based on GNSS data and represented on a grid (IGS GIMs or SBAS) correct about 85 % of the total slant ionospheric delay, whereas the models broadcast in the navigation messages of GPS and Galileo only account for about 70 %. Our gAGE GIM is shown to correct 95 % of the delay. The proposed methodology appears to be a useful tool to improve current ionospheric models.
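
    The separation of residuals into per-day hardware delays can be sketched as a small least-squares problem; the receiver/satellite counts, noise level, and zero-mean datum choice below are illustrative assumptions.

```python
# Sketch of the "differences separated as hardware delays" step: per day, each
# model-minus-observation residual is modeled as a receiver constant plus a
# satellite constant, estimated by least squares. One datum constraint
# (zero-mean satellite biases) removes the rank deficiency. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_recv, n_sat = 5, 8
true_recv = rng.normal(0, 2, n_recv)
true_sat = rng.normal(0, 1, n_sat)
true_sat -= true_sat.mean()                      # match the datum choice

rows, obs = [], []
for i in range(n_recv):
    for j in range(n_sat):
        row = np.zeros(n_recv + n_sat)
        row[i] = 1.0
        row[n_recv + j] = 1.0
        rows.append(row)
        obs.append(true_recv[i] + true_sat[j] + rng.normal(0, 0.05))

# Append the constraint sum(sat biases) = 0 as a heavily weighted pseudo-observation.
constraint = np.concatenate([np.zeros(n_recv), np.ones(n_sat)]) * 1e6
A = np.vstack(rows + [constraint])
y = np.array(obs + [0.0])

est, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(est[:n_recv] - true_recv, 3))     # receiver-bias recovery errors
print(np.round(est[n_recv:] - true_sat, 3))      # satellite-bias recovery errors
```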

  13. The U-tube sampling methodology and real-time analysis of geofluids

    SciTech Connect

    Freifeld, Barry; Perkins, Ernie; Underschultz, James; Boreham, Chris

    2009-03-01

    The U-tube geochemical sampling methodology, an extension of the porous cup technique proposed by Wood [1973], provides minimally contaminated aliquots of multiphase fluids from deep reservoirs and allows for accurate determination of dissolved gas composition. The initial deployment of the U-tube during the Frio Brine Pilot CO₂ storage experiment, Liberty County, Texas, obtained representative samples of brine and supercritical CO₂ from a depth of 1.5 km. A quadrupole mass spectrometer provided real-time analysis of dissolved gas composition. Since the initial demonstration, the U-tube has been deployed for (1) sampling of fluids downgradient of the proposed Yucca Mountain High-Level Waste Repository, Amargosa Valley, Nevada, (2) acquiring fluid samples beneath permafrost in Nunavut Territory, Canada, and (3) a CO₂ storage demonstration project within a depleted gas reservoir, Otway Basin, Victoria, Australia. The addition of in-line high-pressure pH and EC sensors allows for continuous monitoring of fluid during sample collection. Difficulties have arisen during U-tube sampling, such as blockage of sample lines by naturally occurring waxes or by freezing conditions; however, workarounds such as solvent flushing or heating have been used to address these problems. The U-tube methodology has proven to be robust and, with careful consideration of the constraints and limitations, can provide high-quality geochemical samples.

  14. Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures

    NASA Technical Reports Server (NTRS)

    Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

    1994-01-01

    A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.

  15. Injury Severity and Mortality of Adult Zebra Crosswalk and Non-Zebra Crosswalk Road Crossing Accidents: A Cross-Sectional Analysis

    PubMed Central

    Pfortmueller, Carmen A.; Marti, Mariana; Kunz, Mirco; Lindner, Gregor; Exadaktylos, Aristomenis K.

    2014-01-01

    Principles: Over a million people worldwide die each year from road traffic injuries and more than 10 million sustain permanent disabilities. Many of these victims are pedestrians. The present retrospective study analyzes the severity and mortality of injuries suffered by adult pedestrians, depending on whether they used a zebra crosswalk. Methods: Our retrospective data analysis covered adult patients admitted to our emergency department (ED) between 1 January 2000 and 31 December 2012 after being hit by a vehicle while crossing the road as a pedestrian. Patients were identified by using a string term. Medical, police and ambulance records were reviewed for data extraction. Results: A total of 347 patients were eligible for study inclusion. Two hundred and three (203; 58.5%) patients were on a zebra crosswalk and 144 (41.5%) were not. The mean ISS (Injury Severity Score) was 12.1 (SD 14.7, range 1-75). The vehicles were faster in non-zebra crosswalk accidents (47.7 km/h versus 41.4 km/h, p<0.027). The mean ISS score was higher in patients with non-zebra crosswalk accidents: 14.4 (SD 16.5, range 1-75) versus 10.5 (SD 13.1, range 1-75) (p<0.019). Zebra crosswalk accidents were associated with less risk of severe injury (OR 0.61, 95% CI 0.38-0.98, p<0.042). Accidents involving a truck were associated with increased risk of severe injury (OR 3.53, 95% CI 1.21-10.26, p<0.02). Conclusion: Accidents on zebra crosswalks are more common than those not on zebra crosswalks. The injury severity of non-zebra crosswalk accidents is significantly higher than in patients with zebra crosswalk accidents. Accidents involving large vehicles are associated with increased risk of severe injury. Further prospective studies are needed, with detailed assessment of motor vehicle types and speed. PMID:24595100

  16. Vehicle technologies heavy vehicle program : FY 2008 benefits analysis, methodology and results --- final report.

    SciTech Connect

    Singh, M.; Energy Systems; TA Engineering

    2008-02-29

    This report describes the approach to estimating the benefits, and the analysis results, for the Heavy Vehicle Technologies activities of the Vehicle Technologies (VT) Program of EERE. The scope of the effort includes: (1) characterizing baseline and advanced technology vehicles for Class 3-6 and Class 7 and 8 trucks, (2) identifying technology goals associated with the DOE EERE programs, (3) estimating the market potential of technologies that improve fuel efficiency and/or use alternative fuels, and (4) determining the petroleum and greenhouse gas emissions reductions associated with the advanced technologies. In FY 08 the Heavy Vehicles program continued to address the various sources of energy loss, rather than focusing more narrowly on engine efficiency and alternative fuels. These changes are the result of a planning effort that first occurred during FY 04 and was updated in the past year (Ref. 1). This narrative describes characteristics of the heavy truck market as they relate to the analysis, the analysis methodology (including a discussion of the models used to estimate market potential and benefits), and the benefits estimated as a result of the adoption of the advanced technologies. The market penetrations are used as part of the EERE-wide integrated analysis to provide the final benefit estimates reported in the FY08 Budget Request. The energy savings models are utilized by the VT program for internal project management purposes.

  17. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    PubMed Central

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code was evaluated using benchmark data sets. Results: The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
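
    A hedged Python sketch of the filtering step follows (the authors' tool combines C#.Net with R; Python with scipy is used here only to show the logic, on synthetic data).

```python
# Sketch of the filtering step described above: split a cohort at a candidate
# dose threshold, then combine a contingency-table Fisher exact test with
# Welch's t-test and a Kolmogorov-Smirnov test on the dose distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dose = rng.uniform(0, 60, 300)
# Synthetic outcome with a dose-response step near 20 Gy.
toxicity = rng.random(300) < np.where(dose > 20, 0.35, 0.08)

threshold = 20.0
high = dose > threshold

table = [[int(( high &  toxicity).sum()), int(( high & ~toxicity).sum())],
         [int((~high &  toxicity).sum()), int((~high & ~toxicity).sum())]]
odds, p_fisher = stats.fisher_exact(table)
t, p_welch = stats.ttest_ind(dose[toxicity], dose[~toxicity], equal_var=False)
ks, p_ks = stats.ks_2samp(dose[toxicity], dose[~toxicity])

print(f"Fisher p={p_fisher:.2g}, Welch p={p_welch:.2g}, KS p={p_ks:.2g}")
# A variable "demonstrates dose-response" here only if all tests agree.
```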

  18. Safety assessment methodology in management of spent sealed sources.

    PubMed

    Mahmoud, Narmine Salah

    2005-02-14

    Environmental hazards can be caused by radioactive waste after its disposal. It is therefore important that safety assessment methodologies be developed and established to study and estimate the possible hazards and to institute safety measures that prevent their development. Spent sealed sources are a specific type of radioactive waste. According to the IAEA definition, spent sealed sources are unused sources because of activity decay, damage, misuse, loss, or theft. Accidental exposure of humans to spent sealed sources can occur from the moment they become spent until their disposal. For that reason, safety assessment methodologies were tailored to suit the management of spent sealed sources. To provide understanding of and confidence in this study, a validation analysis was undertaken by considering the scenario of an accident that occurred in Egypt in June 2000 (the Meet-Halfa accident, involving an iridium-192 source). This work covers the safety assessment approaches for spent sealed sources, comprising the assessment context, the processes leading an active source to become spent, accident scenarios, mathematical models for dose calculations, and radiological consequences and regulatory criteria. It also includes a validation study, carried out by evaluating a theoretical scenario against the real scenario of the Meet-Halfa accident, based on the clinical assessment of affected individuals. PMID:15721523

  19. Probabilistic Climate Forecasting: Methodological issues arising from analysis in climateprediction.net

    NASA Astrophysics Data System (ADS)

    Rowlands, D. J.; Frame, D. J.; Meinshausen, N.; Aina, T.; Jewson, S.; Allen, M. R.

    2009-12-01

    One of the chief goals of climate research is to produce meaningful probabilistic forecasts that can be used in the formation of future policy and adaptation strategies. The current range of methodologies presented in the scientific literature shows that this is not an easy task, especially with the various philosophical interpretations of how to combine the information contained in Perturbed-Physics and Multi-Model Ensembles (PPE & MME). The focus of this research is to present some of the methodological issues that have arisen in the statistical analysis of the latest climateprediction.net experiment, a large PPE of transient simulations using HadCM3L. First, we consider model evaluation and propose a method for calculating the likelihood of each ensemble member based on a transient constraint involving regional temperature changes. We argue that this approach is more meaningful for future climate change projections than climatology-based constraints. A further question we consider is which observations to include in our likelihood function: should we care how well a model simulates the climate of Europe if we are producing a forecast for South Africa? The second issue deals with how to combine multiple models from such an ensemble into a probabilistic forecast. Much has been said about the Bayesian methodology, given the sensitivity of forecasts to prior assumptions. For simple models of the climate, with inputs such as climate sensitivity, there may be strong prior information, but for complex climate models whose parameters correspond to non-observable quantities this is not so straightforward, and we may have no reason to believe that a parameter has a uniform distribution or an inverse-uniform distribution. We therefore propose two competing methodologies for dealing with this problem, namely likelihood profiling and the Jeffreys prior, an approach typically known as "objective" Bayesian statistics, where the word objective simply implies that the prior is generated using a rule rather than from expert opinion. We present novel results using a simple climate model as an illustrative example, with a view to applying these techniques to the full climateprediction.net ensemble.

  20. Temporal Statistic of Traffic Accidents in Turkey

    NASA Astrophysics Data System (ADS)

    Erdogan, S.; Yalcin, M.; Yilmaz, M.; Korkmaz Takim, A.

    2015-10-01

    Traffic accidents form clusters in geographic space and over time, and these clusters themselves exhibit distinct spatial and temporal patterns. There is an imperative need to understand how, where and when traffic accidents occur in order to develop appropriate accident reduction strategies. An improved understanding of the location, time and reasons for traffic accidents makes a significant contribution to preventing them. Traffic accident occurrences have been extensively studied from different spatial and temporal points of view using a variety of methodological approaches. In the literature, less research has been dedicated to the temporal patterns of traffic accidents. In this paper, the numbers of traffic accidents are normalized according to traffic volume, and the distribution and fluctuation of these accidents are examined in terms of Islamic time intervals. The daily activities and worship of Muslims are arranged according to these time intervals, which are spaced throughout the day according to the position of the sun. Islamic time intervals have not previously been used to identify the critical hour for traffic accidents. The results show that sunrise is the critical time that acts as a threshold in the rate of traffic accidents throughout Turkey in Islamic time intervals.

  1. The Fukushima accident was preventable.

    PubMed

    Synolakis, Costas; Kânoğlu, Utku

    2015-10-28

    The 11 March 2011 tsunami was probably the fourth largest in the past 100 years and killed over 15,000 people. The magnitude of the design tsunami triggering earthquake affecting this region of Japan had been grossly underestimated, and the tsunami hit the Fukushima Dai-ichi nuclear power plant (NPP), causing the third most severe accident in an NPP ever. Interestingly, while the Onagawa NPP was also hit by a tsunami of approximately the same height as Dai-ichi, it survived the event 'remarkably undamaged'. We explain what has been referred to as the cascade of engineering and regulatory failures that led to the Fukushima disaster. One, insufficient attention had been given to evidence of large tsunamis inundating the region earlier, to Japanese research suggesting that large earthquakes could occur anywhere along a subduction zone, and to new research on mega-thrusts since Boxing Day 2004. Two, there were unexplainably different design conditions for NPPs at close distances from each other. Three, the hazard analysis to calculate the maximum probable tsunami at Dai-ichi appeared to have had methodological mistakes, which almost nobody experienced in tsunami engineering would have made. Four, there were substantial inadequacies in the Japan nuclear regulatory structure. The Fukushima accident was preventable, if international best practices and standards had been followed, if there had been international reviews, and had common sense prevailed in the interpretation of pre-existing geological and hydrodynamic findings. Formal standards are needed for evaluating the tsunami vulnerability of NPPs, for specific training of engineers and scientists who perform tsunami computations for emergency preparedness or critical facilities, as well as for regulators who review safety studies. PMID:26392611

  2. A fuzzy logic methodology for fault-tree analysis in critical safety systems

    SciTech Connect

    Erbay, A.; Ikonomopoulos, A.

    1993-01-01

    A new approach for fault-tree analysis in critical safety systems employing fuzzy sets for information representation is presented in this paper. The methodology is based on the utilization of the extension principle for mapping crisp measurements to various degrees of membership in the fuzzy set of linguistic Truth. Criticality alarm systems are used in miscellaneous nuclear fuel processing, handling, and storage facilities to reduce the risk associated with fissile material operations. Fault-tree methodologies are graphic illustrations of the failure logic associated with the development of a particular system failure (top event) from basic subcomponent failures (primary events). The term event denotes a dynamic change of state that occurs to system elements, which may include hardware, software, human, or environmental factors. A fault tree represents a detailed, deductive analysis that requires extensive system information. The knowledge incorporated in a fault tree can be articulated in logical rules of the form "IF A is true THEN B is true." However, it is well known that this type of syllogism fails to give an answer when the satisfaction of the antecedent clause is only partial. Zadeh suggested a new type of fuzzy conditional inference. This type of syllogism (generalized modus ponens) reads as follows: Premise: A is partially true. Implication: IF A is true THEN B is true. Conclusion: B is partially true. In generalized modus ponens, the antecedent is true only to some degree; hence, it is desired to compute the grade to which the consequent is satisfied. Fuzzy sets provide a natural environment for this type of computation because fuzzy variables (e.g., B) can take fuzzy values (e.g., partially true).
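
    A toy rendering of the generalized modus ponens and gate logic described above is sketched below; the membership values and the min/max (Mamdani-style) operators are illustrative choices, not the paper's exact formulation.

```python
# Toy generalized modus ponens: given the rule "IF A THEN B" and a partially
# true A, the inferred truth of B is the truth of A truncated by the rule
# strength (Mamdani min-implication). Membership values are illustrative.
def gmp(truth_a, rule_strength=1.0):
    """Degree to which B holds when A holds to degree truth_a."""
    return min(truth_a, rule_strength)

# Fault-tree style propagation: an AND gate takes the min of its inputs,
# an OR gate the max, before the implication is applied.
def and_gate(*truths): return min(truths)
def or_gate(*truths):  return max(truths)

sensor_degraded = 0.7          # crisp measurement mapped into fuzzy "Truth"
power_marginal = 0.4
top_event = gmp(or_gate(and_gate(sensor_degraded, power_marginal), 0.2))
print(top_event)               # 0.4: the top event is "partially true"
```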

  3. Kinetics Parameters of VVER-1000 Core with 3 MOX Lead Test Assemblies To Be Used for Accident Analysis Codes

    SciTech Connect

    Pavlovitchev, A.M.

    2000-03-08

    The present work is part of the Joint U.S./Russian Project on Weapons-Grade Plutonium Disposition in VVER Reactors and presents the neutronics calculations of kinetics parameters of a VVER-1000 core with 3 introduced MOX lead test assemblies (LTAs). The MOX LTA design was studied in [1] for two options: 100% plutonium and "island" type. As a result, the zoning, i.e., the fissile plutonium enrichments in the different plutonium zones, was defined. The VVER-1000 core with 3 introduced MOX LTAs of the chosen design was calculated in [2]. In the present work, the neutronics data for transient analysis codes (RELAP [3]) have been obtained using the code chain of RRC "Kurchatov Institute" [5], which is used for operational neutronics calculations of VVERs. At present, the 3D assembly-by-assembly code BIPR-7A and the 2D pin-by-pin code PERMAK-A, both with neutronics constants prepared by the cell code TVS-M, are the base elements of this chain. It should be recalled that in [6] TVS-M was used only for the constants calculations of MOX FAs. In the current calculations, TVS-M has been used for both UOX and MOX fuel constants. In addition, the volume of presented information has been increased and additional explanations have been included. The results for the reference uranium core [4] are presented in Chapter 2. The results for the core with 3 MOX LTAs are presented in Chapter 3. The conservatism connected with the neutronics parameters, which must be taken into account in transient analysis calculations, is discussed in Chapter 4. The conservative parameter values are intended for use in one-point core kinetics models of accident analysis codes.
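
    For context, the one-point kinetics model referenced above has the standard textbook form sketched below; the six-group data and reactivity step are illustrative, not the report's VVER-1000/MOX parameters (whose key feature for MOX loading is a reduced effective delayed neutron fraction).

```python
# Generic one-point reactor kinetics sketch (textbook form, not the RELAP
# implementation): dn/dt = ((rho - beta)/Lambda)*n + sum(lambda_i*C_i),
# dC_i/dt = (beta_i/Lambda)*n - lambda_i*C_i. Kinetics data are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])
lam_i = np.array([0.0125, 0.0305, 0.111, 0.301, 1.14, 3.01])   # 1/s
beta, Lam = beta_i.sum(), 2.0e-5                               # beta_eff, generation time (s)
rho = 0.5 * beta                                               # +0.5$ step insertion

def rhs(t, z):
    n, C = z[0], z[1:]
    dn = (rho - beta) / Lam * n + np.dot(lam_i, C)
    dC = beta_i / Lam * n - lam_i * C
    return np.concatenate([[dn], dC])

n0 = 1.0
C0 = beta_i / (lam_i * Lam) * n0        # equilibrium precursor concentrations
sol = solve_ivp(rhs, (0, 10), np.concatenate([[n0], C0]), method="LSODA")
print(sol.y[0, -1])                     # relative power after 10 s
```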

  4. Organizational factors influencing serious occupational accidents.

    PubMed

    Salminen, S; Saari, J; Saarela, K L; Räsänen, T

    1993-10-01

    The aim of this article is to examine organizational factors influencing serious occupational accidents. The study was part of a larger project investigating 99 serious occupational accidents in southern Finland. A workplace analysis and an accident analysis were done at accident sites. In connection with this investigation, 73 victims, 91 foremen, and 83 co-workers were interviewed with a structured questionnaire. The results showed that the need to save time, tight schedules, and a lack of caution had a greater influence on accidents than did the foremen, co-workers, customers, professional pride, curiosity, or the wage system. Big companies had the lowest risk of serious occupational accidents. Accident risk was significantly greater for subcontractors than for main contractors. PMID:8296185

  5. Analysis of 129I in the soils of Fukushima Prefecture: preliminary reconstruction of 131I deposition related to the accident at Fukushima Daiichi Nuclear Power Plant (FDNPP).

    PubMed

    Muramatsu, Yasuyuki; Matsuzaki, Hiroyuki; Toyama, Chiaki; Ohno, Takeshi

    2015-01-01

    Iodine-131 is one of the most critical radionuclides to be monitored after release from reactor accidents due to the tendency for this nuclide to accumulate in the human thyroid gland. However, there are not enough data related to the reactor accident in Fukushima, Japan to provide regional information on the deposition of this short-lived nuclide (half-life = 8.02 d). In this study we have focused on the long-lived iodine isotope, (129)I (half-life of 1.57 × 10(7) y), and analyzed it by accelerator mass spectrometry (AMS) for surface soil samples collected at various locations in Fukushima Prefecture. In order to obtain information on the (131)I/(129)I ratio released from the accident, we have determined (129)I concentrations in 82 soil samples in which (131)I concentrations were previously determined. There was a strong correlation (R(2) = 0.84) between the two nuclides, suggesting that the (131)I levels in soil samples following the accident can be estimated through the analysis of (129)I. We have also examined the possible influence from (129m)Te on (129)I, and found no significant effect. In order to construct a deposition map of (131)I, we determined the (129)I concentrations (Bq/kg) in 388 soil samples collected from different locations in Fukushima Prefecture and the deposition densities (Bq/m(2)) of (131)I were reconstructed from the results. PMID:24930438
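
    The reconstruction step can be sketched as a simple regression fit on the 82 paired samples and applied to the 388-sample survey; the numbers below are synthetic stand-ins for the measured activities, and the log-space fit is an illustrative choice.

```python
# Sketch of the reconstruction step: fit the 131I-129I relation on paired
# samples, then apply it to the larger 129I-only data set. Synthetic values.
import numpy as np

rng = np.random.default_rng(2)
i129_paired = rng.lognormal(0.0, 1.0, 82)                  # Bq/kg (synthetic)
i131_paired = 5.0e4 * i129_paired * rng.lognormal(0, 0.3, 82)

# Fit in log space, which is natural for deposition data spanning decades.
slope, intercept = np.polyfit(np.log(i129_paired), np.log(i131_paired), 1)
r2 = np.corrcoef(np.log(i129_paired), np.log(i131_paired))[0, 1] ** 2
print(f"R^2 = {r2:.2f}")

i129_survey = rng.lognormal(0.0, 1.0, 388)                 # the 388-sample survey
i131_reconstructed = np.exp(intercept + slope * np.log(i129_survey))
print(i131_reconstructed[:5])
```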

  6. Temporal uncertainty analysis of human errors based on interrelationships among multiple factors: a case of Minuteman III missile accident.

    PubMed

    Rong, Hao; Tian, Jin; Zhao, Tingdi

    2016-01-01

    In traditional approaches to human reliability assessment (HRA), the definition of the error-producing conditions (EPCs) and the supporting guidance are such that some conditions (especially organizational or managerial ones) can hardly be included; the analysis is therefore incomplete and does not reflect the temporal trend of human reliability. A method based on system dynamics (SD), which highlights interrelationships among technical and organizational aspects that may contribute to human errors, is presented to facilitate quantitative estimation of the human error probability (HEP) and its related variables as they change over a long period. Taking the Minuteman III missile accident of 2008 as a case, the proposed HRA method is applied to assess HEP during missile operations over 50 years by analyzing the interactions among the variables involved in human-related risks; the critical factors are also determined in terms of the impact the variables have on risks in different time periods. It is indicated that both technical and organizational aspects should be addressed to minimize human errors in the long run. PMID:26360211
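
    A minimal system-dynamics flavored sketch follows; the stocks, flows, and coefficients are invented to show how feedback among organizational variables can make HEP time-dependent, and they do not reproduce the paper's calibrated model.

```python
# Minimal SD-style sketch: a "safety resources" stock and an "experience"
# stock, driven by a cyclic budget-pressure flow, jointly produce a
# time-varying human error probability. All structure here is illustrative.
import numpy as np

years, dt = 50, 0.25
steps = int(years / dt)
resources, experience = 1.0, 1.0
hep_trace = []

for k in range(steps):
    t = k * dt
    budget_pressure = 0.02 * (1 + 0.5 * np.sin(2 * np.pi * t / 10))  # cyclic cuts
    training = 0.015 * resources                    # reinforcing loop via resources
    resources += dt * (training - budget_pressure)  # stock integrates net flow
    experience += dt * (0.01 - 0.02 * budget_pressure)
    hep = 1e-3 / max(resources * experience, 1e-6)  # HEP falls as both stocks grow
    hep_trace.append(hep)

print(min(hep_trace), max(hep_trace))
```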

  7. TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions

    SciTech Connect

    Lombardo, N.J.; Marseille, T.J.; White, M.D.; Lowery, P.S.

    1990-06-01

    TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300°F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam by the oxidation reactions and the effect of hydrogen on the coolant properties are modeled for independent coolant flow channels. Fission product release from exposed uranium-metal, Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ("bake-out") release mechanisms. Release of the volatile species iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.
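
    The parabolic/Arrhenius form named above can be sketched as follows; the pre-exponential and activation energy are illustrative placeholders (close in form to published Zircaloy correlations), not TRUMP-BD's fitted Zircaloy or uranium constants.

```python
# Parabolic oxidation kinetics of the form used in such codes: (mass gain)^2
# grows linearly in time with an Arrhenius rate constant, w^2 = K(T)*t,
# K(T) = A*exp(-E/(R*T)). A and E below are illustrative placeholders.
import math

A = 0.1811       # (kg/m^2)^2 / s, illustrative pre-exponential
E = 1.67e5       # J/mol, illustrative activation energy
R = 8.314        # J/(mol K)

def oxide_mass_gain(temp_k, seconds):
    K = A * math.exp(-E / (R * temp_k))
    return math.sqrt(K * seconds)        # kg oxygen per m^2 of exposed metal

for T in (1100.0, 1300.0, 1500.0):       # K
    print(T, round(oxide_mass_gain(T, 600.0), 4))
```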

  8. [Failure analysis of total knee replacement. Basics and methodological aspects of the damage analysis].

    PubMed

    Bader, R; Mittelmeier, W; Steinhauser, E

    2006-09-01

    Possible causes of failure of total knee endoprostheses include wear, malpositioning, incorrect dimensioning and inadequate design of the implant components, manufacturing defects, material fatigue, corrosion, overloading, infection, and allergy to implant materials. There is a broad spectrum of methodical approaches for the analysis of failure cases. Substantial information for the damage analysis is provided by clinical and intraoperative findings, photo documentation, the radiographic course, as well as material, physical and histological investigations. The principal purposes of damage analysis are the avoidance of further damage events and the gain of information for the improvement of implant design and materials, as well as the optimisation of the biocompatibility of implants and wear products. Both a reporting system for incidents and implant failures and a complete data collection enable early identification of system-specific, accumulated cases of implant failure. PMID:16773388

  9. Preliminary Accident Analysis for Construction and Operation of the Chornobyl New Safety Confinement

    SciTech Connect

    Batiy, Valeriy; Rubezhansky, Yuriy; Rudko, Vladimir; Shcherbin, Vladimir; Yegorov, V; Schmieman, Eric A.; Timmins, Douglas C.

    2005-08-08

    An analysis of the potential exposure of personnel and the population during construction and operation of the New Safe Confinement (NSC) was performed. Scenarios of hazard event development were ranked. It is shown that, on the whole, the construction and operation of the NSC comply with the current radiation safety norms of Ukraine.

  10. Rapid Sample Preparation Methodology for Plant N-Glycan Analysis Using Acid-Stable PNGase H(+).

    PubMed

    Du, Ya M; Xia, Tian; Gu, Xiao Q; Wang, Ting; Ma, Hong Y; Voglmeir, Josef; Liu, Li

    2015-12-01

    The quantification of potentially allergenic carbohydrate motifs of plant and insect glycoproteins is increasingly important in biotechnological and agricultural applications as a result of the use of insect cell-based expression systems and transgenic plants. The need to analyze N-glycan moieties in a highly parallel manner inspired us to develop a quick N-glycan analysis method based on a recently discovered bacterial protein N-glycanase (PNGase H(+)). In contrast to the traditionally used PNGase A, which is isolated from almond seeds and only releases N-glycans from proteolytically derived glycopeptides, the herein implemented PNGase H(+) allows for the release of N-glycans directly from the glycoprotein samples. Because PNGase H(+) is highly active under acidic conditions, the consecutive fluorescence labeling step using 2-aminobenzamide (2AB) can be directly performed in the same mixture used for the enzymatic deglycosylation step. All sample handling and incubation steps can be performed in less than 4 h and are compatible with microwell-plate sampling, without the need for tedious centrifugation, precipitation, or sample-transfer steps. The versatility of this methodology was evaluated by analyzing glycoproteins derived from various plant sources using ultra-performance liquid chromatography (UPLC) analysis and further demonstrated through the activity analysis of four PNGase H(+) mutant variants. PMID:26548339

  11. Orbit-determination performance of Doppler data for interplanetary cruise trajectories. Part 1: Error analysis methodology

    NASA Technical Reports Server (NTRS)

    Ulvestad, J. S.; Thurman, S. W.

    1992-01-01

    An error covariance analysis methodology is used to investigate different weighting schemes for two-way (coherent) Doppler data in the presence of transmission-media and observing-platform calibration errors. The analysis focuses on orbit-determination performance in the interplanetary cruise phase of deep-space missions. Analytical models for the Doppler observable and for transmission-media and observing-platform calibration errors are presented, drawn primarily from previous work. Previously published analytical models were improved upon by the following: (1) considering the effects of errors in the calibration of radio signal propagation through the troposphere and ionosphere as well as station-location errors; (2) modelling the spacecraft state transition matrix using a more accurate piecewise-linear approximation to represent the evolution of the spacecraft trajectory; and (3) incorporating Doppler data weighting functions that are functions of elevation angle, which reduce the sensitivity of the estimated spacecraft trajectory to troposphere and ionosphere calibration errors. The analysis is motivated by the need to develop suitable weighting functions for two-way Doppler data acquired at 8.4 GHz (X-band) and 32 GHz (Ka-band). This weighting is likely to be different from that in the weighting functions currently in use; the current functions were constructed originally for use with 2.3 GHz (S-band) Doppler data, which are affected much more strongly by the ionosphere than are the higher frequency data.
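
    The elevation-dependent weighting idea can be sketched in a few lines. The functional form and coefficients here are illustrative assumptions, not the weighting functions derived in the paper: a common choice is to inflate the per-point noise at low elevation, where the signal traverses more troposphere, e.g. σ(γ) = σ₀·(1 + k/sin γ), and to weight each Doppler point by 1/σ².

    ```python
    import numpy as np

    def doppler_weight(elev_deg, sigma0=0.1, k=0.5):
        """Down-weight low-elevation Doppler points, where troposphere and
        ionosphere calibration errors dominate. sigma0 (mm/s) and k are
        illustrative values, not coefficients from the cited analysis."""
        gamma = np.radians(elev_deg)
        sigma = sigma0 * (1.0 + k / np.sin(gamma))   # assumed noise model
        return 1.0 / sigma**2

    for elev in (10.0, 30.0, 60.0, 90.0):
        print(f"elevation {elev:4.1f} deg -> weight {doppler_weight(elev):.3f}")
    ```

    In a weighted least-squares orbit fit these weights populate the data-noise covariance, so low-elevation points contribute less to the estimated trajectory and the solution becomes less sensitive to media calibration errors.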

  12. Evaluating the effects of dam breach methodologies on Consequence Estimation through Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Kalyanapu, A. J.; Thames, B. A.

    2013-12-01

    Dam breach modeling often involves sophisticated yet computationally intensive models to compute flood propagation at high temporal and spatial resolutions. This creates a significant need for computational capacity and drives the development of newer flood models using multi-processor and graphics-processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop is focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. It uses the ICOLD hypothetical data, including the topography, dam geometric and construction information, and land use/land cover data, along with socio-economic and demographic data. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in the risk assessments. The four methodologies are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis 1984 (MLM), iii) Von Thun and Gillette 1990 (VTG), and iv) Froehlich (2008). To achieve this objective, three modeling components were used. First, dam breach discharge hydrographs are developed using HEC-RAS v.4.1. These hydrographs are then provided as flow inputs into a two-dimensional flood model named Flood2D-GPU, which leverages the computer's graphics card for much-improved computational performance. Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, are input into HEC-FIA, which provides the consequence assessment. For the four breach methodologies, a sensitivity analysis of four breach parameters is conducted: breach side slope (SS), breach width (Wb), breach invert elevation (Elb), and time of failure (tf). In all, 68 simulations are computed to produce breach hydrographs in HEC-RAS for input into Flood2D-GPU. The Flood2D-GPU simulation results are then post-processed in HEC-FIA to evaluate: Total Population at Risk (PAR), 14-yr and Under PAR (PAR14-), 65-yr and Over PAR (PAR65+), Loss of Life (LOL) and Direct Economic Impact (DEI). The MLM approach resulted in wide variability in the simulated minimum and maximum values of the PAR, PAR65+ and LOL estimates. For PAR14- and DEI, Froehlich (1995) resulted in lower values while MLM resulted in higher estimates. This preliminary study demonstrates the relative performance of four commonly used dam breach methodologies and their impacts on consequence estimation.
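
    To make the first of these methodologies concrete, the sketch below evaluates the Froehlich (1995) breach regressions as they are commonly cited in dam-breach summaries; the coefficients should be verified against the original paper before any engineering use.

    ```python
    def froehlich_1995_breach(V_w_m3, h_b_m, overtopping=True):
        """Froehlich (1995) regression equations for average breach width (m)
        and failure time (h), as commonly cited in dam-breach summaries --
        verify coefficients against the original paper before real use.

        V_w_m3 : reservoir volume above the breach invert (m^3)
        h_b_m  : height of the breach (m)
        """
        K_o = 1.4 if overtopping else 1.0                    # failure-mode factor
        B_avg = 0.1803 * K_o * V_w_m3**0.32 * h_b_m**0.19    # average breach width, m
        t_f = 0.00254 * V_w_m3**0.53 * h_b_m**(-0.90)        # failure time, hours
        return B_avg, t_f

    # Example: 5 million m^3 reservoir, 20 m high breach, overtopping failure
    B, tf = froehlich_1995_breach(5.0e6, 20.0)
    print(f"average breach width = {B:.1f} m, failure time = {tf:.2f} h")
    ```

    These regression outputs (width and failure time, plus an assumed side slope and invert elevation) are exactly the parameters varied in the sensitivity analysis above before the hydrographs are generated in HEC-RAS.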

  13. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1992-01-01

    Fundamental equations of aerodynamic sensitivity analysis and approximate analysis for the two dimensional thin layer Navier-Stokes equations are reviewed, and special boundary condition considerations necessary to apply these equations to isolated lifting airfoils on 'C' and 'O' meshes are discussed in detail. An efficient strategy which is based on the finite element method and an elastic membrane representation of the computational domain is successfully tested, which circumvents the costly 'brute force' method of obtaining grid sensitivity derivatives, and is also useful in mesh regeneration. The issue of turbulence modeling is addressed in a preliminary study. Aerodynamic shape sensitivity derivatives are efficiently calculated, and their accuracy is validated on two viscous test problems, including: (1) internal flow through a double throat nozzle, and (2) external flow over a NACA 4-digit airfoil. An automated aerodynamic design optimization strategy is outlined which includes the use of a design optimization program, an aerodynamic flow analysis code, an aerodynamic sensitivity and approximate analysis code, and a mesh regeneration and grid sensitivity analysis code. Application of the optimization methodology to the two test problems in each case resulted in a new design having a significantly improved performance in the aerodynamic response of interest.
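
    The cost argument behind avoiding 'brute force' sensitivities can be made concrete with a toy discrete example. The residual system below is a stand-in, not the thin-layer Navier-Stokes equations: for a discrete state u satisfying R(u, x) = 0 and an output J(u, x), the sensitivity dJ/dx can be obtained either by re-solving the flow for each perturbed design variable (finite differences) or from a single adjoint solve.

    ```python
    import numpy as np

    # Toy "flow solver": linear residual R(u, x) = A(x) u - b = 0, output J = c^T u.
    def A(x):       return np.array([[2.0 + x, 1.0], [1.0, 3.0]])
    def dA_dx(x):   return np.array([[1.0, 0.0], [0.0, 0.0]])
    b = np.array([1.0, 2.0])
    c = np.array([1.0, 1.0])

    def solve_state(x):  return np.linalg.solve(A(x), b)

    x0 = 0.5
    u0 = solve_state(x0)

    # Brute force: one extra solve per design variable.
    h = 1e-6
    dJ_fd = (c @ solve_state(x0 + h) - c @ u0) / h

    # Adjoint: one linear solve with A^T, regardless of how many design variables.
    lam = np.linalg.solve(A(x0).T, c)      # adjoint variables
    dJ_adj = -lam @ (dA_dx(x0) @ u0)       # dJ/dx = -lam^T (dR/dx)

    print(f"finite difference: {dJ_fd:.6f}, adjoint: {dJ_adj:.6f}")
    ```

    With hundreds of shape variables, the adjoint route amortizes to a single extra solve, which is why dedicated sensitivity analysis codes of the kind described above are so much cheaper than repeated flow solutions.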

  14. Analysis of electrical accidents and the related causes involving citizens served by the Western Tehran Province Electricity Distribution Company

    PubMed Central

    Kalte, Haji Omid; Hosseini, Alireza Haji; Arabzadeh, Sara; Najafi, Hossein; Dehghan, Naser; Akbarzadeh, Arash; Keshavarz, Safiyeh; Karchani, Mohsen

    2014-01-01

    Background: Electrical burns account for a significant percentage of fatal accidents. Each year, a number of consumers in Iran suffer electrical injuries due to technical problems, equipment failures, and the unauthorized use of electricity. The aim of this study was to examine the root causes of accidents involving electricity in the district served by the Western Tehran Province Electricity Distribution Company. Methods: This was a descriptive study in which incidents involving electricity-related injuries were investigated among customers served by the Western Tehran Province Electricity Distribution Company. To that end, we collected and analyzed incident reports filed by citizens from 2005 through the first half of 2009 in the Distribution Company’s coverage area, including Savejbolagh, Shahriyar, eastern Karaj, Qods City, southern Karaj, western Karaj, Malard, and Mehrshahr. The reported events were analyzed using SPSS software. Results: Exposed electricity lines and unauthorized construction of residential houses in areas with medium- and low-voltage lines were responsible for 37% of the injuries. The findings showed that the highest rate of accidents occurred in 2008 and the first half of 2009, and among people with a mean age of 35. Conclusion: The results of investigating the causes of electrical accidents emphasize the need to develop a culture of safety in communities, especially among employees engaged in electricity-related occupations, construction workers, and school children, in order to reduce the rate of such accidents. PMID:25763153

  15. Analysis of fission product revaporization in a BWR reactor cooling system during a station blackout accident

    SciTech Connect

    Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

    1988-01-01

    This report presents a preliminary analysis of fission product revaporization in the Reactor Cooling System (RCS) after vessel failure. The station blackout transient for a BWR Mark I power plant is considered. The TRAPMELT3 models of vaporization, chemisorption, and the decay heating of RCS structures and gases are adopted in the analysis. RCS flow models based on the density difference between the RCS and the containment pedestal region are developed to estimate the RCS outflow, which carries the revaporized fission products to the containment. A computer code called REVAP was developed for the analysis. REVAP is incorporated with the MARCH, TRAPMELT3 and NAUA codes of the Source Term Code Package (STCP). The NAUA code is used to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors determining the magnitude of revaporization and the subsequent release of the volatile fission products. 8 figs., 1 tab.

  16. Assessment of ISLOCA risk: Methodology and application to a Westinghouse four-loop ice condenser plant

    SciTech Connect

    Kelly, D.L.; Auflick, J.L.; Haney, L.N.

    1992-04-01

    Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Westinghouse four-loop ice condenser plant. The document also includes appendices A through I, which provide: system descriptions; ISLOCA event trees; human reliability analysis; thermal-hydraulic analysis; core uncovery timing calculations; calculation of system rupture probability; ISLOCA consequence analysis; uncertainty analysis; and component failure analysis.

  17. A working man's analysis of incidents and accidents with explosives at the Los Alamos National Laboratory, 1946--1997

    SciTech Connect

    Ramsay, J.B.; Goldie, R.H.

    1998-12-31

    At the inception of the Laboratory, hectic and intense work was the norm during the development of the atomic bombs. After the war, the development of other weapons for the Cold War again contributed to an intense work environment, and formal Standard Operating Procedures (SOPs) were not required at that time. However, the occurrence of six fatalities in 1959, during the development of a new high-energy plastic-bonded explosive (94% HMX), forced the introduction of SOPs. After an accident at the Department of Energy (DOE) plant at Amarillo, TX in 1977, the DOE promulgated the Department-wide DOE Explosives Safety Manual. Table 1 outlines the history of the introduction of SOPs and the DOE Explosives Safety Manual. Many of the rules and guidelines presented in these documents were developed and introduced as the result of an incident or accident, yet many of the current staff are not familiar with that background. To preserve as much of this knowledge as possible, the authors are collecting documentation on incidents and accidents involving energetic materials at Los Alamos. Formal investigations of serious accidents elucidate the multiple causes that contributed to them, but these reports are generally buried in a file and are not read by more recent workers. Reports involving fatalities at Los Alamos before 1974 were withheld from the general employee. Also, these documents contain much detail and analysis that is not of interest to the field worker. The authors have collected documents describing 116 incidents and have analyzed the contributing factors as viewed from the standpoint of the individual operator. All the incidents occurred at the Los Alamos National Laboratory and involved energetic materials in some manner, though not all occurred within the explosive handling groups. Most accidents are caused by multiple contributing factors; the authors have attempted to select the one or two factors that they consider most important relative to the individual doing the work. The value of SOPs was an obvious conclusion a priori: their introduction and use reduced the probability of serious accidents. A second, less obvious conclusion is that the SOP did not adequately provide all the necessary controls for 16% of the events. Violations of SOPs, always considered a potential contributor, were assigned as the major contributor in only 10 incidents.

  18. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    NASA Technical Reports Server (NTRS)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high aspect ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  19. Combined molecular algorithms for the generation, equilibration and topological analysis of entangled polymers: methodology and performance.

    PubMed

    Karayiannis, Nikos Ch; Kröger, Martin

    2009-11-01

    We review the methodology, algorithmic implementation and performance characteristics of a hierarchical modeling scheme for the generation, equilibration and topological analysis of polymer systems at various levels of molecular description: from atomistic polyethylene samples to random packings of freely-jointed chains of tangent hard spheres of uniform size. Our analysis focuses on hitherto less discussed algorithmic details of the implementation of both the Monte Carlo (MC) procedure for system generation and equilibration, and a postprocessing step in which we identify the underlying topological structure of the simulated systems in the form of primitive paths. To support our arguments, we study how molecular length and packing density (volume fraction) affect the performance of the MC scheme built around chain-connectivity-altering moves. In parallel, we quantify the effects of finite system size, of polydispersity, and of the definition of the number of entanglements (and related entanglement molecular weight) on the results for the primitive path network. Along these lines we confirm main concepts which had been previously proposed in the literature. PMID:20087477

  20. Survival analysis of cancer patients with multiple endpoints using global score test methodology

    NASA Astrophysics Data System (ADS)

    Zain, Zakiyah; Whitehead, John

    2014-06-01

    Progression-free survival (PFS), time-to-progression (TTP) and overall survival (OS) are examples of multiple endpoints commonly used in clinical trials of cancer patients. PFS is increasingly used as a primary endpoint in the evaluation of patients with solid tumors, while multiple endpoints are often analysed independently. These endpoints are in fact correlated, and it is desirable to evaluate the effectiveness of treatments by means of a single parameter. In this paper, a single overall treatment effect is provided by combining the univariate score statistics for comparing treatments with respect to each survival endpoint. This global score test methodology was applied in the analysis of 330 patients with an aggressive cancer, each with two endpoints recorded, T1 and T2, relating to disease progression and death respectively. The values of the score statistics obtained from the proposed method closely matched those from the logrank test. Meanwhile, the correlations between the two score test statistics were found to be similar to those computed using the established Wei, Lin and Weissfeld method. Simulations further confirmed the consistent performance of this new method in the analysis of bivariate survival data.
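
    The combination step can be sketched in a few lines. Given the vector of univariate score statistics u (one per endpoint) and an estimate V of their covariance, an equally weighted global statistic standardizes the sum; this is a generic sketch of the combining idea, not the authors' exact implementation.

    ```python
    import numpy as np

    def global_score_statistic(u, V):
        """Combine per-endpoint score statistics into one global statistic.
        u : length-k vector of univariate score statistics (one per endpoint)
        V : k-by-k estimated covariance matrix of u
        Returns an asymptotically N(0,1) statistic under the null of no
        treatment effect on any endpoint (equal weights; a generic sketch)."""
        u = np.asarray(u, dtype=float)
        w = np.ones_like(u)                  # equal weights across endpoints
        return (w @ u) / np.sqrt(w @ V @ w)

    # Example with two correlated endpoints (e.g., progression and death)
    u = np.array([1.8, 2.1])                 # per-endpoint score statistics
    V = np.array([[1.0, 0.6],                # correlation 0.6 between endpoints
                  [0.6, 1.0]])
    print(f"global z = {global_score_statistic(u, V):.3f}")
    ```

    Because the endpoints are positively correlated, the denominator exceeds √2, so the global statistic is smaller than it would be if the two endpoints were naively treated as independent.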

  1. Radiation accidents.

    PubMed

    Saenger, E L

    1986-09-01

    It is essential that emergency physicians understand how to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides in order to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome; prophylactic antibiotics are desirable. For severely exposed patients, treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to the tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries; the contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome, and more than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity. PMID:3526994

  2. A probabilistic analysis of a catastrophic transuranic waste hoist accident at the WIPP

    SciTech Connect

    Greenfield, M.A.; Sargent, T.J.

    1993-06-01

    This report builds upon the extensive and careful analyses made by the DOE of the probability of failure of the waste hoist, and more particularly on the probability of failure of a major component, the hydraulic brake system. The extensive fault tree analysis prepared by the DOE was the starting point of the present report. A key element of this work is the use of probability distributions rather than so-called point estimates to describe the probability of failure of an element. One of the authors (MAG) developed the expressions for the probability of failure of the brake system. The second author (TJS) executed the calculations of the final expressions for failure probabilities. The authors hope that this work will be of use to the DOE in its evaluation of the safety of the waste hoist, a key element at the WIPP.
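
    The distinction between point estimates and probability distributions can be illustrated with a small Monte Carlo sketch. The structure (two redundant brake channels in parallel, in series with a mechanical component) and the lognormal parameters are invented for illustration and do not represent the actual WIPP hoist fault tree.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Hypothetical per-demand failure probabilities with lognormal uncertainty
    # (median, error factor EF = 95th percentile / median); NOT the WIPP values.
    def lognormal_samples(median, error_factor, size):
        sigma = np.log(error_factor) / 1.645   # EF defined at the 95th percentile
        return rng.lognormal(np.log(median), sigma, size)

    p_brake_channel = lognormal_samples(1e-3, 5.0, N)   # each redundant brake channel
    p_mechanical    = lognormal_samples(1e-5, 10.0, N)  # series mechanical component

    # System failure: both brake channels fail (parallel) OR the mechanical part fails.
    p_system = p_brake_channel**2 + p_mechanical - p_brake_channel**2 * p_mechanical

    print(f"mean     : {p_system.mean():.3e}")
    print(f"median   : {np.median(p_system):.3e}")
    print(f"95th pct : {np.quantile(p_system, 0.95):.3e}")
    ```

    The point worth noticing is that the mean of the propagated distribution can sit well above the value obtained by pushing medians through the same fault tree, which is precisely the argument for using distributions rather than point estimates.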

  3. Analysis of avalanche risk factors in backcountry terrain based on usage frequency and accident data in Switzerland

    NASA Astrophysics Data System (ADS)

    Techel, F.; Zweifel, B.; Winkler, K.

    2015-09-01

    Recreational activities in snow-covered mountainous terrain in the backcountry account for the vast majority of avalanche accidents. Studies analyzing avalanche risk mostly rely on accident statistics without considering exposure (the elements at risk), i.e., how many people are recreating, and when and where, since data on recreational activity in the winter mountains are scarce. To fill this gap, we explored volunteered geographic information on two social media mountaineering websites, bergportal.ch and camptocamp.org. Based on these data, we present a spatiotemporal pattern of winter backcountry touring activity in the Swiss Alps and compare it with accident statistics. Geographically, activity was concentrated in Alpine regions relatively close to the main Swiss population centers in the west and north. In contrast, accidents occurred equally often in the less-frequented inner-alpine regions. Weekends, weather and avalanche conditions influenced the number of recreationists, while the odds of being involved in a severe avalanche accident did not depend on weekends or weather conditions. However, the likelihood of being involved in an accident increased with increasing avalanche danger level, and also with a more unfavorable snowpack containing persistent weak layers (also referred to as an old snow problem). In fact, the most critical situation for backcountry recreationists and professionals occurred on days and in regions where the avalanche danger was critical and the snowpack contained persistent weak layers. The frequently occurring geographical pattern of a more unfavorable snowpack structure also explains the relatively high proportion of accidents in the less-frequented inner-alpine regions. These results have practical implications: avalanche forecasters should clearly communicate both the avalanche danger and the avalanche problem to the backcountry user, particularly when persistent weak layers are of concern. Professionals and recreationists, on the other hand, require the expertise to adjust tour planning and their backcountry travel behavior to the avalanche danger and the avalanche problem.

  4. Factors Associated with Fatal Occupational Accidents among Mexican Workers: A National Analysis

    PubMed Central

    Gonzalez-Delgado, Mery; Gómez-Dantés, Héctor; Fernández-Niño, Julián Alfredo; Robles, Eduardo; Borja, Víctor H.; Aguilar, Miriam

    2015-01-01

    Objective To identify the factors associated with fatal occupational injuries in Mexico in 2012 among workers affiliated with the Mexican Social Security Institute. Methods Analysis of secondary data using information from the National Occupational Risk Information System, with the consequence of the occupational injury (fatal versus non-fatal) as the response variable. The analysis included 406,222 non-fatal and 1,140 fatal injuries from 2012. The factors associated with the lethality of the injury were identified using a logistic regression model with the Firth approach. Results Being male (OR=5.86; CI95%: 4.22-8.14), age (OR=1.04; CI95%: 1.03-1.06), employed in the position for 1 to 10 years (versus less than 1 year) (OR=1.37; CI95%: 1.15-1.63), working as a facilities or machine operator or assembler (OR=3.28; CI95%: 2.12-5.07) and being a worker without qualifications (OR=1.96; CI95%: 1.18-3.24) (versus an office worker) were associated with fatality in the event of an injury. Additionally, companies classified as maximum risk (OR=1.90; CI95%: 1.38-2.62), workplace conditions (OR=7.15; CI95%: 3.63-14.10) and factors related to the work environment (OR=9.18; CI95%: 4.36-19.33) were identified as risk factors for fatality in the event of an occupational injury. Conclusions Fatality in the event of an occupational injury is associated with factors related to sociodemographics (age, sex and occupation), the work environment and workplace conditions. Worker protection policies should be created for groups with a higher risk of fatal occupational injuries in Mexico. PMID:25790063

  5. Quantifying reactor safety margins: Application of CSAU (Code Scalability, Applicability and Uncertainty) methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    SciTech Connect

    Wulff, W.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Levy, S.; Rohatgi, U.S.; Wilson, G.E.; Zuber, N.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters have been used to determine the uncertainty ranges of code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as is described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scalability, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs.
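
    A stripped-down version of the statistical step described above: sample the uncertain inputs over their assessed ranges, evaluate the code response (here replaced by an illustrative response surface, since the real step runs TRAC), and build the distribution of Peak Clad Temperature. The response surface and input ranges below are placeholders, not CSAU values; the 2200°F (1478 K) acceptance limit is the regulatory PCT limit.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    N = 50_000

    # Uncertain inputs sampled over illustrative assessed ranges (not CSAU's).
    peaking  = rng.uniform(0.95, 1.10, N)   # power peaking factor multiplier
    htc      = rng.uniform(0.80, 1.20, N)   # reflood heat-transfer multiplier
    gap_cond = rng.uniform(0.70, 1.30, N)   # fuel-clad gap conductance multiplier

    # Placeholder response surface standing in for TRAC-PF1/MOD1 (Kelvin).
    pct = (1100.0 + 400.0 * (peaking - 1.0)
                  - 250.0 * (htc - 1.0)
                  - 80.0 * (gap_cond - 1.0))

    limit_K = 1478.0   # 2200 F peak clad temperature acceptance limit
    print(f"mean PCT        : {pct.mean():7.1f} K")
    print(f"95th percentile : {np.quantile(pct, 0.95):7.1f} K")
    print(f"margin to limit : {limit_K - np.quantile(pct, 0.95):7.1f} K")
    ```

    In the actual CSAU application, the upper percentile of this distribution compared against the acceptance limit is what quantifies the safety margin for the postulated accident.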

  6. Input-output model for MACCS nuclear accident impacts estimation

    SciTech Connect

    Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N

    2015-01-27

    Since the original economic model for MACCS was developed, better-quality economic data (as well as the tools to gather and process them) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.
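
    The Input-Output calculation at the core of this approach can be sketched directly. With a technical-coefficients matrix A and final demand d, gross output is x = (I − A)⁻¹ d, and an accident's indirect losses follow from shocking the final demand of the disrupted sectors. The three-sector matrix below is invented for illustration and is not REAcct data.

    ```python
    import numpy as np

    # Illustrative 3-sector technical coefficients matrix (NOT REAcct data):
    # entry A[i, j] = input required from sector i per dollar of sector j output.
    A = np.array([[0.10, 0.20, 0.05],
                  [0.15, 0.10, 0.10],
                  [0.05, 0.15, 0.20]])
    d = np.array([100.0, 150.0, 80.0])    # baseline final demand, $M

    L = np.linalg.inv(np.eye(3) - A)      # Leontief inverse
    x_base = L @ d                        # baseline gross output by sector

    # Accident scenario: sector 0 loses 30% of final demand for the outage period.
    d_shock = d * np.array([0.70, 1.00, 1.00])
    x_shock = L @ d_shock

    loss = x_base - x_shock               # output loss propagated through all sectors
    print("output loss by sector ($M):", np.round(loss, 2))
    print(f"total output loss ($M): {loss.sum():.2f}")
    ```

    The Leontief inverse spreads the direct shock to supplying sectors, which is how an Input-Output model captures the indirect business disruptions that a purely direct-loss estimate would miss.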

  7. 3RD WP PROBABILISTIC CRITICALITY ANALYSIS: METHODOLOGY FOR BASKET DEGRADATION WITH APPLICATION TO COMMERCIAL SNF

    SciTech Connect

    P. Goulib

    1997-09-15

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development (WPD) department to describe the latest version of the probabilistic criticality analysis methodology and its application to the entire commercial waste stream of commercial pressurized water reactor (PWR) spent nuclear fuel (SNF) expected to be emplaced in the repository. The purpose of this particular application is to evaluate the 21-assembly PWR absorber-plate waste package (WP) with respect to degraded-mode criticality performance. The degradation of principal concern is of the borated stainless steel absorber plates, which are part of the waste package basket and which constitute a major part of the waste package criticality control. The degradation (corrosion, dissolution) of this material will result in the release of most of the boron from the waste package and increase the possibility of criticality. The results of this evaluation will be expressed in terms of the fraction of the PWR SNF which can exceed a given k{sub eff} as a function of time, and the peak value of that fraction over a time period up to several hundred thousand years. The ultimate purpose of this analysis is to support the waste package design, which defines waste packages to cover a range of SNF characteristics. In particular, with respect to PWR criticality the current categories are: (1) no specific criticality control material, (2) borated stainless steel plates in the waste package basket, and (3) zirconium-clad boron carbide control rods (Ref. 5.4). The results of this analysis will indicate the coverage provided by the first two categories. With these results, this study will provide the first quantitative estimate of the benefit expected from the control measure consisting of borated stainless steel plates. This document is the third waste package probabilistic criticality analysis. The first two analyses (Ref. 5.12 and Ref. 5.15, respectively) were based primarily on the waste package criticality of the design basis fuel, with a limited extension to SNF with other characteristics, and that only in terms of an infinite array (k{sub {infinity}} versus k{sub eff}) for the other fuel types. The previous analyses were also limited in their coverage of the range of possible input parameter values (material corrosion rates and the rate of water ingress into the waste package). This analysis will show the sensitivity of criticality performance with respect to variations in the distribution of these parameters.

  8. Analysis Methodology for Optimal Selection of Ground Station Site in Space Missions

    NASA Astrophysics Data System (ADS)

    Nieves-Chinchilla, J.; Farjas, M.; Martínez, R.

    2013-12-01

    Optimization of ground station sites is especially important in complex missions that include several small satellites (clusters or constellations), such as the QB50 project, where one ground station must be able to track several space vehicles, even simultaneously. The design of the communication system therefore has to carefully take into account the ground station site and the relevant signal phenomena, which depend on the frequency band. These aspects become even more relevant for establishing a trusted communication link when the ground segment sits in an urban area and/or low orbits are selected for the space segment. In addition, updated cartography with high-resolution data of the location and its surroundings helps to develop siting recommendations for space vehicle tracking and hence to improve effectiveness. The objectives of this analysis methodology are: completion of the cartographic information; modelling of the obstacles that hinder communication between the ground and space segments; and representation, in the generated 3D scene, of the degree of signal/noise impairment caused by the phenomena that interfere with communication. The integration of new geographic data capture technologies, such as 3D laser scanning, means that further optimization of the antenna elevation mask, at its AOS and LOS azimuths along the visible horizon, maximizes visibility time with the space vehicles. Furthermore, from the captured three-dimensional point cloud, specific information is selected and, using 3D modeling techniques, the 3D scene of the antenna site and its surroundings is generated. The resulting 3D model reveals nearby obstacles related to the cartographic conditions, such as mountain formations and buildings, and any additional obstacles that interfere with the operational quality of the antenna (other antennas and electronic devices that emit or receive in the same bandwidth). To test the proposed ground station site, this analysis methodology uses space mission simulation software to analyze and quantify how the geographic accuracy of the positions of the space vehicles along the horizon visible from the antenna increases communication time with the ground station. Experimental results obtained from a ground station located at ETSIT-UPM in Spain (QBito Nanosatellite, UPM spacecraft mission within the QB50 project) show that selection of the optimal site increases the field of view from the antenna and hence helps to meet mission requirements.
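
    The payoff of optimizing the elevation mask can be quantified with a simple pass-visibility sketch: the satellite is visible whenever its elevation exceeds the mask value at the current azimuth. The mask profile and the pass geometry below are made up for illustration.

    ```python
    import numpy as np

    # Hypothetical azimuth-dependent elevation mask (deg), e.g. derived from a
    # 3D laser scan of surroundings: higher mask values mean taller obstacles.
    mask_az = np.arange(0, 360, 10)                                 # azimuth grid, deg
    mask_el = 5.0 + 10.0 * np.exp(-((mask_az - 120) / 40.0) ** 2)   # obstacle near az=120

    def mask_at(azimuth_deg):
        return np.interp(azimuth_deg % 360, mask_az, mask_el)

    # Toy pass: azimuth sweeps 60 -> 180 deg while elevation rises to 35 deg and falls.
    t = np.linspace(0.0, 600.0, 601)           # seconds, 1 s sampling
    az = 60.0 + 0.2 * t
    el = 35.0 * np.sin(np.pi * t / 600.0)

    visible = el > mask_at(az)
    print(f"visible time: {int(visible.sum())} s of {t.size - 1} s pass")
    ```

    Lowering the mask at the AOS and LOS azimuths directly lengthens the usable portion of each pass, which is the quantity the mission simulation software referred to above evaluates against the real antenna surroundings.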

  9. In-Containment Thermal-hydraulic and Aerosol Behaviour during Severe Accidents: Analysis of the PHEBUS-FPT2 Experiment

    SciTech Connect

    Herranz, Luis E.; Fontanet, Joan; Vela-Garcia, Monica

    2006-07-01

    Ongoing work in the area of development and validation of severe accident computer codes is, and will remain, highly valuable for safety analyses of some Generation III, III+ and even Generation IV designs. In the PHEBUS-FPT2 experiment, a realistic source of nuclear aerosols was generated in the core and transported through a mock-up of the primary circuit to a containment vessel, where weak condensing conditions were imposed in a largely unsaturated atmosphere. The experimental scenario was modeled using CONTAIN 2.0, MELCOR 1.8.5 and ASTEC 1.1. All the codes share similar characteristics and approached the experimental scenario in a fairly simple way. The same assumptions were made, and the only major difference was the three-cell nodalization of the vessel in the case of ASTEC 1.1 (a single cell was used in CONTAIN and MELCOR). No major code-to-code differences stemmed from the different meshing schemes used in the vessel modeling. However, some minor differences were observed between ASTEC and the American codes in variables such as gas temperature and settled mass. The agreement of the code estimates with the available data can be considered acceptable. Slight discrepancies found in the steam partial pressure seem to indicate that the codes over-estimated the steam condensation rate during the first 2000 s; potential uncertainties in surface temperature could well explain this. The overall evolution of airborne aerosols was satisfactorily predicted. However, all the codes noticeably overestimate sedimentation, and sensitivity studies carried out on particle size, shape and density indicate that uncertainties in those variables cannot justify the magnitude of the deviation found. (authors)

  10. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume I. Data analysis methodology and hardware description

    SciTech Connect

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV /sup 241/Pu and 208-keV /sup 237/U peaks are required for the spectral analysis, which gives plutonium isotopic ratios and weight-percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operating instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  11. CHEMICAL TRANSFORMATIONS IN ACID RAIN. VOLUME 1. NEW METHODOLOGIES FOR SAMPLING AND ANALYSIS OF GAS-PHASE PEROXIDE

    EPA Science Inventory

    New methodologies for the sampling and analysis of gas-phase peroxides (H2O2 and organic peroxides) using (a) diffusion denuder tubes and (b) gas-to-liquid transfer with prior removal of ozone have been investigated. The purpose was to develop an interference-free method for dete...

  12. How to Identify E-Learning Trends in Academic Teaching: Methodological Approaches and the Analysis of Scientific Discourses

    ERIC Educational Resources Information Center

    Fischer, Helge; Heise, Linda; Heinz, Matthias; Moebius, Kathrin; Koehler, Thomas

    2015-01-01

    Purpose: The purpose of this paper is to introduce methodology and findings of a trend study in the field of e-learning. The overall interest of the study was the analysis of scientific e-learning discourses. What comes next in the field of academic e-learning? Which e-learning trends dominate the discourse at universities? Answering such…

  13. Thrust Network Analysis: A New Methodology for Three-Dimensional Equilibrium (Journal of the International Association for Shell and Spatial Structures: J. IASS)

    E-print Network

    Lygeros, John

    Thrust Network Analysis: A New Methodology for Three-Dimensional Equilibrium. Philippe Block (Research Assistant, ph_block@mit.edu) and John Ochsendorf (Associate Professor, jao@mit.edu), Building Technology Program. Journal of the International Association for Shell and Spatial Structures (J. IASS).

  14. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  15. A Classification System for 2-Year Postsecondary Institutions. Methodology Report. Postsecondary Education Descriptive Analysis Reports.

    ERIC Educational Resources Information Center

    Phipps, Ronald A.; Shedd, Jessica M.; Merisotis, Jamie P.

    This methodology report by the National Center for Education Statistics (NCES) outlines the need and rationale for a two-year postsecondary classification system and the methodology used to produce this classification system. The system was created based on information from the Integrated Postsecondary Education Data System (IPEDS) database that…

  16. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    SciTech Connect

    Zain, Zakiyah; Ahmad, Yuhaniz; Azwan, Zairul; Raduan, Farhana; Sagap, Ismail; Aziz, Nazrina

    2014-12-04

    Colorectal cancer is the third most common cancer worldwide in men and the second in women, and the second most common in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for the treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, the sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.

  17. A New Approximate Fracture Mechanics Analysis Methodology for Composites with a Crack or Hole

    NASA Technical Reports Server (NTRS)

    Tsai, H. C.; Arocho, A.

    1990-01-01

    A new approximate theory was developed which links the inherent flaw concept with the theory of crack-tip stress singularities at a bi-material interface. Three assumptions were made: (1) an inherent flaw (i.e., damage zone) exists at the tip of the crack; (2) fracture of a filamentary composite initiates at a crack lying in the matrix material at the matrix/filament interface; and (3) the laminate fails whenever the principal load-carrying laminae fail. The third assumption implies that, for a laminate consisting of 0-degree plies, matrix cracks perpendicular to the 0-degree filaments are the triggering mechanism for final failure. Based on this theory, a parameter bar K{sub Q}, which is similar to the stress intensity factor for isotropic materials but has a different dimension, was defined. Utilizing existing test data, it was found that bar K{sub Q} can be treated as a material constant. Based on this finding, a fracture mechanics analysis methodology was developed. The analytical results correlate well with test results. This new approximate theory applies to both brittle- and metal-matrix composite laminates with a crack or hole.

  18. A methodological framework for hydromorphological assessment, analysis and monitoring (IDRAIM) aimed at promoting integrated river management

    NASA Astrophysics Data System (ADS)

    Rinaldi, M.; Surian, N.; Comiti, F.; Bussettini, M.

    2015-12-01

    A methodological framework for hydromorphological assessment, analysis and monitoring (named IDRAIM) has been developed with the specific aim of supporting the management of river processes by integrating the objectives of ecological quality and flood risk mitigation. The framework builds on existing and up-to-date geomorphological concepts and approaches and has been tested on several Italian streams. The framework includes the following four phases: (1) catchment-wide characterization of the fluvial system; (2) evolutionary trajectory reconstruction and assessment of current river conditions; (3) description of future trends of channel evolution; and (4) identification of management options. The framework provides specific consideration of the temporal context, in terms of reconstructing the trajectory of past channel evolution as a basis for interpreting present river conditions and future trends. A series of specific tools has been developed for the assessment of river conditions, in terms of morphological quality and channel dynamics. These include: the Morphological Quality Index (MQI), the Morphological Dynamics Index (MDI), the Event Dynamics Classification (EDC), and the river morphodynamic corridors (MC and EMC). The monitoring of morphological parameters and indicators, alongside the assessment of future scenarios of channel evolution provides knowledge for the identification, planning and prioritization of actions for enhancing morphological quality and risk mitigation.

  19. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant; an additional 11 graduate students were impacted and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future) and 67 papers or reports, mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA-inspired or NASA-supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, industry and other state and federal institutions to continue efforts in the direction of multidisciplinary and, recently, multi-objective design and analysis. The specific problem addressed is component packaging, which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the methodology developed is presently under investigation. Research and classes on teaming issues resulted in the publication of a web site (http://design.eng.clemson.edu/psych4991) which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.

  20. Scaling analysis of ocean surface turbulent heterogeneities from satellite remote sensing: a methodological study.

    NASA Astrophysics Data System (ADS)

    Pannimpullath Remanan, Renosh; Schmitt, Francois; Loisel, Hubert

    2015-04-01

    Satellite remote sensing observations allow the ocean surface to be sampled synoptically over large spatio-temporal scales. The images provided by ocean colour satellites are widely used in physical, biological and ecological oceanography. The present work proposes a method for understanding the multi-scaling properties of satellite ocean colour products such as Chlorophyll-a (Chl-a) and Sea Surface Temperature (SST), which are rarely studied. The specific objectives of this study are to show how the small-scale heterogeneities of satellite images can be characterized using tools borrowed from the field of turbulence, and how these patterns are related to environmental conditions. For that purpose, we show how the structure function, which is classical for scaling time-series analysis, can also be used in 2D. The main advantage of this method is that it can be applied to images which have missing data. Using a simulation and two real images taken as examples, we show that coarse-graining (CG) of a gradient-modulus transform of the original image does not provide correct scaling exponents. Using a 2D fractional Brownian simulation, we show that the structure function (SF) can be used with randomly sampled pairs of points, and verify that one million pairs of points provide sufficient statistics. We illustrate this methodology using two satellite images chosen as examples.
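
    A minimal version of the randomly sampled 2D structure function: draw random pairs of pixels, bin them by separation distance, and average |ΔI|^q within each bin, skipping pairs that hit missing data. Parameters (image size, number of pairs, moment order) are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def structure_function_2d(img, n_pairs=200_000, q=2, n_bins=20):
        """Estimate S_q(r) = <|I(x2) - I(x1)|^q> from randomly sampled pixel
        pairs, skipping pairs that touch NaN (missing) data -- the property
        that makes this estimator usable on gappy satellite images."""
        ny, nx = img.shape
        y1, x1 = rng.integers(0, ny, n_pairs), rng.integers(0, nx, n_pairs)
        y2, x2 = rng.integers(0, ny, n_pairs), rng.integers(0, nx, n_pairs)
        v1, v2 = img[y1, x1], img[y2, x2]
        ok = ~(np.isnan(v1) | np.isnan(v2))        # drop pairs with missing data
        r = np.hypot(y2[ok] - y1[ok], x2[ok] - x1[ok])
        dI = np.abs(v2[ok] - v1[ok]) ** q
        bins = np.logspace(0.0, np.log10(r.max()), n_bins)
        idx = np.digitize(r, bins)
        S = np.array([dI[idx == i].mean() if np.any(idx == i) else np.nan
                      for i in range(1, n_bins)])
        return bins[1:], S

    # Example: synthetic correlated field with 30% of pixels masked out
    img = np.cumsum(np.cumsum(rng.standard_normal((256, 256)), axis=0), axis=1)
    img[rng.random(img.shape) < 0.3] = np.nan
    r, S = structure_function_2d(img)
    ```

    A log-log fit of S_q(r) against r over the scaling range then yields the exponent ζ(q), whose nonlinearity in q is the signature of multifractal heterogeneity.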

  1. RADON AND PROGENY ALPHA-PARTICLE ENERGY ANALYSIS USING NUCLEAR TRACK METHODOLOGY

    SciTech Connect

    Espinosa Garcia, Guillermo; Golzarri y Moreno, Dr. Jose Ignacio; Bogard, James S

    2008-01-01

    A preliminary procedure for alpha energy analysis of radon and progeny using Nuclear Track Methodology (NTM) is described in this paper. The method is based on the relationship between alpha-particle energies deposited in polycarbonate material (CR-39) and the track size developed after a well-established chemical etching process. Track geometry, defined by parameters such as major or minor diameters, track area and overall track length, is shown to correlate with alpha-particle energy over the range 6.00 MeV (218Po) to 7.69 MeV (214Po). Track features are measured and the data analyzed automatically using a digital imaging system and commercial PC software. Examination of particle track diameters in CR-39 exposed to environmental radon reveals a multi-modal distribution. Locations of the maxima in this distribution are highly correlated with alpha particle energies of radon daughters, and the distributions are sufficiently resolved to identify the radioisotopes. This method can be useful for estimating the radiation dose from indoor exposure to radon and its progeny.

  2. A Methodology for the Analysis of Spontaneous Reactions in Automated Hearing Assessment.

    PubMed

    Fernandez, Alba; Ortega, Marcos; Gonzalez Penedo, Manuel; Vazquez, Covadonga; Gigirey, Luz

    2014-10-01

    Audiology is the science of hearing and the study of auditory processes. The evaluation of hearing capacity is commonly performed by an audiologist using an audiometer, with the patient asked to give some kind of sign when he or she recognizes the stimulus. This evaluation becomes much more complicated when the patient suffers some type of cognitive decline that hinders the emission of visible signs of recognition. With this group of patients a typical question-answer interaction is not applicable, so the audiologist must focus his or her attention on the patient's spontaneous gestural reactions. This manual evaluation entails a number of problems: it is highly subjective, and it is difficult to carry out in real time, since the expert must pay attention simultaneously to the audiological process and to the patient's reactions. Considering this, we present in this paper an automatic methodology for processing video sequences recorded during the hearing test, in order to assist the audiologist in the detection of these spontaneous reactions. This screening method analyzes the movements that occur within the eye area, which audiologists have pointed out as the most representative for these patients. By analyzing these movements, the system helps the audiologist to determine when a positive gestural reaction has taken place, increasing objectivity and reproducibility. PMID:25296408

  3. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    NASA Astrophysics Data System (ADS)

    Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz; Azwan, Zairul; Raduan, Farhana; Sagap, Ismail

    2014-12-01

    Colorectal cancer is the third most common cancer worldwide in men and the second in women, and the second most common in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for the treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, the sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.

  4. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    SciTech Connect

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions:
    - What percentage of the chemicals in the CMM Rev 27 database is associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set?
    - What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture?
    - What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach, and how would those improvements increase the benefit of using it?
    - What is the Target Organ System Effect approach, and how can it be used to improve upon the current HCN-based approach? How do the benefits users would derive from the Target Organ System approach compare to those available from the current HCN-based approach?
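
    For readers unfamiliar with the Hazard Index mentioned above: a mixture HI is typically computed by summing, within each health effect group, the ratios of each chemical's concentration to its exposure limit. The sketch below illustrates that generic pattern only; the chemicals, limits, and HCN groupings are invented and are not values from the CMM Rev 27 database.

    ```python
    from collections import defaultdict

    # Hypothetical mixture: (chemical, concentration mg/m3, limit mg/m3, HCN group).
    # All numbers are illustrative, not CMM Rev 27 data.
    mixture = [
        ("chem_A", 2.0, 10.0, "HCN 4.2 (respiratory irritation)"),
        ("chem_B", 0.5,  2.0, "HCN 4.2 (respiratory irritation)"),
        ("chem_C", 1.0, 50.0, "HCN 7.5 (liver toxicity)"),
    ]

    # Sum concentration/limit ratios within each Health Code Number group.
    hi_by_hcn = defaultdict(float)
    for name, conc, limit, hcn in mixture:
        hi_by_hcn[hcn] += conc / limit

    for hcn, hi in hi_by_hcn.items():
        flag = "exceeds guidance" if hi > 1.0 else "below guidance"
        print(f"{hcn}: HI = {hi:.2f} ({flag})")
    ```

    Grouping the ratios by HCN is what lets the method distinguish chemicals acting on the same target from those acting on different ones, instead of conservatively summing everything into one number.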

  5. Estimate of radionuclide release characteristics into containment under severe accident conditions. Final report

    SciTech Connect

    Nourbakhsh, H.P.

    1993-11-01

    A detailed review of the available light water reactor source term information is presented as a technical basis for the development of updated source terms into the containment under severe accident conditions. Simplified estimates of radionuclide release and transport characteristics are specified for each unique combination of reactor coolant and containment system designs. A quantitative uncertainty analysis of the release to the containment using the NUREG-1150 methodology is also presented.

  6. TU-C-BRE-09: Performance Comparisons of Patient Specific IMRT QA Methodologies Using ROC Analysis

    SciTech Connect

    McKenzie, E; Balter, P; Stingo, F; Followill, D; Kry, S; Jones, J

    2014-06-15

    Purpose: To evaluate the ability of a selection of patient-specific QA methods to accurately classify IMRT plans as acceptable or unacceptable, based on a multiple ion chamber (MIC) phantom. Methods: Twenty-four IMRT plans were selected (20 had previously failed the institutional QA) and were measured on a MIC phantom to assess their dosimetric acceptability. These same plans were then measured using film (Kodak EDR2) and ion chamber (Wellhofer cc04), ArcCheck (Sun Nuclear), and MapCheck (Sun Nuclear) (delivered AP field-by-field, AP composite, and with original gantry angles). All gamma analyses were performed at 2%/2mm, 3%/3mm, and 5%/3mm. Using the MIC results as a gold standard, the sensitivity and specificity were calculated across a range of cut-off thresholds (% pixels passing for gamma analysis, and % dose difference for ion chamber) and used to form ROC curves. Area under the curve (AUC) was used as the metric to quantify the performance of the various QA methods. Results: Grouping the devices' AUCs revealed two statistically significantly different groups: ion chamber (AUC of 0.94), AP composite MapCheck (AUC of 0.85), ArcCheck (AUC of 0.84), and film (AUC of 0.82) formed the better-performing group, while original-gantry-angle and AP field-by-field MapCheck (AUC of 0.65 and 0.66, respectively) agreed less well with the gold-standard results. Optimal cut-offs were also assessed using the ROC curves. We found that while 90% of pixels passing is often used as a criterion, the differing sensitivities of QA methods can lead to device- and methodology-specific optimal cutoff thresholds. Conclusion: While many methods exist to perform the same task of patient-specific IMRT QA, they utilize different strategies. This work has shown that there are inconsistencies in these methodologies in terms of their sensitivity and specificity to dosimetric acceptability. This work was supported by Public Health Service grants CA010953, CA081647, and CA21661 awarded by the National Cancer Institute, United States Department of Health and Human Services.
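
    The ROC construction used above is straightforward to reproduce. Given a gold-standard pass/fail label per plan and a continuous QA score (percent of pixels passing gamma), sweep the cutoff, compute sensitivity and specificity, and integrate. A minimal sketch with invented data, using scikit-learn:

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    # Invented example data: 1 = dosimetrically acceptable per the ion-chamber
    # gold standard, 0 = unacceptable; scores are % pixels passing gamma 3%/3mm.
    truth  = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0])
    scores = np.array([99.1, 97.4, 95.0, 93.2, 90.5, 94.0,
                       88.0, 85.5, 91.0, 80.2, 76.9, 89.5])

    fpr, tpr, thresholds = roc_curve(truth, scores)
    print(f"AUC = {auc(fpr, tpr):.3f}")

    # Optimal cutoff by Youden's J = sensitivity + specificity - 1
    j = tpr - fpr
    best = np.argmax(j)
    print(f"optimal cutoff: {thresholds[best]:.1f}% pixels passing "
          f"(sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f})")
    ```

    Comparing AUCs across devices, as the authors did, then reduces to repeating this computation with each device's score in place of `scores`.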

  7. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    SciTech Connect

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. Two of EERE’s major programs include the Building Technologies Program (BT) and Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs that is necessary to provide input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits produced by technologies and practices supported by BT and by WIP. However, the approach is general enough for analysis of buildings-related technologies, independent of any specific program. An overview describes the GPRA process and the models used to estimate energy savings. The body of the document describes the algorithms used and the diffusion curve estimates.

  8. Highly resolved thermal analysis as a tool for soil organic carbon fractionation - methodological considerations

    NASA Astrophysics Data System (ADS)

    Heitkamp, Felix; Vuong, Xuan; Reimer, Andreas; Jungkunst, Hermann

    2015-04-01

    Organic carbon (OC) in environmental samples consists of a continuum of molecules with different chemistry and turnover. Thermal methods provide a useful tool to differentiate OC fractions according to their activation energies: the higher the temperature needed for combustion, the higher the activation energy and the lower the energy gain for microorganisms in the decomposition process. However, until now there has been no method able to quantify organic carbon fractions as well as total, organic and inorganic carbon in one analytical run. Here, we present methodological tests regarding the effects of (1) ramp speed (12 vs. 35°C min-1), (2) introduction of temperature plateaus (holds) for better peak separation, and (3) sample amount, all of which potentially affect the results of thermal analysis. The instrument used is an MCD RC-412 (Leco Corporation) with highly resolved IR detection of CO2 (3 times per second) during ramped combustion. Regression analysis of the two ramp speeds showed that the outcome of the analysis was not affected: the intercept was not significantly different from 0 (0.14 ± 3.15, p = 0.961) and the slope was not significantly different from 1 (0.996 ± 0.0094, p = 0.969). A ramp speed of 35°C min-1 is preferred because of the decreased analysis time. Performing analytical runs with and without holds showed again that the intercept was not significantly different from 0 (-1.40 ± 1.14, p = 0.232) and the slope did not differ significantly from 1 (1.081 ± 0.042, p = 0.067). Inclusion of a hold increases confidence in the results due to better peak separation. However, this was only tested for a range of different soils; care should be taken in transferring the results to other environmental media, and soil types not yet tested should be checked specifically. The amount of sample had some effect, especially when using more than 20 mg of sample. Thus, the sample amount should be kept low, which calls for excellent homogenization of the sample material. Overall, the MCD RC-412 with the tested setup is a suitable alternative for soil carbon analysis, providing a higher amount of information compared with bulk carbon measurements.
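
    The slope and intercept tests reported above correspond to a standard ordinary least-squares comparison. The sketch below reproduces that test structure with synthetic data in place of the authors' measurements: it regresses the results of one ramp speed on the other and t-tests the intercept against 0 and the slope against 1.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        # Synthetic stand-ins for paired OC results at the two ramp speeds.
        x = rng.uniform(50, 300, 30)            # e.g., results at 12 °C/min
        y = x + rng.normal(0, 3, 30)            # e.g., results at 35 °C/min

        res = stats.linregress(x, y)
        n = len(x)
        t_intercept = (res.intercept - 0.0) / res.intercept_stderr
        t_slope = (res.slope - 1.0) / res.stderr
        p_intercept = 2 * stats.t.sf(abs(t_intercept), df=n - 2)
        p_slope = 2 * stats.t.sf(abs(t_slope), df=n - 2)

        print(f"intercept = {res.intercept:.2f} (p vs 0: {p_intercept:.3f})")
        print(f"slope     = {res.slope:.3f} (p vs 1: {p_slope:.3f})")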

  9. Application of the ARRAMIS Risk and Reliability Software to the Nuclear Accident Progression

    SciTech Connect

    Wyss, Gregory D.; Daniel, Sharon L.; Hays, Kelly M.; Brown, Thomas D.

    1997-06-01

    The ARRAMIS risk and reliability analysis software suite developed by Sandia National Laboratories enables analysts to evaluate the safety and reliability of a wide range of complex systems whose failure would result in high consequences. This software was originally designed to model the systems, responses, and phenomena associated with potential severe accidents at commercial nuclear power reactors by solving very large fault tree and event tree models. However, because of its power and versatility, ARRAMIS and its constituent analysis engines have recently been used to evaluate a wide variety of systems, including nuclear weapons, telecommunications facilities, robotic material handling systems, and aircraft systems, using hybrid fault tree/event tree analysis techniques with fully integrated uncertainty analysis capabilities. This paper describes recent applications in the area of nuclear reactor accident progression analysis using a large event tree methodology and the ARRAMIS package.

  10. A real-time, dynamic early-warning model based on uncertainty analysis and risk assessment for sudden water pollution accidents.

    PubMed

    Hou, Dibo; Ge, Xiaofan; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2014-01-01

    A real-time, dynamic, early-warning model (EP-risk model) is proposed to cope with sudden water quality pollution accidents affecting downstream areas with raw-water intakes (denoted as EPs). The EP-risk model outputs the risk level of water pollution at the EP by calculating the likelihood of pollution and evaluating the impact of pollution. A generalized form of the EP-risk model for river pollution accidents based on Monte Carlo simulation, the analytic hierarchy process (AHP) method, and the risk matrix method is proposed. The likelihood of water pollution at the EP is calculated by the Monte Carlo method, which is used for uncertainty analysis of pollutants' transport in rivers. The impact of water pollution at the EP is evaluated by expert knowledge and the results of Monte Carlo simulation based on the analytic hierarchy process. The final risk level of water pollution at the EP is determined by the risk matrix method. A case study of the proposed method is illustrated with a phenol spill accident in China. PMID:24781332
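
    The likelihood-times-impact logic of the EP-risk model can be compressed into a short sketch. The example below is an illustration, not the published model: the transport representation, input distributions, water quality standard, and risk matrix are all simplified stand-ins. It samples an uncertain spill mass and river velocity, estimates the peak concentration at the intake with a one-dimensional instantaneous-release solution, converts the exceedance frequency into a likelihood category, and combines it with an impact category through a risk matrix.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 10_000

        # Uncertain inputs (hypothetical distributions, not the published ones).
        mass = rng.normal(500.0, 50.0, N)        # spilled pollutant mass, kg
        u = rng.normal(0.6, 0.1, N)              # river velocity, m/s
        D = 30.0                                 # longitudinal dispersion, m^2/s
        A = 200.0                                # river cross-section, m^2
        x = 20_000.0                             # distance to intake (EP), m

        # Peak concentration of a 1-D instantaneous release occurs near t = x/u.
        t = x / np.clip(u, 0.1, None)
        c_peak = mass / (A * np.sqrt(4 * np.pi * D * t))   # kg/m^3

        standard = 8.0e-4                                  # hypothetical standard, kg/m^3
        likelihood = np.mean(c_peak > standard)            # Monte Carlo exceedance probability

        # Risk matrix: rows = likelihood category, columns = impact category.
        def category(value, edges):
            return int(np.digitize(value, edges))          # 0 = low ... len(edges) = high

        risk_matrix = [["low", "low", "medium"],
                       ["low", "medium", "high"],
                       ["medium", "high", "high"]]
        impact_score = 0.7                                 # e.g., AHP-weighted impact in [0, 1]
        row = category(likelihood, [0.05, 0.25])
        col = category(impact_score, [0.3, 0.6])
        print(f"P(exceed standard) = {likelihood:.3f} -> risk level: {risk_matrix[row][col]}")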

  11. APET methodology for Defense Waste Processing Facility: Mode C operation

    SciTech Connect

    Taylor, R.P. Jr.; Massey, W.M.

    1995-04-01

    Safe operation of Savannah River Site (SRS) facilities continues to be the site's highest priority. One of these facilities, the Defense Waste Processing Facility (DWPF), is currently undergoing cold chemical runs to verify the design and construction preparatory to hot startup in 1995. The DWPF is a facility designed to convert the waste currently stored in tanks at the 200-Area tank farm into a form that is suitable for long-term storage in engineered surface facilities and, ultimately, geologic isolation. As a part of the program to ensure safe operation of the DWPF, a Probabilistic Safety Assessment of the DWPF has been completed. The results of this analysis are incorporated into the Safety Analysis Report (SAR) for the DWPF. The usual practice in the preparation of Safety Analysis Reports is to include only a conservative analysis of certain design basis accidents. A major part of a Probabilistic Safety Assessment is the development and quantification of an Accident Progression Event Tree (APET). The APET provides a probabilistic representation of the potential sequences along which an accident may progress. The methodology used to determine the risk of operation of the DWPF borrows heavily from methods applied to the Probabilistic Safety Assessment of SRS reactors and to some commercial reactors. This report describes the Accident Progression Event Tree developed for the Probabilistic Safety Assessment of the DWPF.
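
    Quantifying an event tree reduces to multiplying conditional branch probabilities along each path and binning the paths by end state. The sketch below illustrates that mechanic generically; the questions, branch probabilities, and end states are invented for illustration and are not DWPF APET values.

        from itertools import product
        from collections import defaultdict

        # Each question (node) has conditional branch probabilities summing to 1.
        # Names and numbers are hypothetical, for illustration only.
        questions = [
            ("release_occurs", {"yes": 0.01, "no": 0.99}),
            ("spray_works",    {"yes": 0.95, "no": 0.05}),
            ("filter_intact",  {"yes": 0.98, "no": 0.02}),
        ]

        def end_state(path):
            """Map a branch combination to a coarse outcome bin."""
            if path["release_occurs"] == "no":
                return "no_release"
            if path["spray_works"] == "yes" and path["filter_intact"] == "yes":
                return "mitigated_release"
            return "unmitigated_release"

        totals = defaultdict(float)
        names = [q[0] for q in questions]
        for combo in product(*(list(q[1].items()) for q in questions)):
            path = {name: branch for name, (branch, _) in zip(names, combo)}
            p = 1.0
            for _, prob in combo:
                p *= prob                       # multiply along the path
            totals[end_state(path)] += p

        for state, p in sorted(totals.items()):
            print(f"{state}: {p:.6f}")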

  12. Mitigative techniques and analysis of generic site conditions for ground-water contamination associated with severe accidents

    SciTech Connect

    Shafer, J.M.; Oberlander, P.L.; Skaggs, R.L.

    1984-04-01

    The purpose of this study is to evaluate the feasibility of using ground-water contaminant mitigation techniques to control radionuclide migration following a severe commercial nuclear power reactor accident. The two types of severe commercial reactor accidents investigated are: (1) containment basemat penetration of core melt debris which slowly cools and leaches radionuclides to the subsurface environment, and (2) containment basemat penetration of sump water without full penetration of the core mass. Six generic hydrogeologic site classifications are developed from an evaluation of reported data pertaining to the hydrogeologic properties of all existing and proposed commercial reactor sites. One-dimensional radionuclide transport analyses are conducted on each of the individual reactor sites to determine the generic characteristics of a radionuclide discharge to an accessible environment. Ground-water contaminant mitigation techniques that may be suitable, depending on specific site and accident conditions, for severe power plant accidents are identified and evaluated. Feasible mitigative techniques and associated constraints on feasibility are determined for each of the six hydrogeologic site classifications. The first of three case studies is conducted on a site located on the Texas Gulf Coastal Plain. Mitigative strategies are evaluated for their impact on contaminant transport and results show that the techniques evaluated significantly increased ground-water travel times. 31 references, 118 figures, 62 tables.
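
    The screening role of such one-dimensional transport analyses can be illustrated with a plug-flow estimate; the sketch below uses generic parameter values, not those of any of the six site classes in the report. It computes the retarded ground-water travel time to the accessible environment from Darcy's law and applies first-order radioactive decay over that travel time.

        import math

        # Hypothetical screening parameters (illustrative, not site-specific).
        K = 1.0e-5          # hydraulic conductivity, m/s
        i = 0.005           # hydraulic gradient, dimensionless
        n_e = 0.25          # effective porosity
        R = 50.0            # retardation factor (sorbing radionuclide)
        L = 500.0           # path length to accessible environment, m
        half_life_y = 30.0  # roughly Cs-137's half-life, years

        v = K * i / n_e                         # seepage (pore) velocity, m/s
        t_travel_s = L * R / v                  # retarded travel time, s
        t_travel_y = t_travel_s / (365.25 * 24 * 3600)

        lam = math.log(2) / half_life_y         # decay constant, 1/y
        attenuation = math.exp(-lam * t_travel_y)
        print(f"travel time ~ {t_travel_y:,.0f} y; fraction surviving transport ~ {attenuation:.2e}")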

  13. Solar Reserve Methodology for Renewable Energy Integration Studies Based on Sub-Hourly Variability Analysis: Preprint

    SciTech Connect

    Ibanez, E.; Brinkman, G.; Hummon, M.; Lew, D.

    2012-08-01

    Increasing penetrations of wind and solar energy are raising concerns among electric system operators because of the variability and uncertainty associated with these power sources. Previous work focused on the quantification of reserves for systems with wind power. This paper presents a new methodology that allows the determination of the reserves necessary for high penetrations of photovoltaic (PV) power and compares it to the wind-based methodology. The solar reserve methodology is applied to Phase 2 of the Western Wind and Solar Integration Study. A summary of the results is included.
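
    A common way to turn sub-hourly variability into a reserve requirement is to hold enough reserve to cover an extreme percentile of the short-term ramps of the solar power time series. The sketch below shows that generic calculation on synthetic data; it is not the paper's method, which conditions reserves on additional factors.

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic 10-minute PV output for one week, MW (illustrative only).
        steps = 7 * 24 * 6
        clear_sky = 100 * np.clip(np.sin(np.linspace(0, 14 * np.pi, steps)), 0, None)
        clouds = rng.uniform(0.5, 1.0, steps)
        pv = clear_sky * clouds

        # 10-minute ramps; down-ramps (drops in output) drive upward reserve needs.
        ramps = np.diff(pv)
        down_ramps = -ramps[ramps < 0]

        # Hold enough reserve to cover 95% of observed 10-minute down-ramps.
        reserve_mw = np.percentile(down_ramps, 95)
        print(f"10-min up-reserve requirement ~ {reserve_mw:.1f} MW")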

  14. Tsunami vulnerability analysis in the coastal town of Catania, Sicily: methodology and results

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Tinti, Stefano; Gallazzi, Sara; Tonini, Roberto; Zaniboni, Filippo

    2010-05-01

    Catania lies on the eastern coast of Sicily and is one of the most important towns in Sicily as regards history, tourism and industry. Recent analyses conducted in the frame of the project TRANSFER have shown that it is exposed not only to tsunamis generated locally, but also to distant tsunamis generated in the western Hellenic arc. In the frame of the European project SCHEMA, different scenarios covering local sources such as the 11 January 1693 event and the 1908 case as well as remote sources such as the 365 AD tsunami have been explored through numerical modelling in order to assess the vulnerability of the area to tsunami attacks. One of the primary outcomes of the scenario analysis is the quantification of the inundation zones (location, extension along the coast and landward). Taking the modelling results on flooding as input data, the analysis has focussed on the geomorphological characteristics of the coasts and on the building and infrastructure typology in order to evaluate the vulnerability level of the Catania area. The coast to the south of the harbour of Catania is low and characterized by a mild slope: topography reaches an altitude of 10 m at 300-750 m from the shoreline. Building density is low, and tourist structures generally prevail over residential houses. The zone north of the harbour is high coast, with the 10 m isoline usually close to the coastline and little possibility for a flood to penetrate deep inland. Here there are three small marinas with the corresponding services and infrastructure around them, and the city quarters consist of residential buildings. Vulnerability assessment has been carried out by following the methodology developed by the SCHEMA consortium, distinguishing between primary (type and material) and secondary criteria (e.g. ground, age, foundation, orientation, etc.) for buildings, and by adopting a building damage matrix that depends essentially on building type and water inundation depth. Data needed for this analysis have been retrieved from satellite images (e.g. Google) and validated through ad hoc local surveys with the collaboration of the local civil protection agency.
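
    The building damage matrix described above amounts to a lookup by building class and inundation depth. The sketch below illustrates the mechanic with invented building classes, depth bins, and damage levels; it does not reproduce the SCHEMA matrix itself.

        import numpy as np

        # Hypothetical damage matrix: rows = building class, columns = depth bin.
        # Depth bins (m): [0-1), [1-3), [3+); entries are damage levels D0 (none) to D3 (heavy).
        depth_edges = [1.0, 3.0]
        damage_matrix = {
            "masonry":             ["D1", "D2", "D3"],
            "reinforced_concrete": ["D0", "D1", "D2"],
            "light_timber":        ["D2", "D3", "D3"],
        }

        def damage_level(building_class, inundation_depth_m):
            col = int(np.digitize(inundation_depth_m, depth_edges))
            return damage_matrix[building_class][col]

        # Example: scenario flood depths applied to surveyed buildings.
        for cls, depth in [("masonry", 0.4), ("reinforced_concrete", 2.2), ("light_timber", 3.5)]:
            print(f"{cls} at {depth:.1f} m -> {damage_level(cls, depth)}")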

  15. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...accidents. 76.85 Section 76.85 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Safety § 76.85 Assessment of accidents. The Corporation shall perform an analysis of potential...

  16. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...accidents. 76.85 Section 76.85 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Safety § 76.85 Assessment of accidents. The Corporation shall perform an analysis of potential...

  17. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...accidents. 76.85 Section 76.85 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Safety § 76.85 Assessment of accidents. The Corporation shall perform an analysis of potential...

  18. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...accidents. 76.85 Section 76.85 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Safety § 76.85 Assessment of accidents. The Corporation shall perform an analysis of potential...

  19. 10 CFR 76.85 - Assessment of accidents.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...accidents. 76.85 Section 76.85 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) CERTIFICATION OF GASEOUS DIFFUSION PLANTS Safety § 76.85 Assessment of accidents. The Corporation shall perform an analysis of potential...

  20. Analysis of offsite dose calculation methodology for a nuclear power reactor

    SciTech Connect

    Moser, D.M.

    1995-12-31

    This technical study reviews the methodology for calculating offsite dose estimates as described in the offsite dose calculation manual (ODCM) for the Pennsylvania Power and Light Susquehanna Steam Electric Station (SSES). An evaluation of the SSES ODCM dose assessment methodology indicates that it conforms with methodology accepted by the US Nuclear Regulatory Commission (NRC). Using 1993 SSES effluent data, dose estimates are calculated according to the SSES ODCM methodology and compared to the reported 1993 dose estimates, which were produced with the SSES ODCM and its associated computer model. The 1993 SSES dose estimates are based on the recommendations of Publication 2 of the International Commission on Radiological Protection (ICRP). SSES dose estimates based on the recommendations of ICRP Publications 26 and 30 reveal the total body estimates to be the most affected.

  1. A methodology for stochastic analysis of share prices as Markov chains with finite states.

    PubMed

    Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey

    2014-01-01

    Price volatilities make stock investments risky, leaving investors in a critical position when decisions must be made under uncertainty. To improve investors' confidence in evaluating exchange markets without resorting to time-series methodology, we specify equity price changes as a stochastic process assumed to possess Markov dependency, with state transition probability matrices following the identified state space (i.e. decrease, stable or increase). We established that the identified states communicate and that the chains are aperiodic and ergodic, thus possessing limiting distributions. We developed a methodology for determining the expected mean return time for stock price increases and also established criteria for improving investment decisions based on the highest transition probabilities, lowest mean return time and highest limiting distributions. We further developed an R algorithm for running the methodology introduced. The established methodology is applied to selected equities from Ghana Stock Exchange weekly trading data. PMID:25520904
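
    The key quantities of the methodology follow directly from the estimated transition matrix. The sketch below, written in Python rather than the authors' R and using an invented weekly state sequence, estimates the transition matrix from a decrease/stable/increase sequence, solves for the limiting distribution, and derives mean return times as the reciprocals of the stationary probabilities.

        import numpy as np

        states = ["decrease", "stable", "increase"]
        idx = {s: k for k, s in enumerate(states)}

        # Hypothetical weekly price-change state sequence (illustrative only).
        seq = ["increase", "stable", "increase", "decrease", "stable", "stable",
               "increase", "increase", "decrease", "stable", "increase", "stable",
               "decrease", "increase", "increase", "stable", "decrease", "stable"]

        # Estimate the transition matrix by counting observed transitions.
        counts = np.zeros((3, 3))
        for a, b in zip(seq[:-1], seq[1:]):
            counts[idx[a], idx[b]] += 1
        P = counts / counts.sum(axis=1, keepdims=True)

        # Limiting (stationary) distribution: left eigenvector of P for eigenvalue 1.
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
        pi = pi / pi.sum()

        # For an ergodic chain, the mean return time to state i is 1 / pi_i.
        for s in states:
            print(f"{s}: pi = {pi[idx[s]]:.3f}, mean return time = {1 / pi[idx[s]]:.1f} weeks")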

  2. Informational database methodology for urban risk analysis.Case study: the historic centre of Bucharest

    NASA Astrophysics Data System (ADS)

    Armas, I.; Dumitrascu, S.

    2009-04-01

    The urban environment often deals with issues concerning the deterioration of the constructed space and the quality of the environmental factors, which in general terms means an unsatisfactory quality of life. Taking into account the complexity of the urban environment and the strong human impact, this setting can be considered the ideal place for a varied range of risks to appear, favoured by external interventions and by the dynamics of the internal changes that occur in the urban system, often unexpectedly. In this context, historic centre areas are even more vulnerable because of the age of the buildings and their socio-cultural value. The present study focuses on the development of a rapid assessment system of urban risks, putting emphasis on earthquakes. The importance of the study is shown by the high vulnerability that defines urban settlements, which can be considered socio-ecological systems characterized by a maximum risk level. In general, cities are highly susceptible areas because of their compactness and elevated degree of land occupancy, the Bucharest municipality being no exception. The street and sewerage networks disorganized the natural system that resulted from the evolution of the lake-river system in the Upper Pleistocene-Holocene, and the intense construction activity represents a pressure that has not been measured and that calls for an interdisciplinary methodological approach. In particular, Bucharest is characterized by seismic risk compounded by explosive urban growth and the advanced state of degradation of its buildings. In this context, the Lipscani sector of the capital's historic centre is an area of maximum seismic vulnerability, a result of its location in the Dâmbovita River meadow, on the brow of the 80 m terrace, and above all of the degradation of buildings that have accumulated the effects of repeated earthquakes. The historic centre of Bucharest has not only a cultural function but is also a very populated area, these being factors that favour a high susceptibility level. In addition, the majority of the buildings are included in the first and second categories of seismic risk, being built between 1875 and 1940, the age of the buildings establishing an increased vulnerability to natural hazards. The methodology was developed through the contribution of three partner universities from Bucharest: the University of Bucharest, the Academy for Economic Studies and the Technical University of Constructions. The method suggested was based on the analysis and processing of digital and statistical spatial information resulting from 1:500 topographical plans, satellite pictures, archives and historical maps used for the identification of the age of the buildings. Also, an important stage was represented by the field investigations, which provided the data used in the assessment of the buildings: year of construction, location and vicinity, height, number of floors, state and function of the building, equipment and construction type. The information collected from the field, together with the data resulting from the digitization of the orthophotoplans, was inserted in ArcGIS in order to compile the database. Furthermore, the team from the Cybernetics Faculty developed a special software package in Visual Studio and SQL Server in order to insert the sheets in GIS so that they could be statistically processed. The final product of the study is a program whose main functions include editing, analysis based on selected factors (individual or grouped), and viewing of building information as maps or 3D visualizations. The strengths of the resulting informational system are its extended range of applicability, short processing time, accessibility, and capacity to support a large amount of information, making it an adequate instrument for the needs of a susceptible population.

  3. A new methodology for fluorescence analysis of composite resins used in anterior direct restorations.

    PubMed

    de Lima, Liliane Motta; Abreu, Jessica Dantas; Cohen-Carneiro, Flavia; Regalado, Diego Ferreira; Pontes, Danielson Guedes

    2015-01-01

    The aim of this study was to use a new methodology to evaluate the fluorescence of composite resins for direct restorations. Microhybrid (group 1, Amelogen; group 2, Opallis; group 3, Filtek Z250) and nanohybrid (group 4, Filtek Z350 XT; group 5, Brilliant NG; group 6, Evolu-X) composite resins were analyzed in this study. A prefabricated matrix was used to prepare 60 specimens of 7.0 × 3.0 mm (n = 10 per group); the composite resin discs were prepared in 2 increments (1.5 mm each) and photocured for 20 seconds. To establish a control group of natural teeth, 10 maxillary central incisor crowns were horizontally sectioned to create 10 discs of dentin and enamel tissue with the same dimensions as the composite resin specimens. The specimens were placed in a box with ultraviolet light, and photographs were taken. Aperture 3.0 software was used to quantify the central portion of the image of each specimen in shades of red (R), green (G), and blue (B) of the RGB color space. The brighter the B shade in the evaluated area of the image, the greater the fluorescence shown by the specimen. One-way analysis of variance revealed significant differences between the groups. The fluorescence achieved in group 1 was statistically similar to that of the control group and significantly different from those of the other groups (Bonferroni test). Groups 3 and 4 had the lowest fluorescence values, which were significantly different from those of the other groups. According to the results of this study, neither the size nor the amount of inorganic particles in the evaluated composite resin materials predicts whether the material will exhibit good fluorescence. PMID:26325645
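
    The RGB quantification step can be sketched generically: crop the central region of a specimen photograph and average its blue channel. The snippet below, using Pillow and NumPy with a hypothetical file name, illustrates the approach; it is not the Aperture 3.0 workflow used in the study.

        import numpy as np
        from PIL import Image

        def mean_blue_of_center(path, crop_fraction=0.3):
            """Average B value (0-255) over the central crop of an RGB image."""
            img = np.asarray(Image.open(path).convert("RGB"), dtype=float)
            h, w, _ = img.shape
            dh, dw = int(h * crop_fraction / 2), int(w * crop_fraction / 2)
            center = img[h // 2 - dh : h // 2 + dh, w // 2 - dw : w // 2 + dw]
            return center[:, :, 2].mean()   # channel order is R, G, B

        # Hypothetical usage: a brighter mean B suggests stronger fluorescence.
        # print(mean_blue_of_center("specimen_group1_01.png"))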

  4. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    PubMed

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted to assure the good quality of waters, soils or sediments and to achieve desirable environmental quality objectives. Therefore, evaluating the reliability of available data is of significant importance when analysing their possible use in the aforementioned processes. A thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws, but at the same time various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD) as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but can easily be tailored to other environmental compartments (soil, air, sediments). PMID:26298253
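
    At its simplest, an MCDA reliability score is a weighted aggregation of criterion scores. The sketch below is a stripped-down stand-in for the published framework: the criteria, weights, scores, and acceptance threshold are invented, and the fuzzy aggregation layer is omitted.

        # Minimal weighted-sum MCDA scoring of one ecotoxicological study record.
        # Criteria, weights, scores, and threshold are hypothetical illustration values.
        criteria_weights = {
            "test_organism_documented": 0.20,
            "endpoint_clearly_defined": 0.25,
            "exposure_verified":        0.30,
            "controls_reported":        0.25,
        }
        # Expert-assigned scores in [0, 1] for one data record.
        record_scores = {
            "test_organism_documented": 1.0,
            "endpoint_clearly_defined": 0.8,
            "exposure_verified":        0.5,
            "controls_reported":        1.0,
        }

        reliability = sum(w * record_scores[c] for c, w in criteria_weights.items())
        label = "reliable" if reliability >= 0.75 else "use with caution"
        print(f"reliability score = {reliability:.2f} -> {label}")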

  5. Candu 6 severe core damage accident consequence analysis for steam generator tube rupture scenario using MAAP4-CANDU V4.0.5A: preliminary results

    SciTech Connect

    Petoukhov, S.M.; Awadh, B.; Mathew, P.M.

    2006-07-01

    This paper describes the preliminary results of the consequence analysis for a generic AECL CANDU 6 station when it undergoes a postulated, low-probability steam generator multiple tube rupture (SGTR) severe accident with assumed unavailability of several critical plant safety systems. The Modular Accident Analysis Program for CANDU (MAAP4-CANDU) code was used for this analysis. The SGTR accident is assumed to begin with the guillotine rupture of 10 steam generator tubes in one steam generator in Primary Heat Transport System (PHTS) loop 1. For the reference case, the following systems were assumed unavailable: moderator and shield cooling, emergency core cooling, crash cool-down, and main and auxiliary feed water. Two additional cases were analyzed, one with the crash cool-down system available, and another with both the crash cool-down and auxiliary feed water systems available. The three scenarios considered in this study show that most of the initial fission product inventory would be retained within the containment by various fission product retention mechanisms. For the case where the crash cool-down system was credited but the auxiliary feed water systems were not, the total mass of volatile fission products released to the environment, including stable and radioactive isotopes, was about four times that of the reference case, because fission products could be released directly from the PHTS to the environment through the Main Steam Safety Valves (MSSVs), bypassing the containment. For the case where both the crash cool-down and auxiliary feed water systems were credited, the volatile fission product release to the environment was insignificant, because the release was substantially mitigated by scrubbing in the water pool on the secondary side of the steam generator (SG). (authors)

  6. Comprehensive default methodology for the analysis of exposures to mixtures of chemicals accidentally released to the atmosphere

    SciTech Connect

    Craig, D.K.; Baskett, R.L.; Powell, T.J.; Davis, J.S.; Dukes, L.L.; Hansen, D.J.; Petrocchi, A.J.; Sutherland, P.J.

    1997-07-01

    Safety analysis of Department of Energy (DOE) facilities requires consideration of potential exposures to mixtures of chemicals released to the atmosphere. Exposure to chemical mixtures may lead to additive, synergistic, or antagonistic health effects. In the past, the consequences of each chemical have been analyzed separately. This approach may not adequately protect the health of persons exposed to mixtures. However, considerable time would be required to evaluate all possible mixtures. The objective of this paper is to present a reasonable default methodology developed by the EFCOG Safety Analysis Working Group Nonradiological Hazardous Material Subgroup (NHMS) for use in safety analysis within the DOE Complex.

  7. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept covers the entire payload training program, from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  8. Evaluation of potential severe accidents during low power and shutdown operations at Surry, Unit 1: Analysis of core damage frequency from internal events during mid-loop operations. Appendix E (Sections E.9-E.16), Volume 2, Part 3B

    SciTech Connect

    Chu, T.L.; Musicki, Z.; Kohut, P.; Yang, J.; Bozoki, G.; Hsu, C.J.; Diamond, D.J.; Wong, S.M.; Bley, D.; Johnson, D.

    1994-06-01

    Traditionally, probabilistic risk assessments (PRA) of severe accidents in nuclear power plants have considered initiating events potentially occurring only during full power operation. Some previous screening analyses that were performed for other modes of operation suggested that risks during those modes were small relative to full power operation. However, more recent studies and operational experience have implied that accidents during low power and shutdown could be significant contributors to risk. Two plants, Surry (pressurized water reactor) and Grand Gulf (boiling water reactor), were selected as the plants to be studied. The objectives of the program are to assess the risks of severe accidents initiated during plant operational states other than full power operation and to compare the estimated core damage frequencies, important accident sequences and other qualitative and quantitative results with those accidents initiated during full power operation as assessed in NUREG-1150. The scope of the program includes that of a level-3 PRA. In phase 2, mid-loop operation was selected as the plant configuration to be analyzed based on the results of the phase 1 study. The objective of the phase 2 study is to perform a detailed analysis of the potential accident scenarios that may occur during mid-loop operation, and compare the results with those of NUREG-1150. The scope of the level-1 study includes plant damage state analysis, and uncertainty analysis. Volume 1 summarizes the results of the study. Internal events analysis is documented in Volume 2. It also contains an appendix that documents the part of the phase 1 study that has to do with POSs other than mid-loop operation. Internal fire and internal flood analyses are documented in Volumes 3 and 4. A separate study on seismic analysis, documented in Volume 5, was performed for the NRC by Future Resources Associates, Inc. Volume 6 documents the accident progression, source terms, and consequence analysis.

  9. Evaluation of potential severe accidents during low power and shutdown operations at Surry, Unit-1: Analysis of core damage frequency from internal events during mid-loop operations. Appendices F-H, Volume 2, Part 4

    SciTech Connect

    Chu, T.L.; Musicki, Z.; Kohut, P.; Yang, J.; Bozoki, G.; Hsu, C.J.; Diamond, D.J.; Bley, D.; Johnson, D.; Holmes, B.

    1994-06-01

    Traditionally, probabilistic risk assessments (PRA) of severe accidents in nuclear power plants have considered initiating events potentially occurring only during full power operation. Some previous screening analyses that were performed for other modes of operation suggested that risks during those modes were small relative to full power operation. However, more recent studies and operational experience have implied that accidents during low power and shutdown could be significant contributors to risk. Two plants, Surry (pressurized water reactor) and Grand Gulf (boiling water reactor), were selected as the plants to be studied. The objectives of the program are to assess the risks of severe accidents initiated during plant operational states other than full power operation and to compare the estimated core damage frequencies, important accident sequences and other qualitative and quantitative results with those accidents initiated during full power operation as assessed in NUREG-1150. The scope of the program includes that of a level-3 PRA. In phase 2, mid-loop operation was selected as the plant configuration to be analyzed based on the results of the phase 1 study. The objective of the phase 2 study is to perform a detailed analysis of the potential accident scenarios that may occur during mid-loop operation, and compare the results with those of NUREG-1150. The scope of the level-1 study includes plant damage state analysis, and uncertainty analysis. Volume 1 summarizes the results of the study. Internal events analysis is documented in Volume 2. It also contains an appendix that documents the part of the phase 1 study that has to do with POSs other than mid-loop operation. Internal fire and internal flood analyses are documented in Volumes 3 and 4. A separate study on seismic analysis, documented in Volume 5, was performed for the NRC by Future Resources Associates, Inc. Volume 6 documents the accident progression, source terms, and consequence analysis.

  10. Assessment of methodologies for analysis of the dungeness B accidental aircraft crash risk.

    SciTech Connect

    LaChance, Jeffrey L.; Hansen, Clifford W.

    2010-09-01

    The Health and Safety Executive (HSE) has requested Sandia National Laboratories (SNL) to review the aircraft crash methodology that is being used for nuclear facilities in the United Kingdom (UK). The scope of the work included a review of one method utilized in the UK for assessing the potential for accidental airplane crashes into nuclear facilities (Task 1) and a comparison of the UK methodology against similar International Atomic Energy Agency (IAEA), United States (US) Department of Energy (DOE), and US Nuclear Regulatory Commission (NRC) methods (Task 2). Based on the conclusions from Tasks 1 and 2, an additional Task 3 would provide an assessment of a site-specific crash frequency for the Dungeness B facility using one of the other methodologies. This report documents the results of Task 2. The comparison of the different methods was performed for the three primary contributors to aircraft crash risk at the Dungeness B site: airfield-related crashes, crashes below airways, and background crashes. The methods and data specified in each methodology were compared for each of these risk contributors, differences in the methodologies were identified, and the importance of these differences was qualitatively and quantitatively assessed. The bases for each of the methods and the data used were considered in this assessment process. A comparison of the treatment of the consequences of the aircraft crashes was not included in this assessment because the frequency of crashes into critical structures is currently low based on the existing Dungeness B assessment. Although the comparison found substantial differences between the UK and the three alternative methodologies (IAEA, NRC, and DOE), this assessment concludes that use of any of these alternative methodologies would not change the conclusions reached for the Dungeness B site. Performance of Task 3 is thus not recommended.
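
    For context, NRC guidance estimates the airway contribution to crash frequency with a formula of the general form F = C × N × A / w, where C is the in-flight crash rate per aircraft-mile, N the annual number of flights along the airway, A the facility's effective area, and w the airway width. The arithmetic sketch below uses invented numbers purely to show the shape of the calculation; none of the values are Dungeness B inputs.

        # Illustrative airway crash-frequency estimate of the general form used in
        # NRC guidance (all numbers below are invented, not Dungeness B values).
        C = 4.0e-10   # in-flight crash rate, crashes per aircraft-mile
        N = 50_000    # flights per year along the airway
        A = 0.01      # effective facility area, square miles
        w = 8.0       # airway width, miles

        F = C * N * A / w    # expected crashes per year onto the facility
        print(f"airway crash frequency ~ {F:.2e} per year")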

  11. Updating outdated predictive accident models.

    PubMed

    Wood, A G; Mountain, L J; Connors, R D; Maher, M J; Ropkins, K

    2013-06-01

    Reliable predictive accident models (PAMs), also referred to as safety performance functions (SPFs), are essential to design and maintain safe road networks; however, ongoing changes in road and vehicle design, coupled with road safety initiatives, mean that these models can quickly become dated. Unfortunately, because the fitting of sophisticated PAMs including a wide range of explanatory variables is not a trivial task, available models tend to be based on data collected many years ago and seem unlikely to give reliable estimates of current accidents. Large, expensive studies to produce new models are likely to be, at best, only a temporary solution. This paper thus seeks to develop a practical and efficient methodology to allow currently available PAMs to be updated to give unbiased estimates of accident frequencies at any point in time. Two principal issues are examined: the extent to which the temporal transferability of predictive accident models varies with model complexity; and the practicality and efficiency of two alternative updating strategies. The models used to illustrate these issues are the suites of models developed for rural dual and single carriageway roads in the UK. These are widely used in several software packages in spite of being based on data collected during the 1980s and early 1990s. It was found that increased model complexity by no means ensures better temporal transferability and that calibration of the models using a scale factor can be a practical alternative to fitting new models. PMID:23510788
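
    The scale-factor calibration mentioned above is simple to illustrate: the outdated model's predictions are multiplied by the ratio of total observed to total predicted accidents in a recent sample. The sketch below uses invented counts, not the UK data.

        import numpy as np

        # Hypothetical recent data for a sample of road sections.
        observed = np.array([3, 5, 1, 0, 4, 2, 6, 1])                    # recent accident counts
        predicted = np.array([4.1, 6.3, 1.8, 0.9, 3.2, 2.5, 7.0, 1.6])  # outdated PAM output

        # Scale factor: ratio of totals (a simple calibration to current conditions).
        c = observed.sum() / predicted.sum()
        calibrated = c * predicted

        print(f"scale factor = {c:.3f}")
        print("calibrated predictions:", np.round(calibrated, 2))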

  12. Launch vehicle accident assessment for Mars Exploration Rover missions

    NASA Technical Reports Server (NTRS)

    Yau, M.; Reinhart, L.; Guarro, S.

    2002-01-01

    This paper presents the methodology used in the launch and space vehicle portion of the nuclear risk assessment for the two Mars Exploration Rover (MER) missions, which includes the assessment of accident scenarios and associated probabilities.

  13. Challenges and methodology for safety analysis of a high-level waste tank with large periodic releases of flammable gas

    SciTech Connect

    Edwards, J.N.; Pasamehmetoglu, K.O.; White, J.R.; Stewart, C.W.

    1994-07-01

    Tank 241-SY-101, located at the Department of Energy Hanford Site, has periodically released up to 10,000 ft³ of flammable gas. These releases have been one of the highest-priority DOE operational safety problems. The gases include hydrogen and ammonia (fuels) and nitrous oxide (oxidizer). There have been many opinions regarding the controlling mechanisms for these releases, but demonstrating an adequate understanding of the problem, selecting a mitigation methodology, and preparing the safety analysis have presented numerous new challenges. The mitigation method selected for the tank was to install a pump that would mix the tank contents and eliminate the sludge layer believed to be responsible for the gas retention and periodic releases. This report describes the principal analysis methodologies used to prepare the safety assessment for the installation and operation of the pump and, because this activity has been completed, the results of pump operation.

  14. 77 FR 31600 - Federal Need Analysis Methodology for the 2013-2014 Award Year: Federal Pell Grant, Federal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-29

    ...The Secretary announces the annual updates to the tables that will be used in the statutory ``Federal Need Analysis Methodology'' to determine a student's expected family contribution (EFC) for award year 2013-2014 for the student financial aid programs authorized under title IV of the Higher Education Act of 1965, as amended (HEA). An EFC is the amount that a student and his or her family may......

  15. A normative price for a manufactured product: The SAMICS methodology. Volume 2: Analysis

    NASA Technical Reports Server (NTRS)

    Chamberlain, R. G.

    1979-01-01

    The Solar Array Manufacturing Industry Costing Standards provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. The methodology and its theoretical background are presented. The model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program is discussed.

  16. An uncertainty analysis of the hydrogen source term for a station blackout accident in Sequoyah using MELCOR 1.8.5

    SciTech Connect

    Gauntt, Randall O.; Bixler, Nathan E.; Wagner, Kenneth Charles

    2014-03-01

    A methodology for using the MELCOR code with the Latin Hypercube Sampling method was developed to estimate uncertainty in various predicted quantities, such as hydrogen generation or release of fission products under severe accident conditions. In this case, the emphasis was on estimating the range of hydrogen sources in station blackout conditions in the Sequoyah Ice Condenser plant, taking into account uncertainties in the modeled physics known to affect hydrogen generation. The method uses user-specified likelihood distributions for uncertain model parameters, which may include uncertainties of a stochastic nature, to produce a collection of code calculations, or realizations, characterizing the range of possible outcomes. Forty MELCOR code realizations of Sequoyah that included 10 uncertain parameters were conducted, producing a range of in-vessel hydrogen quantities. The range of total hydrogen produced was approximately 583 kg ± 131 kg. Sensitivity analyses revealed expected trends with respect to the parameters of greatest importance; however, considerable scatter was observed when results were plotted against any of the uncertain parameters, with no single parameter manifesting a dominant effect on hydrogen generation. It is concluded that, with respect to the physics parameters investigated, further reducing the predicted hydrogen uncertainty would require reducing all physics parameter uncertainties similarly, bearing in mind that some parameters are inherently uncertain within a range. It is suspected that some residual uncertainty associated with modeling complex, coupled, and synergistic phenomena is an inherent aspect of complex systems and cannot be reduced to point value estimates. Probabilistic analyses such as the one demonstrated in this work are important to properly characterize the response of complex systems such as severe accident progression in nuclear power plants.
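
    The Latin Hypercube sampling step that drives such a study can be sketched with SciPy: stratified samples on the unit hypercube are mapped through the inverse CDFs of each parameter's likelihood distribution, and each row defines one code realization. The two parameters and their distributions below are invented stand-ins, not the study's MELCOR modeling parameters.

        import numpy as np
        from scipy.stats import qmc, norm, uniform

        # 40 stratified samples over 2 uncertain parameters, as in an LHS study.
        sampler = qmc.LatinHypercube(d=2, seed=1)
        u = sampler.random(n=40)                      # uniform samples on [0, 1)^2

        # Map through inverse CDFs of (hypothetical) parameter distributions:
        # e.g., a cladding oxidation rate multiplier and a melt release temperature.
        oxidation_mult = norm(loc=1.0, scale=0.2).ppf(u[:, 0])
        release_temp_K = uniform(loc=2300.0, scale=300.0).ppf(u[:, 1])  # 2300-2600 K

        for k, (m, t) in enumerate(zip(oxidation_mult, release_temp_K), start=1):
            # Each row would define the inputs of one code realization.
            print(f"realization {k:02d}: oxidation_mult={m:.3f}, release_temp={t:.0f} K")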

  17. Detection of uranium and chemical state analysis of individual radioactive microparticles emitted from the Fukushima nuclear accident using multiple synchrotron radiation X-ray analyses.

    PubMed

    Abe, Yoshinari; Iizawa, Yushin; Terada, Yasuko; Adachi, Kouji; Igarashi, Yasuhito; Nakai, Izumi

    2014-09-01

    Synchrotron radiation (SR) X-ray microbeam analyses revealed the detailed chemical nature of radioactive aerosol microparticles emitted during the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident, resulting in better understanding of what occurred in the plant during the early stages of the accident. Three spherical microparticles (~2 μm diameter) containing radioactive Cs were found in aerosol samples collected on March 14th and 15th, 2011, in Tsukuba, 172 km southwest of the FDNPP. SR-μ-X-ray fluorescence analysis detected the following 10 heavy elements in all three particles: Fe, Zn, Rb, Zr, Mo, Sn, Sb, Te, Cs, and Ba. In addition, U was found for the first time in two of the particles, further confirmed by U L-edge X-ray absorption near-edge structure (XANES) spectra, implying that U fuel and its fission products were contained in these particles along with radioactive Cs. These results strongly suggest that the FDNPP was damaged sufficiently to emit U fuel and fission products outside the containment vessel as aerosol particles. SR-μ-XANES spectra of the Fe, Zn, Mo, and Sn K-edges for the individual particles revealed that these elements were present in high oxidation states, i.e., Fe(3+), Zn(2+), Mo(6+), and Sn(4+), in the glass matrix, confirmed by SR-μ-X-ray diffraction analysis. These radioactive materials in a glassy state may remain in the environment longer than those emitted as water-soluble radioactive Cs aerosol particles. PMID:25084242

  18. Technology Transfer and Utilization Methodology; Further Analysis of the Linker Concept.

    ERIC Educational Resources Information Center

    Jolly, James A.; Creighton, J. W.

    This study is based on a comparison of data from two independent studies of technology utilization and dissemination methodology that sought to identify the behavior characteristics of "linkers" and "stabilizers" and their relative existence within different groups of technical personnel. The hypothesis for this study is that the distribution of the…

  19. Studying Urban History through Oral History and Q Methodology: A Comparative Analysis.

    ERIC Educational Resources Information Center

    Jimenez, Rebecca S.

    Oral history and Q methodology (a social science technique designed to document objectively and numerically the reactions of individuals to selected issues) were used to investigate urban renewal in Waco, Texas. Nineteen persons directly involved in the city's relocation and rehabilitation projects granted interviews. From these oral histories, 70…

  20. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    ERIC Educational Resources Information Center

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…