These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Linguistic methodology for the analysis of aviation accidents  

NASA Technical Reports Server (NTRS)

A linguistic method for the analysis of small-group discourse was developed, and the use of this method on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or other linguistic concepts such as speech act and topic; it tests hypotheses that support the significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.

Goguen, J. A.; Linde, C.

1983-01-01

2

Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade  

SciTech Connect

The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants,” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

Gregg L. Sharp; R. T. McCracken

2003-06-01

3

Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade  

SciTech Connect

The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants,” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

Sharp, G.L.; McCracken, R.T.

2003-05-13

4

Risk Estimation Methodology for Launch Accidents.  

SciTech Connect

As compact and lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Because modeling the full RPS response deterministically across the dynamic variables is too complex, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and in human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.
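
The likelihood-weighted Monte Carlo scheme described above can be sketched in a few lines. The release-fraction distribution, the consequence scaling, and the accident probability below are illustrative assumptions, not values from the report.

```python
import random

random.seed(0)

ACCIDENT_PROB = 1e-4          # assumed likelihood of this accident scenario

def sample_release_fraction():
    # Stand-in for the stochastic RPS response model: the fraction of
    # hazardous material released varies from trial to trial.
    return random.betavariate(2, 8)

def consequence(release_fraction):
    # Stand-in for the environmental-transport and biological-pathway
    # models: consequence scales with the released fraction.
    return 1000.0 * release_fraction

trials = [consequence(sample_release_fraction()) for _ in range(10_000)]
mean_consequence = sum(trials) / len(trials)
risk = ACCIDENT_PROB * mean_consequence   # likelihood-weighted estimate

print(f"mean consequence given the accident: {mean_consequence:.1f}")
print(f"probabilistic risk estimate: {risk:.4f}")
```

In a real assessment each trial would run the full accident-environment and transport models; here a single beta draw stands in for that chain.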

Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

2014-02-01

5

Applying STAMP in Accident Analysis  

NASA Technical Reports Server (NTRS)

Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

2003-01-01

6

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. Also included are a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types of accidents and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.

1931-01-01

7

ECONOMIC ANALYSIS OF ACCIDENT LAW  

Microsoft Academic Search

Accident law is the body of legal rules governing the ability of victims of harm to sue and to collect payments from those who injured them. This paper contains the chapters on accident law from a general, forthcoming book, Foundations of Economic Analysis of Law (Harvard University Press, 2003). The analysis is first concerned (chapters 2-4) with the influence of

Steven Shavell

2003-01-01

8

A comprehensive methodology for the fitting of predictive accident models  

Microsoft Academic Search

Recent years have seen considerable progress in techniques for establishing relationships between accidents, flows and road or junction geometry. It is becoming increasingly recognized that the technique of generalized linear models (GLMs) offers the most appropriate and soundly-based approach for the analysis of these data. These models have been successfully used in the series of major junction accident studies carried
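
As a rough illustration of the generalized-linear-model approach the abstract refers to, the sketch below fits a Poisson GLM (log link) relating accident counts to traffic flow by iteratively reweighted least squares. The junction data and the single-flow model form are invented for the example, not taken from the study.

```python
import numpy as np

# Hypothetical junction data: daily traffic flow and observed accident
# counts (illustrative numbers only).
flow = np.array([500.0, 1200.0, 3000.0, 4500.0, 8000.0, 12000.0])
accidents = np.array([1.0, 2.0, 4.0, 6.0, 9.0, 14.0])

# Model: log E[accidents] = b0 + b1 * log(flow)  (Poisson GLM, log link)
X = np.column_stack([np.ones_like(flow), np.log(flow)])

# Warm start from ordinary least squares on log(counts + 0.5)
beta = np.linalg.lstsq(X, np.log(accidents + 0.5), rcond=None)[0]

# Iteratively reweighted least squares: the standard GLM fitting loop
for _ in range(25):
    mu = np.exp(X @ beta)                    # current mean predictions
    z = X @ beta + (accidents - mu) / mu     # working response
    W = mu                                   # Poisson working weights
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

pred = np.exp(np.array([1.0, np.log(2000.0)]) @ beta)
print("coefficients:", beta)
print(f"predicted accidents at flow 2000: {pred:.2f}")
```

The Poisson error structure is what makes GLMs preferable to ordinary least squares for count data like accident frequencies.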

Michael J. Maher; Ian Summersgill

1996-01-01

9

Deterministic accident analysis for RBMK  

Microsoft Academic Search

Within the framework of a European Commission sponsored activity, an assessment of the deterministic safety technology of the ‘post-Chernobyl modernized’ Reactor Bolshoy Moshchnosty Kipyashiy (RBMK) has been completed. The accident analysis, limited to the area of Design Basis Accident, constituted the key subject for the study; events not including the primary circuit were not considered, as well as events originated

F. D’Auria; B. Gabaraev; S. Soloviev; O. Novoselsky; A. Moskalev; E. Uspuras; G. M. Galassi; C. Parisi; A. Petrov; V. Radkevich; L. Parafilo; D. Kryuchkov

2008-01-01

10

Accident Tolerant Fuel Analysis  

SciTech Connect

Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. 
These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and evaluate margin recovery strategies.

Curtis Smith; Heather Chichester; Jesse Johns; Melissa Teague; Michael Tonks; Robert Youngblood

2014-09-01

11

ACCIDENT TOLERANT FUEL ANALYSIS  

SciTech Connect

Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. 
These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and evaluate margin recovery strategies.

Smith, Curtis [Idaho National Laboratory]; Chichester, Heather [Idaho National Laboratory]; Johns, Jesse [Texas A&M University]; Teague, Melissa [Idaho National Laboratory]; Tonks, Michael [Idaho National Laboratory]; Youngblood, Robert [Idaho National Laboratory]

2014-09-01

12

Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group  

SciTech Connect

The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.

Brereton, S.; Shinn, J. [Lawrence Livermore National Lab., CA (United States)]; Hesse, D. [Battelle Columbus Labs., OH (United States)]; Kaninich, D. [Westinghouse Savannah River Co., Aiken, SC (United States)]; Lazaro, M. [Argonne National Lab., IL (United States)]; Mubayi, V. [Brookhaven National Lab., Upton, NY (United States)]

1997-08-01

13

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

1929-01-01

14

Analysis of the Chernobyl accident  

SciTech Connect

The unfortunate accident at Chernobyl Unit 4 has indicated the strong feedback effects that coupled core neutronics and thermal-hydraulics have for some reactor designs. To better understand the accident and to provide an independent assessment of the accident scenario given by Soviet scientists, Westinghouse has attempted to model the Chernobyl RBMK reactor using its safety analysis system computer codes. Before the RBMK reactor could be modeled, a coarse, three-dimensional reactor kinetics model (FASTAR), which was under development at Westinghouse, was integrated into the Westinghouse small-break loss-of-coolant accident code, NOTRUMP. The FASTAR neutronic model uses a single neutron energy and six neutron precursor groups and includes reactivity feedback from the graphite moderator, fuel rods, and the coolant in both the fuel assemblies and the control channels. FASTAR was coupled directly to the NOTRUMP thermal-hydraulics calculation to provide the calculated core power history as a function of the reactor coolant conditions. The NOTRUMP code uses a five-equation two-phase flow thermal-hydraulic model with thermal nonequilibrium and a drift flux formulation. The analysis presented does support the Soviet explanation of the Chernobyl accident scenario as a plausible cause and shows the extreme coupling of the core neutronics and hydraulics for this reactor design.

Heck, C.L.; Hochreiter, L.E.; Huang, P.; Stolmar, A.

1987-01-01

15

Analysis of accidents during flashing operations  

E-print Network

In this thesis, the relative impacts of flashing signal operation versus regular signal operation were evaluated in several cities and towns in the State of Texas. Analyses were conducted to determine whether an increase in accidents and accident severity...

Obermeyer, Michael Edward

1993-01-01

16

Accident progression event tree analysis for postulated severe accidents at N Reactor  

SciTech Connect

A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
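
Latin Hypercube sampling, which the study used to propagate phenomenological and systemic uncertainties, can be illustrated with a minimal implementation; the sample size and variable count here are arbitrary.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, rng):
    """Draw one point from each of n_samples equal-probability strata
    per variable, then shuffle the strata independently per variable."""
    # One uniform draw inside each of the n_samples strata on (0, 1)
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])   # decouple the strata across variables
    return u

rng = np.random.default_rng(42)
samples = latin_hypercube(10, 2, rng)
# Each variable has exactly one sample in each decile of (0, 1).
print(np.sort(np.floor(samples * 10).astype(int), axis=0).T)
```

Compared with plain Monte Carlo, the stratification guarantees coverage of the tails of each input distribution even with few samples, which is why it suits expensive accident-progression models.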

Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M. (Sandia National Labs., Albuquerque, NM (USA)); Medford, G.T. (Science Applications International Corp., Albuquerque, NM (USA))

1990-06-01

17

ACCIDENT ANALYSIS AND HAZARD ANALYSIS FOR HUMAN AND ORGANIZATIONAL FACTORS  

E-print Network

ACCIDENT ANALYSIS AND HAZARD ANALYSIS FOR HUMAN AND ORGANIZATIONAL FACTORS by MARGARET V … play in system safety during the development of these systems, accidents will occur. Safe design …

Leveson, Nancy

18

START-UP ACCIDENT ANALYSIS  

Microsoft Academic Search

Analog computer studies are made of two main areas of interest during a start-up accident: the ability of drivedown circuitry to shut down the reactor before reaching the power range, and the self shut-down characteristics of the reactor if the accident progresses into the power range. The results indicate that period drivedown circuitry would have to

MacNaughton

1958-01-01

19

Aircraft Loss-of-Control Accident Analysis  

NASA Technical Reports Server (NTRS)

Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

Belcastro, Christine M.; Foster, John V.

2010-01-01

20

MELCOR analysis of the TMI-2 accident  

SciTech Connect

The MELCOR computer code has been used to analyze the first 174 minutes of the TMI-2 accident. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission for the purpose of analyzing severe accidents in nuclear power plants. Comparison of the code predictions to the available data shows that MELCOR is capable of modeling the key events of the TMI-2 accident and reasonable agreement with the available data is obtained. In particular, the core degradation and hydrogen generation models agree with best-estimate information available for this phase of the accident. While the code uses simplified modeling, all important characteristics of the reactor system and the accident phenomena could be modeled. This exercise demonstrates that MELCOR is applicable to severe accident analysis. 6 refs., 7 figs., 2 tabs.

Boucheron, E.A.; Kelly, J.E.

1988-01-01

21

A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences  

SciTech Connect

This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs.

Budnitz, R.J.; Lambert, H.E.; Apostolakis, G. [and others]

1998-04-01

22

A systems approach to food accident analysis  

E-print Network

Food borne illnesses lead to 3000 deaths per year in the United States. Some industries, such as aviation, have made great strides increasing safety through careful accident analysis leading to changes in industry practices. ...

Helferich, John D

2011-01-01

23

MELCOR analysis of the TMI-2 accident  

SciTech Connect

This paper describes the analysis of the Three Mile Island-2 (TMI-2) standard problem that was performed with MELCOR. The MELCOR computer code is being developed by Sandia National Laboratories for the Nuclear Regulatory Commission for the purpose of analyzing severe accidents in nuclear power plants. The primary role of MELCOR is to provide realistic predictions of severe accident phenomena and the radiological source term. The analysis of the TMI-2 standard problem allowed for comparison of the model predictions in MELCOR to plant data and to the results of more mechanistic analyses. This exercise was, therefore, valuable for verifying and assessing the models in the code. The major trends in the TMI-2 accident are reasonably well predicted with MELCOR, even with its simplified modeling. Comparison of the calculated and measured results is presented and, based on this comparison, conclusions can be drawn concerning the applicability of MELCOR to severe accident analysis. 5 refs., 10 figs., 3 tabs.

Boucheron, E.A.

1990-01-01

24

Upgrading the safety toolkit: Initiatives of the accident analysis subgroup  

SciTech Connect

Since its inception, the Accident Analysis Subgroup (AAS) of the Energy Facility Contractors Group (EFCOG) has been a leading organization promoting development and application of appropriate methodologies for safety analysis of US Department of Energy (DOE) installations. The AAS, one of seven chartered by the EFCOG Safety Analysis Working Group, has performed an oversight function and provided direction to several technical groups. These efforts have been instrumental in the formal evaluation of computer models, improving the pedigree of high-use computer models, and development of the user-friendly Accident Analysis Guidebook (AAG). All of these efforts have strengthened the analytical toolkit for complying with the DOE orders and standards that shape safety analysis reports (SARs) and related documentation. Major support for these objectives has been through DOE/DP-45.

O'Kula, K.R.; Chung, D.Y.

1999-07-01

25

Learning from Accident Analysis: The Dynamics Leading Up to a Rafting Accident.  

ERIC Educational Resources Information Center

Analysis of a case study of a whitewater rafting accident reveals that such accidents tend to result from multiple actions. Many events leading up to such accidents include procedural and process factors, suggesting that hard-skills technical training is an insufficient approach to accident prevention. Contains 26 references. (SAS)

Hovelynck, Johan

1998-01-01

26

An analysis of aircraft accidents involving fires  

NASA Technical Reports Server (NTRS)

All U.S. air carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personal injury judgements as well as settlements received from the CAB are included for reference.

Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

1975-01-01

27

FSAR fire accident analysis for a plutonium facility  

SciTech Connect

The Final Safety Analysis Report (FSAR) for a plutonium facility as required by DOE Orders 5480.23 and 5480.22 has recently been completed and approved. The facility processes and stores radionuclides such as Pu-238, Pu-239, enriched uranium, and to a lesser degree other actinides. This facility produces heat sources. DOE Order 5480.23 and DOE-STD-3009-94 require analysis of different types of accidents (operational accidents such as fires, explosions, spills, criticality events, and natural phenomena such as earthquakes). The accidents that were analyzed quantitatively, or the Evaluation Basis Accidents (EBAs), were selected based on a multi-step screening process that makes extensive use of the Hazards Analysis (HA) performed for the facility. In the HA, specific accident scenarios, with estimated frequency and consequences, were developed for each identified hazard associated with facility operations and activities. Analysis of the EBAs and comparison of their consequences to the evaluation guidelines established the safety envelope for the facility and identified the safety-class structures, systems, and components. This paper discusses the analysis of the fire EBA. This fire accident was analyzed in relatively great detail in the FSAR because its potential off-site consequences are more severe compared to other events. In the following, a description of the scenario is first given, followed by a brief summary of the methodology for calculating the source term. Finally, the author discusses how a key parameter affecting the source term, the leakpath factor, was determined, which is the focus of this paper.

Lam, K.

1997-06-01

28

Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods  

SciTech Connect

The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

2000-07-31

29

Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods  

SciTech Connect

The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

2000-08-01

30

Risk analysis methodology survey  

NASA Technical Reports Server (NTRS)

NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

Batson, Robert G.

1987-01-01

31

Anthropotechnological analysis of industrial accidents in Brazil.  

PubMed Central

The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worthwhile if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

Binder, M. C.; de Almeida, I. M.; Monteau, M.

1999-01-01

32

Single pilot IFR accident data analysis  

NASA Technical Reports Server (NTRS)

The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in the trends and cause-effect relationships reported in the earlier study. Accident counts were tied to measures of flight activity to produce accident rates, which in turn were analyzed for change over time. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of the pilots involved. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions, and their unique attributes were delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

Harris, D. F.; Morrisete, J. A.

1982-01-01

33

On accident analysis and slip-resistance measurement  

Microsoft Academic Search

Four accident analysis models are discussed in relation to a specimen slipping accident. One classification, or ‘single-type’, model, commonly used in accident statistics, lumps all events and agencies from each accident into one ‘cause’ group. This may lead to serious underestimates of particular hazards. In past occupational injury statistics for Sweden, slipping appeared in less than 5% of all

LENNART STRANDBERG

1983-01-01

34

Accident patterns for construction-related workers: a cluster analysis  

NASA Astrophysics Data System (ADS)

The construction industry has been identified as one of the most hazardous industries. The risk to construction-related workers is far greater than that in a manufacturing-based industry. However, steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, a k-means clustering methodology is employed to specify the factors related to different worker types and to identify patterns of industrial occupational accidents. Accident reports for the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that cluster analysis can reveal patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related worker. The findings provide a direction for more effective inspection strategies and injury prevention programs.
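The clustering step described in this abstract can be sketched with a plain Lloyd's-algorithm k-means. A minimal sketch, assuming invented feature vectors (worker age, hours into shift); the records below are illustrative and are not drawn from the Taiwanese case reports:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's algorithm: returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for i, p in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # Update step: move each centroid to the mean of its cluster.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = tuple(sum(xs) / len(members) for xs in zip(*members))
    return centroids, labels

# Hypothetical accident records: (worker age, hours into shift).
records = [(22, 1), (24, 2), (23, 1.5), (55, 9), (58, 10), (53, 8.5)]
cents, labs = kmeans(records, k=2)
```

With two well-separated groups like these, the algorithm recovers one cluster of young workers injured early in a shift and one of older workers injured late, which is the kind of worker-type pattern the study reports.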

Liao, Chia-Wen; Tyan, Yaw-Yauan

2012-01-01

35

A STAMP ANALYSIS OF THE LEX COMAIR 5191 ACCIDENT  

E-print Network

A STAMP ANALYSIS OF THE LEX COMAIR 5191 ACCIDENT. Thesis submitted in partial fulfilment. Paul S. Nelson. "...pressure" (Dekker, 2007, p. 131). A new, holistic, systems-perspective accident model is used for the analysis.

Leveson, Nancy

36

Hanford waste tank bump accident analysis  

SciTech Connect

This report provides a new evaluation of the Hanford tank bump accident analysis (HNF-SD-Wh4-SAR-067 2001). The purpose of the new evaluation is to consider new information and to support new recommendations for final safety controls. This evaluation considers historical data, industrial failure modes, plausible accident scenarios, and system responses. A tank bump is a postulated event in which gases, consisting mostly of water vapor, are suddenly emitted from the waste and cause tank headspace pressurization. A tank bump is distinguished from a gas release event in two respects: First, the physical mechanism for release involves vaporization of locally superheated liquid, and second, gases emitted to the head space are not flammable. For this reason, a tank bump is often called a steam bump. In this report, even though non-condensible gases may be considered in bump models, flammability and combustion of emitted gases are not. The analysis scope is safe storage of waste in its current configuration in single-shell tanks (SSTs) and double-shell tanks (DSTs). The analysis considers physical mechanisms for tank bump to formulate criteria for bump potential, application of the criteria to the tanks, and accident analysis of bump scenarios. The result of consequence analysis is the mass of waste released from tanks for specific scenarios where bumps are credible; conversion to health consequences is performed elsewhere using standard Hanford methods (Cowley et al. 2000). The analysis forms a baseline for future extension to consider waste retrieval.

MALINOVIC, B.

2003-03-21

37

Recent Methodology in Ginseng Analysis  

PubMed Central

As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

2012-01-01

38

Improved dose assessment in a nuclear reactor accident using the old and new ICRP methodologies  

E-print Network

...that the contaminated air was infinite in extent and that the radionuclide concentration was uniform throughout the medium. A tabulation of the dose conversion factors for plume shine is presented in Table 9 for fifty radionuclides and organs of interest... IMPROVED DOSE ASSESSMENT IN A NUCLEAR REACTOR ACCIDENT USING THE OLD AND NEW ICRP METHODOLOGIES. A Thesis by SUK-CHUL YOON, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree...

Yoon, Suk-Chul

1987-01-01

39

Canister Storage Building (CSB) Design Basis Accident Analysis Documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

CROWE, R.D.; PIEPHO, M.G.

2000-03-23

40

Canister Storage Building (CSB) Design Basis Accident Analysis Documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

CROWE, R.D.

1999-09-09

41

Canister storage building design basis accident analysis documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

KOPELIC, S.D.

1999-02-25

42

RAILROAD ACCIDENT RATES FOR USE IN TRANSPORTATION RISK ANALYSIS  

Microsoft Academic Search

Annual safety statistics published by FRA provide train accident counts for various groupings, such as railroad, accident type, cause, track type and class, train length, and speed. However, hazardous materials transportation risk analysis often requires more detailed accident rate statistics for specific combinations of these groupings. The statistics that are presented enable more precise determination of the probability that Class

Robert T Anderson; Christopher P. L. Barkan

2004-01-01

43

Hazmat transport: a methodological framework for the risk analysis of marshalling yards.  

PubMed

A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for assessing expected release frequencies and equivalent release diameters, based on applying HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to an extended case study. The results showed that "non-accident-induced" leaks in marshalling yards contribute substantially to the overall risk associated with these zones. Furthermore, the results confirmed the considerable contribution of these fixed installations to the overall risk associated with "hazmat" transportation. PMID:17418942
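The Fault Tree step in a framework like this reduces to combining basic-event probabilities through AND/OR gates. A minimal sketch, assuming independent basic events; the event names and numbers below are hypothetical and not taken from the study:

```python
def or_gate(probs):
    """Top event occurs if any input event occurs (independent events)."""
    none_occur = 1.0
    for p in probs:
        none_occur *= (1.0 - p)
    return 1.0 - none_occur

def and_gate(probs):
    """Top event occurs only if all input events occur (independent events)."""
    all_occur = 1.0
    for p in probs:
        all_occur *= p
    return all_occur

# Hypothetical basic-event probabilities for a railcar leak fault tree:
valve_seal_fail = 1e-3
gasket_fail = 5e-4
inspection_missed = 1e-2

# A leak requires a hardware failure (seal OR gasket) AND a missed inspection.
hardware_fail = or_gate([valve_seal_fail, gasket_fail])
leak_prob = and_gate([hardware_fail, inspection_missed])
```

Chaining gates this way is how a fault tree turns component-level failure data into the expected release frequencies the abstract refers to; HazOp supplies the structure of the tree, not the arithmetic.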

Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

2007-08-17

44

PERSPECTIVES ON A DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS  

SciTech Connect

Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.

(NOEMAIL), K; Jonathan Lowrie, J; David Thoman (NOEMAIL), D; Austin Keller (NOEMAIL), A

2008-07-30

45

Categorizing accident sequences in the external radiotherapy for risk analysis  

PubMed Central

Purpose This study identifies accident sequences from past accidents in order to support the application of risk analysis to external radiotherapy. Materials and Methods This study reviews 59 accidental cases from two retrospective safety analyses that collected incidents in external radiotherapy extensively. The two accident analysis reports are investigated to identify accident sequences, including initiating events, failures of safety measures, and consequences. This study classifies the accidents by treatment stage and source of error for initiating events, by type of failure in the safety measures, and by type of undesirable consequence and the number of affected patients. The accident sequences are then grouped into categories on the basis of similarity of progression; the cases fall into 14 groups of accident sequence. Results The results indicate that risk analysis needs to pay attention not only to the planning stage but also to the calibration stage, which is performed prior to the main treatment process. They also show that human error is the largest contributor to initiating events as well as to failures of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in calibration. Conclusion This study is expected to provide insights into accident sequences for prospective risk analysis through the review of experience. PMID:23865005

2013-01-01

46

A methodology for generating dynamic accident progression event trees for level-2 PRA  

SciTech Connect

Currently, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time consuming, difficult to reproduce and also can be phenomenologically inconsistent. A software tool (ADAPT) is described for automated APET generation using the concept of dynamic event trees. The tool determines the branching times from a severe accident analysis code based on user specified criteria for branching. It assigns user specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. While the software tool could be applied to any systems analysis code, the MELCOR code is used for this illustration. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a pressurized water reactor. (authors)
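The branch-probability tracking and truncation that the abstract attributes to ADAPT can be illustrated with a toy depth-first tree expansion. A minimal sketch: the branching points, labels, and probabilities below are hypothetical stand-ins, not MELCOR output or the actual ADAPT interface:

```python
from dataclasses import dataclass

@dataclass
class Branch:
    path: tuple   # sequence of branch labels taken so far
    prob: float   # cumulative probability of this scenario

def expand(branch, branchings, cutoff):
    """Depth-first expansion of a dynamic event tree.

    branchings: list of dicts {label: conditional probability}, one per
    branching point. Branches whose cumulative probability falls below
    `cutoff` are pruned, mirroring ADAPT-style truncation that keeps the
    number of scenarios manageable.
    """
    if not branchings:
        return [branch]
    head, rest = branchings[0], branchings[1:]
    leaves = []
    for label, p in head.items():
        child = Branch(branch.path + (label,), branch.prob * p)
        if child.prob >= cutoff:
            leaves.extend(expand(child, rest, cutoff))
    return leaves

# Hypothetical branchings for a station-blackout-like sequence:
# does auxiliary feedwater recover? does the operator depressurize?
tree = [
    {"afw_ok": 0.9, "afw_fail": 0.1},
    {"depress": 0.7, "no_depress": 0.3},
]
leaves = expand(Branch((), 1.0), tree, cutoff=0.05)
```

Of the four possible scenarios, the 0.1 × 0.3 = 0.03 branch falls below the cutoff and is pruned, so three leaves survive; the truncated probability mass (here 3%) is what an analyst would track to confirm the pruning rule is not discarding risk-significant scenarios.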

Hakobyan, A.; Denning, R.; Aldemir, T. [Ohio State Univ., Nuclear Engineering Program, 650 Ackerman Road, Columbus, OH 43202 (United States); Dunagan, S.; Kunsman, D. [Sandia National Laboratory, Albuquerque, NM 87185 (United States)

2006-07-01

47

Study of possibility using LANL PSA-methodology for accident probability RBMK researches  

SciTech Connect

The reactor facility probabilistic safety analysis methodologies used at U.S. LANL and RF NIKIET are considered. The methodologies are compared in order to reveal their similarities and differences and to determine the possibility of using the LANL technique for RBMK-type reactor safety analysis. It was found that at the PSA-1 level the methodologies differ very little. At LANL, the PHA and HAZOP hazard analysis methods are used to specify more completely the list of accounted initiating events, which can also be useful in performing PSA for RBMK. Exchange of information on methods for detecting dependent faults and for considering the impact of human factors on reactor safety is reasonable. It was also considered useful to compare analysis results for test problems or PSA fragments using the various computer programs employed at NIKIET and LANL.

Petrin, S.V.; Yuferev, V.Y.; Zlobin, A.M.

1995-12-31

48

TMI-2 accident: core heat-up analysis  

SciTech Connect

This report summarizes the NSAC study of reactor core thermal conditions during the accident at Three Mile Island, Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high-pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events, plant data, post-accident measurements, and interpretation or indirect use of instrument responses to accident conditions.

Ardron, K.H.; Cain, D.G.

1981-01-01

49

Aircraft accidents: method of analysis  

NASA Technical Reports Server (NTRS)

This report is a revision of NACA-TR-357. It was prepared by the Committee on Aircraft Accidents. The purpose of this report is to provide a basis for the classification and comparison of aircraft accidents, both civil and military.

1937-01-01

50

Analysis of accidents during instrument approaches  

NASA Technical Reports Server (NTRS)

General aviation and air taxi approach-phase accidents occurring under VFR and IFR over the last 25 years were analyzed. The data suggest that the risk during the approach and landing phase of VFR flights is roughly twice (204 percent of) that for similar IFR operations (14.82 vs. 7.27 accidents/100,000 approaches). Alarmingly, the night single-pilot IFR (SPIFR) accident rate is almost 8 times the day IFR rate (35.43 vs. 4.47 accidents/100,000 approaches) and two and a half times the day VFR approach rate (35.43 vs. 14.82 accidents/100,000 approaches). Surprisingly, the overall SPIFR accident rate is not much higher than the dual-pilot IFR (DPIFR) rate (7.27 vs. 6.48 accidents/100,000 approaches). The generally static SPIFR/DPIFR ratio may be explained by the fact that general aviation cockpit technology has changed little over the last 25 years and that IFR operational flight task management training has not kept pace.
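The rates quoted above are simple exposure-normalized ratios, and the comparisons can be checked directly. A minimal sketch in Python; the accident counts and approach totals are illustrative values chosen only to reproduce the abstract's per-100,000 figures, not the study's actual data:

```python
def rate_per_100k(accidents, approaches):
    """Accident rate per 100,000 approaches."""
    return accidents / approaches * 100_000

# Illustrative counts and exposure, chosen to reproduce the quoted rates:
day_vfr     = rate_per_100k(1482, 10_000_000)   # 14.82 per 100,000
day_ifr     = rate_per_100k(447, 10_000_000)    # 4.47 per 100,000
night_spifr = rate_per_100k(3543, 10_000_000)   # 35.43 per 100,000

vfr_vs_ifr = day_vfr / 7.27              # ~2.04: about twice the IFR rate
night_vs_day_ifr = night_spifr / day_ifr # ~7.93: "almost 8 times"
```

The arithmetic confirms the abstract's internal consistency: 14.82/7.27 is about 2.04 (i.e. 204 percent of the IFR rate), and 35.43/4.47 is about 7.9.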

Bennett, C. T.; Schwirzke, M.

1992-01-01

51

NASA's Accident Precursor Analysis Process and the International Space Station  

NASA Technical Reports Server (NTRS)

This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

Groen, Frank; Lutomski, Michael

2010-01-01

52

SIMMER-III : Applications to reactor accident analysis  

Microsoft Academic Search

The SIMMER-III code was originally developed for the safety analysis of core disruptive accidents in sodium-cooled fast breeder reactors. In parallel with its development, a systematic programme of evaluation was successfully performed. Using the new detailed pin modelling, SIMMER-III is now able to describe a complete accident sequence for a sodium-cooled fast reactor. Some tests of the international

T. CADIOU; W. MASCHEK; A. RINEISKI; Forschungszentrum Karlsruhe

53

Rat sperm motility analysis: methodologic considerations  

EPA Science Inventory

The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

54

An analysis of ambulance accidents in Tennessee.  

PubMed

In an effort to improve our program for ambulance crash and related injury prevention, we analyzed 102 consecutive ambulance accidents. Incidents reported included those that resulted in human injury or in which property damage exceeded $200. Multiple logistic regression was used to determine the association between circumstances reported at the time of the accidents and the risk of injury. Twenty-nine accidents contributed to a total of 65 injured victims, with one death. The variable most strongly associated with the probability of an injury-accident was use of a passenger restraint device. Darkness and occurrence at an intersection were variables showing increased risk, but were not statistically significant. The interaction of variables did not have a combined influence on the incidence of injury. The mean delay to hospital care after an accident was 9.4 minutes. Based on our data, we conclude that passenger restraints for both ambulance attendants and passengers should be mandatory, and we suggest that traffic signals be strictly heeded at intersections and speed limits in urban settings be obeyed. PMID:3625947
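The multiple logistic regression used in the study can be sketched in pure Python with batch gradient descent. A minimal sketch, assuming fabricated records (restraint use and darkness as predictors, injury as outcome) that only mimic the direction of the reported association; they are not the Tennessee data:

```python
import math

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Logistic regression fit by batch gradient descent (pure-Python sketch)."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        grad_w, grad_b = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted injury probability
            err = p - yi
            for j in range(d):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

# Fabricated records: features = [restrained, darkness], label = injured.
X = [[1, 0], [1, 1], [1, 0], [0, 0], [0, 1], [0, 1], [1, 1], [0, 0]]
y = [0, 0, 0, 1, 1, 1, 0, 1]
w, b = fit_logistic(X, y)
# w[0] comes out negative: restraint use is associated with lower injury odds,
# matching the direction of the study's strongest finding.
```

In the actual analysis each coefficient's sign and significance would be interpreted as an adjusted association; here the restraint coefficient is negative by construction of the toy data.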

Auerbach, P S; Morris, J A; Phillips, J B; Redlinger, S R; Vaughn, W K

1987-09-18

55

The Analysis of a Friendly Fire Accident using a Systems Model of Accidents* N.G. Leveson, Ph.D.; Massachusetts Institute of Technology; Cambridge, Massachusetts  

E-print Network

The Analysis of a Friendly Fire Accident using a Systems Model of Accidents. N.G. Leveson, Ph.D., Massachusetts Institute of Technology, Cambridge, Massachusetts; University of Victoria, Victoria, Canada. Keywords: accident analysis, accident models. Abstract: In another paper presented at this conference, Leveson describes a new accident model based on systems theory [2]

Leveson, Nancy

56

Systemic accident analysis: examining the gap between research and practice.  

PubMed

The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. PMID:23542136

Underwood, Peter; Waterson, Patrick

2013-06-01

57

Organization Needs Analysis: A New Methodology.  

ERIC Educational Resources Information Center

Describes a rationale, instrument, and methodology to improve the practice of organization needs analysis. The approach involves the author's "career of the organization" exercise for the trainer's use in identifying organization problems and in conceptualizing the nature of needs-analysis problem solving relating to both individual and…

Leach, John J.

1979-01-01

58

Foucault's Analysis of Power's Methodologies  

E-print Network

FOUCAULT'S ANALYSIS OF POWER'S METHODOLOGIES. Gary Alan Scott, Siena College. Part 4 of The Will to Know, where Foucault turns to his most direct treatment of power, contains the often-cited statement, "In political thought... University for this phraseology and for his many helpful comments on an earlier draft of this paper. I would also like to express my appreciation for the thorough and incisive comments offered by Scott M. Christensen, of the University of California...

Scott, Gary Alan

59

Use of Stepwise Methodology in Discriminant Analysis.  

ERIC Educational Resources Information Center

The use of stepwise methodologies has been sharply criticized by several researchers, yet their popularity, especially in educational and psychological research, continues unabated. Stepwise methods have been considered particularly well suited for use in regression and discriminant analyses, but their use in discriminant analysis (predictive…

Whitaker, Jean S.

60

TMI-2 accident: core heat-up analysis. A supplement  

SciTech Connect

Following the accident at Three Mile Island, Unit 2, NSAC mounted an analytical program to develop a chronology of what happened in the core during the period when damage occurred. The central effort and key results of this analytical work are described in NSAC-24, TMI-2 Accident Core Heatup Analysis. Several supporting studies contributed to this central effort. These are presented in this supplement. Part I describes a single pin analysis that was made using the FRAP-T5 code. This analysis provided input to the core damage assessment central effort. Part II describes a thermal hydraulic analysis of the core during the accident using the BOIL 2 code. The BOIL 2 analysis of TMI-2 core was performed to provide an independent check on the results of the main core damage assessment effort. Part III provides the as-built design and material characteristics of the TMI-2 core. This supplement will be of greatest interest to analysts who are studying the TMI-2 accident or are investigating how other cores would behave during a boil-down event.

Not Available

1981-06-01

61

Analysis of the 1957-1958 Soviet Nuclear Accident  

Microsoft Academic Search

The presence of an extensive environmental contamination zone in Cheliabinsk Province of the Soviet Union, associated with an accident in the winter of 1957 to 1958 involving the atmospheric release of fission wastes, appears to have been confirmed, primarily by an analysis of the Soviet radioecology literature. The contamination zone is estimated to contain 105 to 106 curies of strontium-90

John R. Trabalka; L. Dean Eyman; Stanley I. Auerbach

1980-01-01

62

Core Disruptive Accident Analysis using ASTERIA-FBR  

NASA Astrophysics Data System (ADS)

JNES is developing a core disruptive accident analysis code, ASTERIA-FBR, which tightly couples thermal-hydraulics and neutronics to simulate core behavior during core disruptive accidents of fast breeder reactors (FBRs). ASTERIA-FBR consists of the three-dimensional thermal-hydraulics calculation module CONCORD, the fuel pin behavior calculation module FEMAXI-FBR, and the space-time neutronics module Dynamic-GMVP or PARTISN/RKIN. This paper describes a comparison between the characteristics of GMVP and PARTISN and summarizes the challenging issues in applying Dynamic-GMVP to calculations of an unprotected loss-of-flow (ULOF) event, a typical initiator of core disruptive accidents in FBRs. The statistical error included in the calculation results may affect the predicted super-prompt criticality during a ULOF event and thus the amount of released energy.

Ishizu, Tomoko; Endo, Hiroshi; Yamamoto, Toshihisa; Tatewaki, Isao

2014-06-01

63

Three dimensional effects in analysis of PWR steam line break accident  

E-print Network

A steam line break accident is one of the possible severe abnormal transients in a pressurized water reactor. It is required to present an analysis of a steam line break accident in the Final Safety Analysis Report (FSAR) ...

Tsai, Chon-Kwo

64

The role of safety analysis in accident prevention.  

PubMed

The need for safety analysis has grown in the nuclear industry, civil and military aviation, and space technology, where the potential for accidents with far-reaching consequences for employees, the public, and the environment is most apparent. The use of safety analysis has since spread widely to other industrial branches. General systems theory, accident theories, and scientific management are domains that have influenced the development of safety analysis. These relations are briefly presented, and the common methods employed in safety analysis are described and structured according to the aim of the search and the search strategy. A framework for evaluating the coverage of the search procedures employed in different methods of safety analysis is presented. The framework is then used in a heuristic and in an empirical evaluation of hazard and operability study (HAZOP), work safety analysis (WSA), action error analysis (AEA), and management oversight and risk tree (MORT). Finally, some recommendations on the use of safety analysis for preventing accidents are presented. PMID:3337767

Suokas, J

1988-02-01

65

Cold Vacuum Drying facility design basis accident analysis documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

CROWE, R.D.

2000-08-08

66

Mass Spectrometry Methodology in Lipid Analysis  

PubMed Central

Lipidomics is an emerging field in which the structures, functions and dynamic changes of lipids in cells, tissues or body fluids are investigated. Because of the vital roles of lipids in human physiological and pathological processes, lipidomics is attracting more and more attention. However, because of the diversity and complexity of lipids, lipid analysis remains full of challenges. The recent development of methods for lipid extraction and analysis, combined with bioinformatics technology, has greatly advanced the study of lipidomics. Among these methods, mass spectrometry (MS) is the most important technology for lipid analysis. In this review, MS-based methodology for lipid analysis is introduced. It is believed that, along with the rapid development of MS and its further application to lipid analysis, more functional lipids will be identified as biomarkers and therapeutic targets and used in studying the mechanisms of disease. PMID:24921707

Li, Lin; Han, Juanjuan; Wang, Zhenpeng; Liu, Jian’an; Wei, Jinchao; Xiong, Shaoxiang; Zhao, Zhenwen

2014-01-01

67

Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident  

NASA Technical Reports Server (NTRS)

Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined the following: the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep, and that the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as in the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; also, the captain testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data supports the hypothesis that fatigue was a factor that affected crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that the fatigue had an impact on specific actions involved in the occurrence of the accident.

Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

1994-01-01

68

UNE METHODE D'ESTIMATION DE LA PROBABILITE DES ACCIDENTS MAJEURS DE BARRAGES : LA METHODE DU NOEUD PAPILLON  

E-print Network

A methodology to estimate the probability of major dam accidents: a bow-tie approach. INERIS assesses the probability of major accidents through a methodology based on the analysis …

Paris-Sud XI, Université de

69

Net Environmental Benefit Analysis: A New Assessment Methodology  

E-print Network

Net Environmental Benefit Analysis: A New Assessment Methodology, R. A. Efroymson, U.S. Department of Energy, Dec-05. Termed "Net Environmental Benefit Analysis" or NEBA, this methodology compares and ranks net …

70

Calculation notes for surface leak resulting in pool, TWRS FSAR accident analysis  

SciTech Connect

This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Surface Leaks Resulting in Pool.

Hall, B.W.

1996-09-25

71

Analysis of PWR RCS Injection Strategy During Severe Accident  

SciTech Connect

Reactor coolant system (RCS) injection is an important strategy for severe accident management of a pressurized water reactor (PWR) system. Maanshan is a typical Westinghouse PWR nuclear power plant (NPP) with a large, dry containment. The severe accident management guideline (SAMG) of Maanshan NPP is developed based on the Westinghouse Owners Group (WOG) SAMG. The purpose of this work is to analyze the RCS injection strategy of a PWR system in an overheated core condition. Power is assumed to be recovered as the vessel water level drops to the bottom of the active fuel. The Modular Accident Analysis Program version 4.0.4 (MAAP4) code is chosen as the analysis tool. A postulated station blackout sequence for Maanshan NPP is cited as a reference case for this analysis. Hot leg creep rupture, which is not desired, occurs during the mitigation action with immediate injection after power recovery according to the WOG SAMG. This phenomenon was not considered while developing the WOG SAMG. Two other RCS injection methods are analyzed by using MAAP4, and the RCS injection strategy is modified in the Maanshan SAMG. These results can be applied to typical PWR NPPs.

Wang, S.-J. [Institute of Nuclear Energy Research, Taiwan (China); Chiang, K.-S. [Institute of Nuclear Energy Research, Taiwan (China); Chiang, S.-C. [Taiwan Power Company, Taiwan (China)

2004-05-15

72

Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology  

NASA Astrophysics Data System (ADS)

With advancement in technology, new and sophisticated models of vehicles are available, and their numbers are increasing day by day. A traffic accident has multi-faceted characteristics associated with it. In India, 93% of crashes occur due to human-induced factors (wholly or partly). For proper traffic accident analysis, GIS technology has become an indispensable tool. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote the location, type, and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, headquarters of Ajmer district, Rajasthan, has been selected as the study area. According to police records, 1,531 accidents occurred during 2009-2013. The maximum number of accidents occurred in 2009 and the maximum number of deaths in 2013. Cars, jeeps, autos, pickups, and tempos are mostly responsible for accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. While road safety is a critical issue, it is still handled in an ad hoc manner. This study demonstrates the application of GIS for developing an efficient database on road accidents, taking Ajmer city as a case study. If this type of database is developed for other cities, a proper analysis of accidents can be undertaken and suitable management strategies for traffic regulation can be successfully proposed.

Bhalla, P.; Tripathi, S.; Palria, S.

2014-12-01

73

Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.  

PubMed

Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method, based on the concepts of tasks and accident mechanisms, for an initial risk assessment that takes into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. Using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified using the variables included in the European Statistics on Accidents at Work methodology. The main results are that the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low, and that the only accident mechanisms with medium risk are lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling in tasks involving movement. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended. PMID:25179119
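The semi-quantitative likelihood/severity evaluation this abstract describes can be sketched as a small risk matrix. This is an illustration only: the category cut-offs, the multiplicative score, and both example proportions below are assumptions, not the paper's actual scheme.

```python
# Hedged sketch of a semi-quantitative risk estimate: likelihood comes from
# an accident mechanism's share of a task's reported accidents, severity from
# its share of serious outcomes. All cut-offs here are invented for illustration.

def category(share, cuts=(0.01, 0.10)):
    """Map a proportion to an ordinal category 1 (low) .. 3 (high)."""
    low, high = cuts
    return 1 if share < low else (2 if share < high else 3)

def risk(freq_share, severe_share):
    """Combine likelihood and severity categories into a risk level."""
    score = category(freq_share) * category(severe_share)
    return "low" if score <= 2 else ("medium" if score <= 4 else "high")

# A mechanism causing 5% of a task's accidents, 5% of them severe:
level = risk(0.05, 0.05)  # -> "medium"
```

The multiplicative score is one common convention for risk matrices; an actual assessment would calibrate the cut-offs against the observed accident distribution.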

Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

2014-09-01

74

NASA Accident Precursor Analysis Handbook, Version 1.0  

NASA Technical Reports Server (NTRS)

Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur." At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators." These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take the steps necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner.
The APA process described in this handbook provides a systematic means of analyzing candidate accident precursors by evaluating anomaly occurrences for their system safety implications and, through both analytical and deliberative methods used to project to other circumstances, identifying those that portend more serious consequences to come if effective corrective action is not taken. APA builds upon existing safety analysis processes currently in practice within NASA, leveraging their results to provide an improved understanding of overall system risk. As such, APA represents an important dimension of safety evaluation; as operational experience is acquired, precursor information is generated so that it can be fed back into system safety analyses to risk-inform safety improvements. Importantly, APA utilizes anomaly data to predict risk, whereas standard reliability and PRA approaches utilize failure data, which are often limited and rare.

Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

2011-01-01

75

Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis  

SciTech Connect

The severe accident at the Fukushima Daiichi nuclear plants illustrates the need for continuous improvements through developing and implementing technologies that contribute to safe, reliable, and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy-UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front end of the fuel cycle, on reactor operation, and on the back end of the fuel cycle are succinctly described, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and may be updated as the designs converge on their respective final versions.

Gilles Youinou; R. Sonat Sen

2013-09-01

76

Comprehensive Analysis of Two Downburst-Related Aircraft Accidents  

NASA Technical Reports Server (NTRS)

Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components F(sub 1) and F(sub 2), representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F(sub 1) was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy from both an analytical and pilot's standpoint was to hold constant nose-up pitch attitude while operating at maximum engine thrust.
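The F-factor decomposition described in this abstract can be sketched as follows, assuming the conventional wind-shear hazard definition (F1 from the along-track wind gradient, F2 from the downdraft-to-airspeed ratio). Sign conventions vary across the literature, and the numbers below are illustrative, not data from the DFW or Charlotte accidents.

```python
# Illustrative sketch (not the paper's code): the wind-shear hazard F-factor
# split into its two components, F1 (horizontal wind gradient) and F2
# (vertical wind velocity), assuming the common convention F = Wx_dot/g + w/V.

G = 9.81  # gravitational acceleration, m/s^2

def f_factor(dWx_dt, w_vertical, airspeed):
    """Total F-factor.

    dWx_dt     : rate of change of horizontal wind along the flight path (m/s^2)
    w_vertical : vertical wind component, positive downward (m/s)
    airspeed   : true airspeed (m/s)
    """
    f1 = dWx_dt / G               # horizontal wind-gradient component
    f2 = w_vertical / airspeed    # downdraft component
    return f1 + f2

# A tailwind increasing at 0.5 m/s^2 plus a 6 m/s downdraft at 75 m/s airspeed:
f = f_factor(0.5, 6.0, 75.0)
```

Positive F degrades climb performance; in both accident cases the abstract notes the gradient term F1 dominated.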

Shen, J.; Parks, E. K.; Bach, R. E.

1996-01-01

77

A general methodology for population analysis  

NASA Astrophysics Data System (ADS)

For a given population with N the current and M the maximum number of entities, modeled by a Birth-Death Process (BDP) with size M+1, we introduce a utilization parameter (the ratio of the primary birth and death rates in that BDP), which physically determines the (equilibrium) macrostates of the population, and an information parameter, which has an interpretation as population information stiffness. The BDP modeling the population is in state n, n=0,1,…,M, if N=n. With these two key metrics, applying the continuity law, the equilibrium balance equations for the probability distribution pn, n=0,1,…,M, of the quantity N (pn=Prob{N=n} in equilibrium), and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; by definition, population entropy is the uncertainty related to the population. In this approach, and this is its essential contribution, the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information elastic, antielastic, or inelastic regime. In an information-linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology. Notably, if one instead supposes a population of infinite size, most of the key quantities and results that emerge in this methodology for populations of finite size vanish.
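As a hedged illustration of the equilibrium machinery this abstract refers to, the sketch below computes the equilibrium distribution and entropy of a finite birth-death process with constant birth and death rates, so the utilization parameter is the ratio rho of the two rates. The paper's information-spectrum quantities are not reproduced here; this is only the textbook equilibrium solution.

```python
import math

# Hedged sketch, not the authors' code: for a birth-death process on states
# 0..M with constant birth rate lam and death rate mu, detailed balance gives
# p_n proportional to rho**n with rho = lam/mu (the utilization parameter).

def equilibrium(rho, M):
    """Equilibrium distribution p_0..p_M of the finite BDP."""
    weights = [rho ** n for n in range(M + 1)]
    Z = sum(weights)                      # normalizing constant
    return [w / Z for w in weights]

def entropy(p):
    """Population entropy: the mean of -ln p_n over the distribution (nats)."""
    return -sum(pn * math.log(pn) for pn in p if pn > 0)

p = equilibrium(0.5, 10)   # utilization 0.5, maximum population 10
H = entropy(p)
```

At rho = 1 the distribution is uniform and the entropy reaches its maximum, ln(M+1), consistent with entropy as a measure of population uncertainty.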

Lazov, Petar; Lazov, Igor

2014-12-01

78

A DISCIPLINED APPROACH TO ACCIDENT ANALYSIS DEVELOPMENT AND CONTROL SELECTION  

SciTech Connect

The development and use of a Safety Input Review Committee (SIRC) process promotes consistent and disciplined Accident Analysis (AA) development to ensure that it accurately reflects facility design and operation; and that the credited controls are effective and implementable. Lessons learned from past efforts were reviewed and factored into the development of this new process. The implementation of the SIRC process has eliminated many of the problems previously encountered during Safety Basis (SB) document development. This process has been subsequently adopted for use by several Savannah River Site (SRS) facilities with similar results and expanded to support other analysis activities.

Ortner, T; Mukesh Gupta, M

2007-04-13

79

Analysis of Three Mile Island-Unit 2 accident  

SciTech Connect

The Nuclear Safety Analysis Center (NSAC) of the Electric Power Research Institute has analyzed the Three Mile Island-2 accident. Early results of this analysis were a brief narrative summary, issued in mid-May 1979, and an initial version of this report, issued later in 1979 as noted in the Foreword. The present report is a revised version of the 1979 report, containing summaries, a highly detailed sequence of events, a comparison of that sequence of events with those from other sources, 25 appendices, references, and a list of abbreviations and acronyms. A matrix of equipment and system actions is included as a folded insert.

Not Available

1980-03-01

80

An Integrated Accident & Consequence Analysis Approach for Accidental Releases through Multiple Leak Paths  

SciTech Connect

This paper presents a consequence analysis for a postulated fire accident in a building containing plutonium, when the resulting outside release is partly through the ventilation/filtration system and partly through other pathways such as building access doorways. When analyzing an accident scenario involving the release of radioactive powders inside a building, various pathways for the release to the outside environment can exist. This study is presented to guide the analyst on how the multiple building leak path factors (combinations of filtered and unfiltered releases) can be evaluated in an integrated manner, starting with the source term calculation and proceeding through the receptor consequence determination. The analysis is performed in a two-step process. The first step is to calculate the leak path factor, which represents the fraction of respirable radioactive powder made airborne that leaves the building through the various pathways. The computer code of choice for this determination is MELCOR. The second step is to model the transport and dispersion of powder material released to the atmosphere and to estimate the resulting dose received by the downwind receptors of interest. The MACCS computer code is chosen for this part of the analysis. This work can be used as a model for performing analyses of similar systems where releases can propagate to the outside environment via filtered and unfiltered pathways. The methodology provides guidance to analysts, outlining the essential steps needed to perform a sound and defensible consequence analysis.
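The two-step source-term and dose calculation described above follows the general shape of the standard DOE five-factor formula. The sketch below is a minimal illustration with invented numbers; it is not code or data from the analysis, which used MELCOR for the leak path factor and MACCS for dispersion and dose.

```python
# Hedged sketch of the calculation chain: five-factor source term, then an
# inhalation dose at the receptor. All numeric values are illustrative only.

def source_term(mar, dr, arf, rf, lpf):
    """Respirable release (g): material-at-risk x damage ratio x airborne
    release fraction x respirable fraction x leak path factor."""
    return mar * dr * arf * rf * lpf

def inhalation_dose(st, chi_q, br, dcf):
    """Receptor dose (rem): source term (g) x atmospheric dispersion factor
    chi/Q (s/m^3) x breathing rate (m^3/s) x dose conversion factor (rem/g)."""
    return st * chi_q * br * dcf

# Multiple leak paths combine by summing their source terms: a filtered path
# (small LPF through HEPA filtration) plus an unfiltered path (doorways).
st_filtered = source_term(100.0, 1.0, 1e-3, 0.1, 1e-5)
st_unfiltered = source_term(100.0, 1.0, 1e-3, 0.1, 0.1)
dose = inhalation_dose(st_filtered + st_unfiltered, 1e-4, 3.3e-4, 1e6)
```

The key point the abstract makes is the integrated treatment: both pathways feed one combined source term before the dispersion and dose step.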

POLIZZI, LM

2004-04-28

81

Accident Analysis and Prevention, 2012 (49), pp 73-77 www.elsevier.com/locate/aap  

E-print Network

Accident Analysis and Prevention, 2012 (49), pp. 73-77, www.elsevier.com/locate/aap, doi:10.1016/j.aap.2011.07.013. Motorcyclists' speed and "looked-but-failed-to-see" accidents, Nicolas Clabaux, Thierry … of accidents in which a non-priority road user failed to give way to an approaching motorcyclist without seeing …

Paris-Sud XI, Université de

82

Human errors reliability analysis in coal mine accidents based on Gray Relational Theory  

Microsoft Academic Search

Human error is one of the main causes of safety accidents in coal mines, so these accidents can be prevented and reduced by analyzing the factors that contribute to human error. This paper makes a detailed analysis of the relevant factors that cause human errors by applying Gray Relational Theory to coal mine accidents. Based upon this …

Jianyi Lan; Meiying Qiao

2010-01-01

83

Predicting System Accidents with Model Analysis During Hybrid Simulation  

NASA Technical Reports Server (NTRS)

Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability of a system's design, operating procedures, and control software to system accidents. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

Malin, Jane T.; Fleming, Land D.; Throop, David R.

2002-01-01

84

N. Bourdet, C. Deck, T. Serre, C. Perrin, M. Llari, R. Willinger, Methodology for a global bicycle real world accidents reconstruction, International Crashworthiness Conference, July 18-20, 2012, Politecnico Milano,  

E-print Network

Methodology for a global bicycle real world accidents reconstruction, International Crashworthiness Conference, July 18-20, 2012, Politecnico di Milano, 2012-077. … are available concerning the head impact loading in case of real accidents. Therefore, the objective …

Boyer, Edmond

85

Cross-database analysis to identify relationships between aircraft accidents and incidents  

NASA Astrophysics Data System (ADS)

Air transportation systems are designed to ensure that aircraft accidents are rare events. To minimize these accidents, factors causing or contributing to accidents must be understood and prevented. Despite many efforts by the aviation safety community to reduce the accidents, accident rates have been stable for decades. One explanation could be that direct and obvious causes that previously occurred with a relatively high frequency have already been addressed over the past few decades. What remains is a much more difficult challenge---identifying less obvious causes, combinations of factors that, when occurring together, may potentially lead to an accident. Contributions of this research to the aviation safety community are two-fold: (1) The analyses conducted in this research, identified significant accident factors. Detection and prevention of these factors, could prevent potential future accidents. The holistic study made it possible to compare the factors in a common framework. The identified factors were compared and ranked in terms of their likelihood of being involved in accidents. Corrective actions by the FAA Aviation Safety Oversight (ASO), air carrier safety offices, and the aviation safety community in general, could target the high-ranked factors first. The aviation safety community can also use the identified factors as a benchmark to measure and compare safety levels in different periods and/or at different regions. (2) The methodology established in this study, can be used by researchers in future studies. By applying this methodology to the safety data, areas prone to future accidents can be detected and addressed. Air carriers can apply this methodology to analyze their proprietary data and find detailed safety factors specific to their operations. The Factor Support Ratio metric introduced in this research, can be used to measure and compare different safety factors. (Abstract shortened by UMI.)

Nazeri, Zohreh

86

LOSS OF COOLANT ACCIDENT AND LOSS OF FLOW ACCIDENT ANALYSIS OF THE ARIES-AT DESIGN E. A. Mogahed, L. El-Guebaly, A. Abdou, P. Wilson, D. Henderson and the ARIES Team  

E-print Network

A loss of coolant accident (LOCA) and loss of flow accident (LOFA) analysis is performed for ARIES-AT, an advanced fusion … of steel in the reactor is about 600-700 °C after about 4 days from the onset of the accident

California at San Diego, University of

87

Statistical analysis of sudden chemical leakage accidents reported in China between 2006 and 2011.  

PubMed

According to data from authoritative sources, 1,400 sudden leakage accidents that occurred in China from 2006 to 2011 were investigated, of which 666 accidents, with little or no damage, were used to abstract statistical characteristics. The research results were as follows: (1) Time fluctuation: the yearly number of sudden leakage accidents decreased from 2006 to 2010 and increased slightly in 2011. Sudden leakage accidents occur mainly in summer, and more than half of the accidents occur from May to September. (2) Regional distribution: the accidents are highly concentrated in the coastal area, and accidents arise more easily from small and medium-sized enterprises than from larger ones. (3) Pollutants: hazardous chemicals account for up to 95% of sudden leakage accidents. (4) Steps: transportation represents almost half of the accidents, followed by production, usage, storage, and discard. (5) Pollution and casualties: such accidents readily cause environmental pollution and casualties. (6) Causes: more than half of the cases were caused by human factors, followed by management problems and equipment failure. However, sudden chemical leakage may also be caused by high temperature, rain, wet roads, and terrain. (7) Principal component analysis: five factors are extracted by the principal component analysis, including pollution, casualties, regional distribution, steps, and month. From this analysis of the accidents, their characteristics, causes, and damages can be investigated, and advice for prevention and rescue can be derived. PMID:24407779

Li, Yang; Ping, Hua; Ma, Zhi-Hong; Pan, Li-Gang

2014-04-01

88

Offsite radiological consequence analysis for the bounding aircraft crash accident  

SciTech Connect

The purpose of this calculation note is to quantitatively analyze a bounding aircraft crash accident for comparison to the DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', Appendix A, Evaluation Guideline of 25 rem. The potential of an aircraft impacting a facility was evaluated using the approach given in DOE-STD-3014-96, ''Accident Analysis for Aircraft Crash into Hazardous Facilities''. The following aircraft crash frequencies were determined for the Tank Farms in RPP-11736, ''Assessment Of Aircraft Crash Frequency For The Hanford Site 200 Area Tank Farms'': (1) The total aircraft crash frequency is ''extremely unlikely.'' (2) The general aviation crash frequency is ''extremely unlikely.'' (3) The helicopter crash frequency is ''beyond extremely unlikely.'' (4) For the Hanford Site 200 Areas, the crash frequency for any other aircraft type, commercial or military, into each above ground facility or any type of underground facility is ''beyond extremely unlikely.'' As the potential of an aircraft crash into the 200 Area tank farms is more frequent than ''beyond extremely unlikely,'' a consequence analysis of the aircraft crash is required.
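The likelihood categories quoted above can be illustrated by binning an annual crash frequency. The cut-offs below are the per-year values commonly quoted in DOE safety analysis practice; treat them as assumptions to be confirmed against the governing standard rather than as values from this calculation note.

```python
# Hedged sketch: map an estimated event frequency (per year) to the
# qualitative likelihood bins used in DOE documented safety analyses.
# Cut-offs are the commonly quoted ones, assumed here for illustration.

def likelihood_bin(freq_per_year):
    if freq_per_year > 1e-2:
        return "anticipated"
    if freq_per_year > 1e-4:
        return "unlikely"
    if freq_per_year > 1e-6:
        return "extremely unlikely"
    return "beyond extremely unlikely"

# A crash frequency estimate of 1e-5 per year falls in "extremely unlikely",
# which (per the note above) is frequent enough to require consequence analysis.
bin_name = likelihood_bin(1e-5)
```

The note's screening logic follows directly: only events binned "beyond extremely unlikely" may be excluded from consequence analysis.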

OBERG, B.D.

2003-03-22

89

Negative Binomial Analysis of Intersection-Accident Frequencies  

Microsoft Academic Search

Traffic accidents at urban intersections result in a huge cost to society in terms of death, injury, lost productivity, and property damage. Unfortunately, the elements that affect the frequency of intersection accidents are not well understood and, as a result, it is difficult to predict the effectiveness of specific intersection improvements aimed at reducing accident frequency. Using seven-yr …
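A negative binomial model is typically preferred over a Poisson model for accident counts because intersection data are overdispersed (the variance exceeds the mean). Below is a minimal sketch of the distribution in the common mean/overdispersion parameterization; this is an assumed generic form, not the authors' exact model specification.

```python
import math

# Illustrative sketch: the negative binomial pmf parameterized by mean mu and
# overdispersion alpha, so that variance = mu + alpha * mu**2. At alpha -> 0
# it collapses to the Poisson distribution.

def neg_binomial_pmf(y, mu, alpha):
    """Probability of observing y accidents given mean mu, overdispersion alpha."""
    r = 1.0 / alpha                 # shape parameter
    p = r / (r + mu)                # "success" probability
    coef = math.gamma(y + r) / (math.gamma(r) * math.factorial(y))
    return coef * (p ** r) * ((1 - p) ** y)

# Probability of 3 accidents at an intersection whose model predicts a mean
# of 2 per year with overdispersion alpha = 0.5:
prob = neg_binomial_pmf(3, 2.0, 0.5)  # -> 0.125
```

In a regression setting, mu would itself be modeled as a log-linear function of intersection characteristics (geometry, traffic volume, signalization), with alpha estimated from the data.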

Mark Poch; Fred Mannering

1996-01-01

90

Geographical information systems aided traffic accident analysis system case study: city of Afyonkarahisar.  

PubMed

Geographical Information System (GIS) technology has been a popular tool for visualization of accident data and analysis of hot spots on highways, and many traffic agencies use GIS for accident analysis. Accident analysis studies aim at the identification of high-rate accident locations and safety-deficient areas on the highways, so that traffic officials can implement precautionary measures and provisions for traffic safety. Since accident reports in Turkey are prepared in textual format, analyzing accident results is difficult. In our study, we developed a system that transforms these textual data into tabular form; the tabular data were then georeferenced onto the highways. The hot spots on the highways within the Afyonkarahisar administrative border were then explored and determined with two different methods, kernel density analysis and repeatability analysis, and accident conditions at these hot spots were examined. We found that the hot spots determined with the two methods reflect genuinely problematic places such as crossroads, junction points, etc. Many previous studies introduced GIS only as a visualization tool for accident locations; the importance of this study lies in using GIS as a management system for accident analysis and for the determination of hot spots in Turkey with statistical analysis methods. PMID:18215546
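The kernel density step can be sketched in a few lines. This is a generic Gaussian-kernel illustration with made-up coordinates and bandwidth, not the study's actual GIS workflow or data.

```python
import math

# Minimal sketch of kernel density estimation over accident locations, the
# technique used above to surface hot spots. Points and bandwidth are invented.

def kernel_density(x, y, accidents, bandwidth):
    """Gaussian kernel density at (x, y) from a list of accident points."""
    h2 = bandwidth ** 2
    total = 0.0
    for ax, ay in accidents:
        d2 = (x - ax) ** 2 + (y - ay) ** 2
        total += math.exp(-d2 / (2 * h2))
    # Normalize so the surface integrates to 1 over the plane.
    return total / (2 * math.pi * h2 * len(accidents))

# Density is high near a cluster of accidents (a hot spot), low elsewhere:
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0)]
dense = kernel_density(0.0, 0.0, pts, bandwidth=0.5)
sparse = kernel_density(3.0, 3.0, pts, bandwidth=0.5)
```

In practice the density would be evaluated on a raster over the georeferenced road network, and cells above a threshold flagged as hot spots; bandwidth choice strongly affects which clusters emerge.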

Erdogan, Saffet; Yilmaz, Ibrahim; Baybura, Tamer; Gullu, Mevlut

2008-01-01

91

Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K  

SciTech Connect

Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun [Korea Electric Power Research Institute (Korea, Republic of)

2004-10-15

92

Analysis of the Chernobyl accident taking core destruction into account  

SciTech Connect

Computer modeling of neutron-physical and thermohydraulic processes has been used extensively in the analysis of the Chernobyl accident. Most works study the first phase of the accident (up to the moment of destruction of the fuel). These studies have revealed serious deficiencies of RBMK reactors that resulted in a nonroutine state with catastrophic consequences during operation: a large positive steam reactivity coefficient and a defect in the construction of the control rods, specifically, the possibility of a reactivity increase as a result of displacement of water to the core bottom when the safety and control rods are inserted in the process of stopping the reactor. The important role of the last factor has been noted in many investigations. In some works a large power burst was obtained during modeling, neglecting the effect of the rods, as a result of other external actions, for example, cavitation. Calculations performed using three-dimensional neutron-thermohydraulic programs with maximum use of accessible initial information have shown that both factors are important: if either one is artificially excluded, it is impossible to obtain the power burst that explains the explosion.

Afanas`eva, A.A.; Fedosov, A.M.; Donderer, R. [Russian Scientific Center Kurchatovskii Institut, Bremen (Germany)] [and others]

1995-02-01

93

An Accident Precursor Analysis Process Tailored for NASA Space Systems  

NASA Technical Reports Server (NTRS)

Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

2010-01-01

94

An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)  

NASA Technical Reports Server (NTRS)

A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots versus professional pilots. Private pilots, flying low cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that are a result of exacting missions or use of specialized equipment. For both groups, judgement error is more likely to lead to a fatal accident than are other types of causes. Several approaches to improving the rotorcraft accident rate are recommended, mostly addressing improved training of new pilots and better safety awareness among private pilots.

Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

2002-01-01

95

Hypothetical accident conditions thermal analysis of the 5320 package  

SciTech Connect

An axisymmetric model of the 5320 package was created to perform hypothetical accident conditions (HAC) thermal calculations. The analyses assume the 5320 package contains 359 grams of plutonium-238 (203 Watts) in the form of an oxide powder at a minimum density of 2.4 g/cc or at a maximum density of 11.2 g/cc. The solution from a non-solar 100 F ambient steady-state analysis was used as the initial conditions for the fire transient. A 30 minute 1,475 F fire transient followed by cooling via natural convection and thermal radiation to a 100 F non-solar environment was analyzed to determine peak component temperatures and vessel pressures. The 5320 package was considered to be horizontally suspended within the fire during the entire transient.

Hensel, S.J.; Gromada, R.J.

1995-12-31

96

Aircraft Accident Prevention: Loss-of-Control Analysis  

NASA Technical Reports Server (NTRS)

The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.

Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

2009-01-01

97

Radionuclide Analysis on Bamboos following the Fukushima Nuclear Accident  

PubMed Central

In response to contamination from the recent Fukushima nuclear accident, we conducted radionuclide analysis on bamboos sampled from six sites within a 25 to 980 km radius of the Fukushima Daiichi nuclear power plant. Maximum activity concentrations of radiocesium 134Cs and 137Cs in samples from Fukushima city, 65 km away from the Fukushima Daiichi plant, were in excess of 71 and 79 kBq/kg, dry weight (DW), respectively. In Kashiwa city, 195 km away from the Fukushima Daiichi, the sample concentrations were in excess of 3.4 and 4.3 kBq/kg DW, respectively. In Toyohashi city, 440 km away from the Fukushima Daiichi, the concentrations were below the measurable limits of up to 4.5 Bq/kg DW. In the radiocesium contaminated samples, the radiocesium activity was higher in mature and fallen leaves than in young leaves, branches and culms. PMID:22496858

Higaki, Takumi; Higaki, Shogo; Hirota, Masahiro; Akita, Kae; Hasezawa, Seiichiro

2012-01-01

98

Advanced accident sequence precursor analysis level 1 models  

SciTech Connect

INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. The Revision 2 models incorporate more detailed plant information concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.

Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]

1996-03-01
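The conditional core damage probability (CCDP) evaluation that the GEM module automates can be sketched in miniature: re-quantify the minimal cut sets after setting the observed equipment failures to probability 1. This is an illustration only; the cut sets, basic-event names, and probabilities below are invented, and the real SAPHIRE/GEM interface is not shown.

```python
def cut_set_prob(cut_sets, basic_event_prob):
    """Rare-event approximation: sum over minimal cut sets of the
    product of their basic-event probabilities."""
    total = 0.0
    for cs in cut_sets:
        p = 1.0
        for ev in cs:
            p *= basic_event_prob[ev]
        total += p
    return total

def ccdp(cut_sets, nominal_prob, observed_failures):
    """Conditional core damage probability: set each observed failure
    to probability 1.0 and re-quantify the cut sets."""
    conditional = dict(nominal_prob)
    for ev in observed_failures:
        conditional[ev] = 1.0
    return cut_set_prob(cut_sets, conditional)

# Hypothetical minimal cut sets for core damage following a transient.
cut_sets = [("AFW-PUMP-A", "AFW-PUMP-B"), ("DG-A", "DG-B")]
nominal = {"AFW-PUMP-A": 1e-3, "AFW-PUMP-B": 1e-3, "DG-A": 2e-2, "DG-B": 2e-2}

baseline = cut_set_prob(cut_sets, nominal)        # nominal risk
conditional = ccdp(cut_sets, nominal, ["DG-A"])   # DG-A observed failed
```

The jump from `baseline` to `conditional` is what flags an operational event as a risk-significant precursor.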

99

A methodology for agent-oriented analysis and design  

Microsoft Academic Search

This paper presents a methodology for agent-oriented analysis and design. The methodology is general, in that it is applicable to a wide range of multi-agent systems, and comprehensive, in that it deals with both the macro-level (societal) and the micro-level (agent) aspects of systems. The methodology is founded on the view of a system as a computational organisation consisting of

Michael Wooldridge; Nicholas R. Jennings; David Kinny

1999-01-01

100

PWR integrated safety analysis methodology using multi-level coupling algorithm  

NASA Astrophysics Data System (ADS)

Coupled three-dimensional (3D) neutronics/thermal-hydraulic (T-H) system codes give a unique opportunity for realistic modeling of plant transients and design basis accidents (DBA) occurring in light water reactors (LWR). Examples of such DBAs are the rod ejection accident (REA) and the main steam line break (MSLB), which constitute the bounding safety problems for pressurized water reactors (PWR). These accidents involve asymmetric 3D spatial neutronic and T-H effects during the course of the transients. The thermal margins (the peak fuel temperature and the departure from nucleate boiling ratio (DNBR)) are the measures of safety for a particular transient and need to be evaluated as accurately as possible. Modern 3D neutronics/T-H coupled codes estimate the safety margins coarsely on an assembly level, i.e., for an average fuel pin. More accurate prediction of the safety margins requires evaluation of the transient fuel rod response using locally coupled neutronics/T-H calculations. The proposed approach is to perform an on-line hot-channel safety analysis not for the whole core but for a selected local region, for example the highest-power-loaded fuel assembly. This approach becomes feasible if an on-line algorithm capable of extracting the necessary input data for a sub-channel module is available. The necessary input data include the detailed pin-power distributions and the T-H boundary conditions for each sub-channel in the considered problem. Therefore, two challenges are faced in developing a refined methodology for evaluating local safety parameters. One is the development of an efficient transient pin-power reconstruction algorithm with consistent cross-section modeling. The second is the development of a multi-level coupling algorithm for the exchange of T-H boundary and feedback data between the sub-channel module and the main 3D neutron kinetics/T-H system code, which already uses one level of coupling between the 3D neutronics and core thermal-hydraulics models. The major accomplishment of the thesis is the development of an integrated PWR safety analysis methodology with locally refined safety evaluations. This involved introducing an improved method that efficiently restores the fine pin-power distribution with a high degree of accuracy. To apply the methodology to evaluating safety margins on a pin level, a refined on-line hot-channel model was developed that accounts for cross-flow effects. Finally, the methodology was applied in best-estimate safety analysis to calculate more accurately the thermal safety margins during a design basis accident in a PWR.

Ziabletsev, Dmitri Nickolaevich

101

Methodology, Metrics and Measures for Testing and Evaluation of Intelligence Analysis Tools  

E-print Network

PNWD-3550: Methodology, Metrics and Measures for Testing and Evaluation of Intelligence Analysis Tools. Only table-of-contents fragments are available in this extract, including 4. Research Methodology and 4.1 Overview of the Research Methodology Problem.

102

Analysis of Severe Accident Scenarios in APR1400 Using the MAAP4 Code  

Microsoft Academic Search

An analysis of containment environments during a postulated severe accident was performed for a pressurized water reactor (PWR) type nuclear power plant (NPP) with an electric power of 1400 MWe. Four initiating events were examined: large break loss-of-coolant accident (LBLOCA), small break loss-of-coolant accident (SBLOCA), total loss of feedwater (TLOFW) and station blackout (SBO). These events are selected

Ji Hwan Jeong; M. G. Na; S. P. Kim; Jong Woon Park

2002-01-01

103

Methodology for supply chain disruption analysis  

Microsoft Academic Search

Given the size, complexity and dynamic nature of many supply chains, there is a need to understand the impact of disruptions on the operation of the system. This paper presents a network-based modelling methodology to determine how changes or disruptions propagate in supply chains and how those changes or disruptions affect the supply chain system. Understanding the propagation of disruptions

T. Wu; J. Blackhurst; P. O’grady

2007-01-01

104

Developing a methodology for road network vulnerability analysis  

Microsoft Academic Search

This paper will describe a new project at KTH concerning road network vulnerability, as well as present some early results. The aim of the project is to develop the methodology of vulnerability analysis for road networks and to illustrate how this methodology can be used for generating a basis for the decision-making process concerning

Erik Jenelius; Lars-Göran Mattsson

105

Social Network Analysis in Human Resource Development: A New Methodology  

Microsoft Academic Search

Through an exhaustive review of the literature, this article looks at the applicability of social network analysis (SNA) in the field of human resource development. The literature review revealed that a number of disciplines have adopted this unique methodology, which has assisted in the development of theory. SNA is a methodology for examining the structure among actors, groups, and organizations and

John-Paul Hatala

2006-01-01

106

A methodology for human factors analysis in office automation systems  

Microsoft Academic Search

A methodology for computer-aided human factors analysis in office automation system (OAS) design and implementation process has been developed. It incorporates a fuzzy knowledge-based evaluation mechanism, which is employed to aggregate data measured in scales of different type. The methodology has a high degree of flexibility, which allows it to be adjusted to the individual client situation. A case study

ALEXANDER NIKOV; GIACINTO MATARAZZO; ANTONINO ORLANDO

1993-01-01

107

Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident  

SciTech Connect

This paper discusses the loss-of-coolant accident (LOCA) in a boiling water reactor (BWR), with emphasis on the Fukushima nuclear accident. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima station. Although the shutdown was completed, cooling at a much smaller level than in normal operation is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, it causes the reactor fuel and other core temperatures to rise and can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Daiichi case. In this study, numerical simulation was performed to calculate the pressure, water level and temperature distributions in the reactors during the accident. Two coolant regulating systems were operational on reactor unit 1: the isolation condenser (IC) system and the safety relief valves (SRV). The average steam mass flow to the IC system was 10 kg/s, which kept the reactor core covered for about 3.2 hours; the core was fully uncovered 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the reactor core isolation cooling (RCIC) system and the SRVs. The corresponding average coolant mass flow was 20 kg/s, which kept the core covered for about 73 hours; the core was fully uncovered 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the RCIC system, the high pressure coolant injection (HPCI) system and the SRVs. The corresponding average water mass flow was 15 kg/s, which kept the core covered for about 37 hours; the core was fully uncovered 40 hours later.

Su'ud, Zaki; Anshari, Rio [Nuclear and Biophysics Research Group, Dept. of Physics, Bandung Institute of Technology, Jl.Ganesha 10, Bandung, 40132 (Indonesia)

2012-06-06
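The core-uncovery timings quoted above come from detailed simulation, but the underlying inventory balance can be sketched with a first-order boil-off estimate. All values in this example are illustrative assumptions, not figures from the paper, and the model ignores pressure dependence and flashing.

```python
H_FG = 1.5e6   # J/kg, assumed latent heat of vaporization (roughly 7 MPa)

def boiloff_time(water_mass_kg, decay_power_w, makeup_kgps=0.0):
    """Hours until the water inventory above the core boils away,
    assuming all decay heat goes into vaporization."""
    boil_rate = decay_power_w / H_FG      # kg/s vaporized by decay heat
    net_rate = boil_rate - makeup_kgps    # kg/s of net inventory loss
    if net_rate <= 0:
        return float("inf")               # makeup keeps the core covered
    return water_mass_kg / net_rate / 3600.0

# Invented numbers: 150 t of water above the core, 30 MW of decay heat.
hours = boiloff_time(150e3, 30e6)   # roughly 2 hours with no makeup flow
```

The same balance shows why a modest injection flow (the 10-20 kg/s figures above) extends the time to core uncovery so dramatically: it directly offsets the boil-off rate.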

108

Application of Evidential Networks in quantitative analysis of railway accidents  

E-print Network

Modeling of human errors through probabilistic approaches has shown some limitations. There is little doubt that human error is the most significant source of accidents or incidents in railway systems. According to statistics of railway accidents in Korea, human

Paris-Sud XI, Université de

109

Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1  

SciTech Connect

Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

Oztunali, O.I.; Roles, G.W.

1986-01-01

110

Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.  

SciTech Connect

This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. 
At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the experienced user-base and the experimental validation base was decaying away quickly.

Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache - CEA, France)

2011-06-01

111

Integrated safety analysis code system (ISAS) application for accident sequence analyses  

Microsoft Academic Search

In the frame of the ITER Task ‘Reference Accident Sequences’, two accident sequences have been assessed to demonstrate the effectiveness of the integrated safety analysis code system (ISAS). The first one is a loss-of-coolant event in the divertor primary heat transfer system (DV PHTS) towards the vacuum vessel containment during normal plasma burn; the second one

M. T Porfiri; G Cambi

2000-01-01

112

Aircraft Accident Prevention: Loss-of-Control Analysis Harry G. Kwatny  

E-print Network

Aircraft Accident Prevention: Loss-of-Control Analysis. Harry G. Kwatny, Jean-Etienne T. Dongmo. NASA Langley Research Center, MS 161, Hampton, VA 23681. The majority of fatal aircraft accidents are associated with loss-of-control of the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft

Kwatny, Harry G.

113

DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)  

SciTech Connect

This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work-primarily for enhanced geothermal systems (EGS)-sponsored by the U.S. Department of Energy Geothermal Technologies Program.

Young, K. R.; Augustine, C.; Anderson, A.

2010-02-01

114

A Taxonomic Class Modeling Methodology for Object-Oriented Analysis  

Microsoft Academic Search

Discovering a set of domain classes during object-oriented analysis is intellectually challenging and time-consuming for novice analyzers. This chapter presents a taxonomic class modeling (TCM) methodology that can be used for object-oriented analysis in business applications. Our methodology helps us discover the three types of classes: (1) classes represented by nouns in the requirement specification, (2) classes whose concepts were

Il-yeol Song; Kurt Yano; Juan Trujillo; Sergio Luján-mora

2005-01-01

115

Analysis of Construction Accidents in Turkey and Responsible Parties  

PubMed Central

Construction is one of the world’s biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972–2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at time of the accident and party responsible for the accident. Falls (54.1%), being struck by a thrown or falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) rank in the first four places. The accidents were most likely to occur between 15:00 and 17:00 (22.6%), between 10:00 and 12:00 (18.7%) and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and the acts of negligence that typically lead to accidents. Nearly two thirds of the faulty and negligent acts were carried out by employers, and employees are responsible for almost one third of all cases. PMID:24077446

GÜRCANLI, G. Emre; MÜNGEN, Uğur

2013-01-01

116

Risk analysis using a hybrid Bayesian-approximate reasoning methodology.  

SciTech Connect

Analysts are sometimes asked to make frequency estimates for specific accidents in which the accident frequency is determined primarily by safety controls. Under these conditions, frequency estimates rely considerably on expert belief about how the controls affect the accident frequency. To evaluate and document beliefs about control effectiveness, we have modified a traditional Bayesian approach by using approximate reasoning (AR) to develop prior distributions. Our method produces accident frequency estimates that separately express the probabilistic results produced in Bayesian analysis and possibilistic results that reflect uncertainty about the prior estimates. Based on our experience using traditional methods, we feel that the AR approach better documents beliefs about the effectiveness of controls than if the beliefs are buried in Bayesian prior distributions. We have performed numerous expert elicitations in which probabilistic information was sought from subject matter experts not trained in probability. We find it much easier to elicit the linguistic variables and fuzzy set membership values used in AR than to obtain the probability distributions used in prior distributions directly from these experts, because the AR approach better captures their beliefs and better expresses their uncertainties.

Bott, T. F. (Terrence F.); Eisenhawer, S. W. (Stephen W.)

2001-01-01
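A minimal sketch of the general idea, not the authors' actual model: expert fuzzy-membership judgments about control effectiveness are normalized into a discrete prior over candidate accident frequencies, which is then updated with observed event counts through a Poisson likelihood. The frequencies and membership values below are hypothetical.

```python
import math

# Candidate accident frequencies (events/year) and fuzzy memberships for
# "plausible given the controls" -- both sets of numbers are invented.
freqs       = [1e-4, 1e-3, 1e-2]
memberships = [0.2, 1.0, 0.4]

# Normalize the memberships into a discrete prior distribution.
total = sum(memberships)
prior = [m / total for m in memberships]

def poisson_like(rate, events, years):
    """Poisson likelihood of observing `events` in `years` at `rate`."""
    lam = rate * years
    return math.exp(-lam) * lam ** events / math.factorial(events)

def posterior(prior, freqs, events, years):
    """Bayes' rule over the discrete set of candidate frequencies."""
    unnorm = [p * poisson_like(f, events, years) for p, f in zip(prior, freqs)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

post = posterior(prior, freqs, events=1, years=100)  # one event in 100 years
```

Keeping the membership values visible, rather than folding them silently into a prior, is what the abstract means by documenting beliefs about control effectiveness separately from the probabilistic result.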

117

Use of inelastic analysis to determine the response of packages to puncture accidents  

SciTech Connect

The accurate analytical determination of the response of radioactive material transportation packages to the hypothetical puncture accident requires inelastic analysis techniques. Use of this improved analysis method reduces the reliance on empirical and approximate methods for determining safety in puncture accidents. This paper discusses how inelastic analysis techniques can be used to determine the stresses, strains and deformations resulting from puncture accidents for thin skin materials with different backing materials. A method to assure safety for all of these types of packages is also discussed.

Ammerman, D.J.; Ludwigsen, J.S.

1996-08-01

118

Accidents at work and costs analysis: a field study in a large Italian company.  

PubMed

Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident cost analysis tool. The results indicate that implementing (and using) the tool requires a considerable commitment by the company, that accident cost analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

Battaglia, Massimo; Frey, Marco; Passetti, Emilio

2014-01-01

119

Analysis of Loss-of-Coolant Accidents in the NBSR  

SciTech Connect

This report documents calculations of the fuel cladding temperature during loss-of-coolant accidents in the NBSR. The probability of a pipe failure is small and procedures exist to minimize the loss of water and assure emergency cooling water flows into the reactor core during such an event. Analysis in the past has shown that the emergency cooling water would provide adequate cooling if the water filled the flow channels within the fuel elements. The present analysis is to determine if there is adequate cooling if the water drains from the flow channels. Based on photographs of how the emergency water flows into the fuel elements from the distribution pan, it can be assumed that this water does not distribute uniformly across the flow channels but rather results in a liquid film flowing downward on the inside of one of the side plates in each fuel element and only wets the edges of the fuel plates. An analysis of guillotine breaks shows the cladding temperature remains below the blister temperature in fuel plates in the upper section of the fuel element. In the lower section, the fuel plates are also cooled by water outside the element that is present due to the hold-up pan and temperatures are lower than in the upper section. For small breaks, the simulation results show that the fuel elements are always cooled on the outside even in the upper section and the cladding temperature cannot be higher than the blister temperature. The above results are predicated on assumptions that are examined in the study to see their influence on fuel temperature.

Baek J. S.; Cheng L.; Diamond, D.

2014-05-23

120

Modeling control room crews for accident sequence analysis  

E-print Network

This report describes a systems-based operating crew model designed to simulate the behavior of a nuclear power plant control room crew during an accident scenario. This model can lead to an improved treatment of potential ...

Huang, Y. (Yuhao)

1991-01-01

121

Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description  

PubMed Central

Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

da Silva, B. R.; Moreira Neto, J. J. S.; da Silva, F. I.; de Aguiar, A. S. W.

2011-01-01

122

Systems-based accident analysis in the led outdoor activity domain: application and evaluation of a risk management framework  

Microsoft Academic Search

Safety-compromising accidents occur regularly in the led outdoor activity domain. Formal accident analysis is an accepted means of understanding such events and improving safety. Despite this, there remains no universally accepted framework for collecting and analysing accident data in the led outdoor activity domain. This article presents an application of Rasmussen's risk management framework to the analysis of the Lyme

P. Salmon; A. Williamson; M. Lenné; E. Mitsopoulos-Rubens; C. M. Rudin-Brown

2010-01-01

123

A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)  

E-print Network

A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada. Hampton, VA 23681-2199, USA. c.m.holloway@nasa.gov. Abstract: Accident reports provide important insights ... an analysis that extends across the findings presented over ten years of investigations into maritime accidents

Williamson, John

124

A Longitudinal Analysis of the Causal Factors in Major Aviation Accidents in the USA from 1976 to 2006  

E-print Network

A Longitudinal Analysis of the Causal Factors in Major Aviation Accidents in the USA from 1976 to 2006. c.m.holloway@larc.nasa.gov. Abstract: This paper forms part of a long-term analysis to understand the causes of aviation accidents ... in accident and incident reports. We are also concerned to determine whether these causal factors have changed

Johnson, Chris

125

Statistical Methodology in Meta-Analysis.  

ERIC Educational Resources Information Center

Meta-analysis has become an important supplement to traditional methods of research reviewing, although many problems must be addressed by the reviewer who carries out a meta-analysis. These problems include identifying and obtaining appropriate studies, extracting estimates of effect size from the studies, coding or classifying studies, analyzing…

Hedges, Larry V.

126

Analysis and methodology for aeronautical systems technology program planning  

NASA Technical Reports Server (NTRS)

A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
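The rank-ordering step described above can be sketched in a few lines: order candidate concepts by benefit-to-cost ratio, then track the cumulative ratio as concepts are implemented in that order. The concept names and benefit/cost figures below are invented for illustration and are not taken from the study.

```python
# Hypothetical system concepts: (name, benefit, cost).
# Names and numbers are illustrative, not from the NASA data base.
concepts = [
    ("laminar-flow wing", 120.0, 40.0),
    ("advanced turboprop", 200.0, 100.0),
    ("composite fuselage", 90.0, 60.0),
]

# Rank-order by benefit-to-cost ratio, highest first.
ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)

# Cumulative benefit-to-cost ratio as concepts are implemented in order.
cum_benefit = cum_cost = 0.0
cumulative = []
for name, benefit, cost in ranked:
    cum_benefit += benefit
    cum_cost += cost
    cumulative.append((name, cum_benefit / cum_cost))
```

The cumulative ratio is non-increasing when concepts are taken in ratio order, which is what makes the ordering a defensible implementation sequence.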

White, M. J.; Gershkoff, I.; Lamkin, S.

1983-01-01

127

Methodology for error analysis and simulation of lidar aerosol measurements  

NASA Technical Reports Server (NTRS)

A methodology is presented for objective and automated determination of the uncertainty in lidar aerosol measurements. This methodology is based on standard error-propagation procedures, a large data base on atmospheric behavior, and long experience in lidar data processing. Algebraic expressions for probable error are derived as a function of the relevant parameters. The validity of these expressions is then tested by making simulated measurements and analyses in which random errors of appropriate size are injected at proper steps of the measurement and analysis process. An illustrative example is given where the methodology is applied to a new lidar system now being used for airborne measurements of the stratospheric aerosol.
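The propagate-then-simulate idea in this abstract can be illustrated in miniature: combine assumed independent relative errors in quadrature, then validate the algebraic prediction by injecting random errors of the stated size into a simulated measurement. The parameter names and error magnitudes below are illustrative assumptions, not values from the lidar system.

```python
import math
import random

random.seed(0)

# Assumed relative (1-sigma) errors for illustrative lidar parameters:
# signal counts, calibration constant, and molecular-density model.
rel_errors = {"signal": 0.03, "calibration": 0.05, "density": 0.02}

# Standard propagation for a multiplicative measurement model:
# independent relative errors combine in quadrature.
predicted = math.sqrt(sum(e * e for e in rel_errors.values()))

# Validate by simulation: inject Gaussian errors of the stated size at
# each step and measure the spread of the resulting products.
n = 50_000
samples = []
for _ in range(n):
    product = 1.0
    for e in rel_errors.values():
        product *= 1.0 + random.gauss(0.0, e)
    samples.append(product)
mean = sum(samples) / n
simulated = math.sqrt(sum((s - mean) ** 2 for s in samples) / n)
```

For small relative errors the simulated spread agrees with the quadrature prediction, which is the consistency check the methodology performs at each step of the analysis chain.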

Russell, P. B.; Swissler, T. J.; Mccormick, M. P.

1979-01-01

128

Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis  

NASA Technical Reports Server (NTRS)

NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the VTP from the aircraft. Additionally, analysis results indicated that failure initiated at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads at least 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appeared to perform in a manner consistent with its design and certification, and the failure is attributed to VTP loads greater than expected.

Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

2005-01-01

129

RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS  

EPA Science Inventory

The objective of these studies was to optimize conditions for computer assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...

130

The Gaia Methodology for Agent-Oriented Analysis and Design  

Microsoft Academic Search

This article presents Gaia: a methodology for agent-oriented analysis and design. The Gaia methodology is both general, in that it is applicable to a wide range of multi-agent systems, and com- prehensive, in that it deals with both the macro-level (societal) and the micro-level (agent) aspects of systems. Gaia is founded on the view of a multi-agent system as a

Michael Wooldridge; Nicholas R. Jennings; David Kinny

2000-01-01

131

Fast Transient And Spatially Non-Homogenous Accident Analysis Of Two-Dimensional Cylindrical Nuclear Reactor  

SciTech Connect

An analysis of fast transient and spatially non-homogeneous accidents in a two-dimensional cylindrical nuclear reactor has been performed, with the aim of predicting reactor behavior during an accident. In the present study, the space-time diffusion equation is solved by direct methods that treat the spatial factor in detail during the accident simulation. The set of equations obtained from a fully implicit finite-difference discretization is solved with the iterative ADI (Alternating Direction Implicit) method. The initiating event is a decrease in the macroscopic absorption cross-section, which produces a large external reactivity insertion. The reactor power passes through a peak value before the reactor reaches a new equilibrium condition. The accompanying rise in reactor temperature produces a negative Doppler feedback reactivity, which offsets the excess positive reactivity. The reactor temperature during the accident remains below the fuel melting point, so the reactor stays in a safe condition.
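As a rough illustration of the ADI scheme named in this abstract (not the authors' code), the following sketch advances a toy 2-D diffusion problem one alternating-direction-implicit step at a time: each half-step is implicit along one axis and explicit along the other, so only tridiagonal systems need to be solved. Grid size, the ratio r, and the initial peak are arbitrary.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c, and right-hand side d (1-D arrays)."""
    n = len(d)
    cp = np.zeros(n); dp = np.zeros(n)
    cp[0] = c[0] / b[0]; dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(u, r):
    """One ADI step for du/dt = D*(uxx + uyy), zero boundaries,
    r = D*dt/(2*dx^2): implicit in x then implicit in y."""
    n = u.shape[0]
    a = np.full(n - 2, -r); b = np.full(n - 2, 1 + 2 * r); c = np.full(n - 2, -r)
    half = np.zeros_like(u)
    for j in range(1, n - 1):          # implicit in x, explicit in y
        d = u[1:-1, j] + r * (u[1:-1, j + 1] - 2 * u[1:-1, j] + u[1:-1, j - 1])
        half[1:-1, j] = thomas(a, b, c, d)
    out = np.zeros_like(u)
    for i in range(1, n - 1):          # implicit in y, explicit in x
        d = half[i, 1:-1] + r * (half[i + 1, 1:-1] - 2 * half[i, 1:-1] + half[i - 1, 1:-1])
        out[i, 1:-1] = thomas(a, b, c, d)
    return out

# Toy "flux" profile: a central peak diffusing on a 21x21 grid.
n = 21
u = np.zeros((n, n))
u[n // 2, n // 2] = 1.0
for _ in range(10):
    u = adi_step(u, r=0.2)
```

The attraction of ADI, and the reason it appears in fast-transient reactor work, is that each half-step costs only a set of O(n) tridiagonal solves while remaining unconditionally stable for the diffusion operator.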

Yulianti, Yanti [Dept. of Physics, Universitas Lampung (UNILA), Jl. Sumantri Brojonegor No.1 Bandar Lampung (Indonesia); Dept. of Physics, Institut Teknologi Bandung (ITB), Jl. Ganesha 10 Bandung (Indonesia); Su'ud, Zaki; Waris, Abdul; Khotimah, S. N. [Dept. of Physics, Institut Teknologi Bandung (ITB), Jl. Ganesha 10 Bandung (Indonesia); Shafii, M. Ali [Dept. of Physics, Institut Teknologi Bandung (ITB), Jl. Ganesha 10 Bandung (Indonesia); Dept. of Physics, Universitas Andalas (UNAND), Kampus Limau Manis, Padang, Sumatera Barat (Indonesia)

2010-12-23

132

Highway accident severities and the mixed logit model: an exploratory empirical analysis.  

PubMed

Many transportation agencies use accident frequencies, and statistical models of accident frequencies, as a basis for prioritizing highway safety improvements. However, the use of accident severities in safety programming has often been limited to the locational assessment of accident fatalities, with little or no emphasis placed on the full severity distribution of accidents (property damage only, possible injury, injury), which is needed to fully assess the benefits of competing safety-improvement projects. In this paper we demonstrate a modeling approach that can be used to better understand the injury-severity distributions of accidents on highway segments, and the effect that traffic, highway, and weather characteristics have on these distributions. The approach we use allows for the possibility that estimated model parameters vary randomly across roadway segments to account for unobserved effects potentially relating to roadway characteristics, environmental factors, and driver behavior. Using highway-injury data from Washington State, a mixed (random-parameters) logit model is estimated. Estimation findings indicate that volume-related variables such as average daily traffic per lane, average daily truck traffic, truck percentage, interchanges per mile, and weather effects such as snowfall are best modeled as random parameters, while roadway characteristics such as the number of horizontal curves, number of grade breaks per mile, and pavement friction are best modeled as fixed parameters. Our results show that the mixed logit model has considerable promise as a methodological tool in highway safety programming. PMID:18215557
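The core idea of a mixed logit, a choice probability averaged over a distribution of random parameters, can be sketched by Monte Carlo simulation. Everything below (coefficients, covariates, a single random snowfall coefficient, and a two-class severity outcome) is invented for illustration; the paper's estimated model has more severity classes and data-driven parameters.

```python
import math
import random

random.seed(1)

# Mixed (random-parameters) logit sketch: the snowfall coefficient is
# assumed normally distributed across roadway segments; severity
# probabilities are obtained by averaging over draws of it.
fixed = {"curves": -0.30, "friction": 0.80}   # fixed parameters (assumed)
snow_mean, snow_sd = 0.50, 0.25               # random parameter (assumed)

def severity_probs(x, n_draws=20_000):
    """Average binary-logit probabilities over draws of the random
    snowfall coefficient. Severity classes: [PDO, injury]."""
    acc = [0.0, 0.0]
    for _ in range(n_draws):
        b_snow = random.gauss(snow_mean, snow_sd)
        # utility of 'injury' relative to 'PDO' (base utility = 0)
        v = (fixed["curves"] * x["curves"]
             + fixed["friction"] * x["friction"]
             + b_snow * x["snow"])
        p_injury = 1.0 / (1.0 + math.exp(-v))
        acc[0] += 1.0 - p_injury
        acc[1] += p_injury
    return [a / n_draws for a in acc]

probs = severity_probs({"curves": 2.0, "friction": 0.4, "snow": 1.0})
```

Averaging over the coefficient distribution is exactly what distinguishes the mixed logit from a standard logit: segments with the same observed covariates can still differ in their severity response.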

Milton, John C; Shankar, Venky N; Mannering, Fred L

2008-01-01

133

A methodology for probabilistic fault displacement hazard analysis (PFDHA)  

USGS Publications Warehouse

We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
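The first (earthquake-based) approach can be sketched numerically: the annual rate of exceeding a displacement d sums, over sources, the event rate times the conditional probability of exceedance from a fault displacement "attenuation" distribution, in direct analogy with PSHA. The source rates, median displacements, surface-rupture probabilities, and the lognormal form below are illustrative assumptions, not values from the paper.

```python
import math

# Illustrative seismic sources: annual event rate, median surface
# displacement (m) given rupture, lognormal sigma, and the probability
# that an event ruptures the surface at the site.
sources = [
    {"rate": 1e-3, "median_d": 0.5, "sigma_ln": 0.6, "p_surface": 0.7},
    {"rate": 5e-4, "median_d": 1.2, "sigma_ln": 0.5, "p_surface": 0.8},
]

def p_exceed_lognormal(d, median, sigma_ln):
    """P(D > d) for a lognormal displacement, given the event."""
    z = (math.log(d) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(d):
    """PSHA-style sum: rate x P(surface rupture) x P(D > d | event)."""
    return sum(s["rate"] * s["p_surface"]
               * p_exceed_lognormal(d, s["median_d"], s["sigma_ln"])
               for s in sources)

rate_10cm = annual_exceedance_rate(0.10)
rate_1m = annual_exceedance_rate(1.00)
```

Evaluating the rate over a range of displacements traces out the hazard curve; the second approach in the paper would instead derive the event rate and displacement distribution directly from the geologic features at the site.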

Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H., III; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

2003-01-01

134

Methodology for the foresight analysis of electricity markets  

Microsoft Academic Search

This paper is about a methodology designed for the foresight analysis of the electricity markets as the technologic foresight tools and techniques and the analysis of the systemic development and special characteristics of the electricity markets, in order to have a facility for bringing forward the evolution and development of the electricity markets.

C. Diez; W. Zapata; M. Restrepo; O. Fernandez

2008-01-01

135

Safety analysis results for cryostat ingress accidents in ITER  

SciTech Connect

Accidents involving the ingress of air or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

Merrill, B.J.; Cadwallader, L.C.; Petti, D.A.

1996-12-31

136

Safety Analysis Results for Cryostat Ingress Accidents in ITER  

NASA Astrophysics Data System (ADS)

Accidents involving the ingress of air, helium, or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

Merrill, B. J.; Cadwallader, L. C.; Petti, D. A.

1997-06-01

137

Safety analysis results for cryostat ingress accidents in ITER  

SciTech Connect

Accidents involving the ingress of air, helium, or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits. 6 refs., 2 figs., 1 tab.

Merrill, B.J.; Cadwallader, L.C.; Petti, D.A. [Idaho National Engineering Lab., ID (United States)]

1997-06-01

138

CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL  

SciTech Connect

Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF, even with severe cladding breaches, for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U.S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10 CFR 71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and at cask-loading-specific conditions could be performed to demonstrate that release is within the allowable leak rates of the cask.

Vinson, D.

2010-07-11

139

Integrated methodology for thermal-hydraulics uncertainty analysis (IMTHUA)  

NASA Astrophysics Data System (ADS)

This dissertation describes a new integrated uncertainty analysis methodology for "best estimate" thermal-hydraulics (TH) codes such as RELAP5. The main thrust of the methodology is to utilize all available types of data and information in an effective way to identify important sources of uncertainty and to assess the magnitude of their impact on the uncertainty of the TH code output measures. The proposed methodology is fully quantitative and uses the Bayesian approach for quantifying the uncertainties in the predictions of TH codes. The methodology also uses the data and information for a more informed and evidence-based ranking and selection of TH phenomena through a modified PIRT method. The modification considers the importance of various TH phenomena as well as their uncertainty importance. In identifying and assessing uncertainties, the proposed methodology treats the TH code as a white box, thus explicitly treating internal sub-model uncertainties and the propagation of such model uncertainties through the code structure as well as various input parameters. The TH code output is further corrected through Bayesian updating with available experimental data from integrated test facilities. The methodology utilizes data directly or indirectly related to the code output to account implicitly for missed or screened-out sources of uncertainty. It uses an efficient Monte Carlo sampling technique for the propagation of uncertainty, using modified Wilks sampling criteria. The methodology is demonstrated on the LOFT facility for a 200% cold-leg LBLOCA transient scenario.
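The Wilks criteria mentioned in this abstract determine how many code runs are needed for a one-sided tolerance bound on a code output quantile. A minimal sketch of the standard (unmodified) formula, with the order-statistic rank as a parameter:

```python
from math import comb

def wilks_confidence(n, p, order=1):
    """Confidence that the p-quantile of the output lies below the
    `order`-th largest of n random code runs (one-sided bound).
    Equals P(Binomial(n, p) <= n - order)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n - order + 1))

def wilks_sample_size(p=0.95, beta=0.95, order=1):
    """Smallest number of runs n achieving confidence beta that the
    `order`-th largest run bounds the p-quantile."""
    n = order
    while wilks_confidence(n, p, order) < beta:
        n += 1
    return n
```

For the classic first-order 95/95 criterion this reproduces the well-known 59 runs; taking the second-largest run as the bound (order=2) raises the requirement to 93 runs.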

Pourgol-Mohammad, Mohammad

140

Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence  

NASA Technical Reports Server (NTRS)

Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

2004-01-01

141

Analysis of Kuosheng Station Blackout Accident Using MELCOR 1.8.4  

SciTech Connect

The MELCOR code, developed by Sandia National Laboratories, is a fully integrated, relatively fast-running code that models the progression of severe accidents in commercial light water nuclear power plants (NPPs). A specific station blackout (SBO) accident for the Kuosheng (BWR-6) NPP is simulated using the MELCOR 1.8.4 code. The MELCOR input deck for Kuosheng NPP is established based on Kuosheng NPP design data and the MELCOR users' guides. The initial steady-state conditions are generated with a developed self-initialization algorithm. The main severe accident phenomena and the fission product release fractions associated with the SBO accident were simulated. The predicted results are plausible and as expected in light of current understanding of severe accident phenomena. The uncertainty of this analysis is briefly discussed. The important features of MELCOR 1.8.4 are described. The estimated results provide useful information for the probabilistic risk assessment (PRA) of Kuosheng NPP. This tool will be applied to the PRA, the severe accident analysis, and the severe accident management study of Kuosheng NPP in the near future.

Wang, S.-J.; Chien, C.-S.; Wang, T.-C.; Chiang, K.-S

2000-11-15

142

Analysis of traffic accidents on rural highways using Latent Class Clustering and Bayesian Networks.  

PubMed

One of the principal objectives of traffic accident analysis is to identify the key factors that affect the severity of an accident. However, heterogeneity in the raw data makes the analysis of traffic accidents difficult. In this paper, Latent Class Clustering (LCC) is used as a preliminary tool for segmentation of 3229 accidents on rural highways in Granada (Spain) between 2005 and 2008. Next, Bayesian Networks (BNs) are used to identify the main factors involved in accident severity, both for the entire database (EDB) and for the clusters previously obtained by LCC. The results of these cluster-based analyses are compared with the results of a full-data analysis. The results show that the combined use of both techniques is very valuable, as it reveals information that would not have been obtained without prior segmentation of the data. BN inference is used to obtain the variables that best identify accidents involving killed or seriously injured people. Accident type and sight distance were identified in all the cases analysed; other variables, such as time, occupants involved, or age, were identified in the EDB and in only one cluster; whereas the variables vehicles involved, number of injuries, atmospheric factors, pavement markings, and pavement width were identified in only one cluster. PMID:23182777
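Inference over discrete accident variables, as the BNs here perform, amounts to computing conditional probabilities of severity from the data. A toy sketch with invented records and a two-feature naive structure, far simpler than the paper's learned networks:

```python
# Toy accident records: (accident_type, sight_distance, severe?).
# Counts are invented for illustration, not from the Granada database.
records = [
    ("head-on", "restricted", True), ("head-on", "restricted", True),
    ("head-on", "adequate", False), ("rollover", "restricted", True),
    ("rollover", "adequate", False), ("rear-end", "adequate", False),
    ("rear-end", "adequate", False), ("rear-end", "restricted", False),
]

def naive_bayes_p_severe(acc_type, sight):
    """P(severe | accident_type, sight_distance) under a naive
    structure: features independent given the severity class."""
    sev = [r for r in records if r[2]]
    non = [r for r in records if not r[2]]

    def lik(rows, t, s):
        pt = sum(r[0] == t for r in rows) / len(rows)
        ps = sum(r[1] == s for r in rows) / len(rows)
        return pt * ps

    prior = len(sev) / len(records)
    num = prior * lik(sev, acc_type, sight)
    den = num + (1 - prior) * lik(non, acc_type, sight)
    return num / den

p = naive_bayes_p_severe("head-on", "restricted")
```

Running the same query within a cluster instead of the whole record set is the paper's point: the conditional probabilities, and hence the influential variables, can differ between clusters and the full database.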

de Oña, Juan; López, Griselda; Mujalli, Randa; Calvo, Francisco J

2013-03-01

143

Sensitivity analysis of radionuclides atmospheric dispersion following the Fukushima accident  

NASA Astrophysics Data System (ADS)

Atmospheric dispersion models are used in response to accidental releases for two purposes: minimising population exposure during the accident, and complementing field measurements for the assessment of short- and long-term environmental and sanitary impacts. The predictions of these models are subject to considerable uncertainties of various origins. Notably, input data, such as meteorological fields or estimates of emitted quantities as a function of time, are highly uncertain. The case studied here is the atmospheric release of radionuclides following the Fukushima Daiichi disaster. The model used in this study is Polyphemus/Polair3D, from which IRSN's operational long-distance atmospheric dispersion model ldX is derived. A sensitivity analysis was conducted in order to estimate the relative importance of a set of identified uncertainty sources. The complexity of this task was increased by four characteristics shared by most environmental models: high-dimensional inputs; correlated inputs or inputs with complex structures; high-dimensional output; and a multiplicity of purposes that require sophisticated and non-systematic post-processing of the output. The sensitivities of a set of outputs were estimated with the Morris screening method. The input ranking was highly dependent on the considered output. Yet a few variables, such as the horizontal diffusion coefficient or cloud thickness, were found to have a weak influence on most outputs and could be discarded from further studies. The sensitivity analysis procedure was also applied to indicators of model performance computed on a set of gamma dose rate observations. This approach is of particular interest since observations could later be used to calibrate the probability distributions of the input variables; only the variables that are influential on performance scores are likely to allow for calibration. An indicator based on the timing of emission peaks was elaborated to complement classical statistical scores, which were dominated by deposit dose rates and almost insensitive to lower-atmosphere dose rates. The substantial sensitivity of these performance indicators is auspicious for future calibration attempts and indicates that the simple perturbations used here may represent an essential part of the overall uncertainty.
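The Morris screening method used above estimates "elementary effects", normalized one-at-a-time differences, and ranks inputs by their mean absolute effect (often called mu*). A self-contained sketch on an invented toy model standing in for the dispersion code:

```python
import random

random.seed(42)

# Toy stand-in for the dispersion model: inputs scaled to [0, 1] are
# [diffusion, emission, wind]; the output mimics a concentration.
# The functional form and ranges are illustrative assumptions.
def model(x):
    return x[1] / (1.0 + 0.1 * x[0]) * (2.0 - x[2])

k = 3            # number of inputs
delta = 0.5      # Morris step size on the unit interval
n_traj = 50      # number of random base points
effects = [[] for _ in range(k)]

for _ in range(n_traj):
    x = [random.uniform(0, 1 - delta) for _ in range(k)]
    base = model(x)
    for i in range(k):
        xp = list(x)
        xp[i] += delta                       # perturb one input at a time
        effects[i].append((model(xp) - base) / delta)

# mu* (mean absolute elementary effect) ranks input influence.
mu_star = [sum(abs(e) for e in ee) / n_traj for ee in effects]
```

Inputs with small mu* (here the "diffusion" coefficient, by construction) are the ones a screening study would discard before a more expensive quantitative analysis, which mirrors how the paper uses Morris to prune its uncertainty sources.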

Girard, Sylvain; Korsakissok, Irène; Mallet, Vivien

2014-05-01

144

Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots  

Microsoft Academic Search

Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from

Steve Jarvis; Don Harris

2009-01-01

145

Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots  

Microsoft Academic Search

Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from

Steve Jarvis; Don Harris

2010-01-01

146

BESAFE II: Accident safety analysis code for MFE reactor designs  

Microsoft Academic Search

The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from an economic and an environmental and safety standpoint. It is the latter which is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns equate to a design's behavior during a worst-case accident scenario which is the loss of coolant

Lawrence Michael Sevigny

1997-01-01

147

A Global Sensitivity Analysis Methodology for Multi-physics Applications  

SciTech Connect

Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step are given using simple examples. Numerical results on large-scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
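Step (3), the quantitative sensitivity analysis, is commonly done with variance-based (Sobol) indices. The report does not detail PSUADE's internal estimators, so the following is a generic pick-freeze sketch on an invented additive test function, not the package's implementation:

```python
import random

random.seed(7)

# Invented additive test function standing in for a simulation code;
# the true first-order indices are proportional to the squared
# coefficients (16 : 1 : 0.04) for independent uniform inputs.
def model(x):
    return 4.0 * x[0] + 1.0 * x[1] + 0.2 * x[2]

k, n = 3, 20_000
A = [[random.random() for _ in range(k)] for _ in range(n)]
B = [[random.random() for _ in range(k)] for _ in range(n)]

yA = [model(x) for x in A]
mean = sum(yA) / n
var = sum((y - mean) ** 2 for y in yA) / n

S = []
for i in range(k):
    # pick-freeze: keep coordinate i from A, redraw the rest from B;
    # Cov(yA, yABi) then estimates the variance explained by input i.
    yABi = [model([a[j] if j == i else b[j] for j in range(k)])
            for a, b in zip(A, B)]
    cov = sum(ya * yb for ya, yb in zip(yA, yABi)) / n - mean * (sum(yABi) / n)
    S.append(cov / var)
```

For an additive model the first-order indices sum to one, which gives a built-in sanity check on the Monte Carlo estimate; interaction effects, which the report says PSUADE computes efficiently, would show up as a shortfall in that sum.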

Tong, C H; Graziani, F R

2007-02-02

148

NMR methodologies in the analysis of blueberries.  

PubMed

An NMR analytical protocol based on complementary high- and low-field measurements is proposed for blueberry characterization. Untargeted NMR metabolite profiling of blueberry aqueous and organic extracts, as well as targeted NMR analysis focused on anthocyanins and other phenols, are reported. Bligh-Dyer and microwave-assisted extractions were carried out and compared, showing a better recovery of the lipidic fraction with the microwave procedure. Water-soluble metabolites belonging to different classes such as sugars, amino acids, organic acids, and phenolic compounds, as well as metabolites soluble in organic solvent such as triglycerides, sterols, and fatty acids, were identified. Five anthocyanins (malvidin-3-glucoside, malvidin-3-galactoside, delphinidin-3-glucoside, delphinidin-3-galactoside, and petunidin-3-glucoside) and 3-O-α-L-rhamnopyranosyl quercetin were identified in the solid phase extract. The water status of fresh and withered blueberries was monitored by portable NMR and fast-field-cycling NMR. ¹H depth profiles, T2 transverse relaxation times, and dispersion profiles were found to be sensitive to withering. PMID:24668393
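Relaxation times such as the T2 values mentioned above are typically extracted by fitting an exponential decay to the measured echo signal. A minimal sketch on synthetic noiseless data (S0, T2, and the time grid are arbitrary, not values from the study):

```python
import math

# Simulated echo-decay curve S(t) = S0 * exp(-t / T2),
# fitted by ordinary least squares on ln S (log-linearization).
S0_true, T2_true = 100.0, 45.0        # arbitrary units; T2 in ms
times = [float(t) for t in range(0, 200, 10)]
signal = [S0_true * math.exp(-t / T2_true) for t in times]

# Fit ln S = ln S0 - t / T2.
logs = [math.log(s) for s in signal]
n = len(times)
t_mean = sum(times) / n
l_mean = sum(logs) / n
slope = (sum((t - t_mean) * (l - l_mean) for t, l in zip(times, logs))
         / sum((t - t_mean) ** 2 for t in times))
T2_fit = -1.0 / slope                  # T2 from the fitted slope
S0_fit = math.exp(l_mean - slope * t_mean)
```

With noisy data a weighted or nonlinear fit is preferable, since log-linearization inflates the weight of the small late-time points; the sketch only shows the shape of the estimation problem.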

Capitani, Donatella; Sobolev, Anatoly P; Delfini, Maurizio; Vista, Silvia; Antiochia, Riccarda; Proietti, Noemi; Bubici, Salvatore; Ferrante, Gianni; Carradori, Simone; De Salvador, Flavio Roberto; Mannina, Luisa

2014-06-01

149

Integrated Analysis of Mechanical and Thermal Hydraulic Behavior of Graphite Stack in Channel-Type Reactors in Case of a Fuel Channel Rupture Accident  

SciTech Connect

The paper discusses the methodology and a computational exercise analyzing the processes taking place in the graphite stack of an RBMK reactor in case of a pressure tube rupture caused by overheating. The methodology of the computational analysis is implemented in the integrated code U_STACK, which models thermal-hydraulic and mechanical processes in the stack with a varying geometry, coupled with the processes going on in the circulation loop and the accident localization (confinement) system. Coolant parameters, cladding and pressure tube temperatures, pressure tube ballooning and rupture, and coolant outflow are calculated for a given accident scenario. Fluid parameters, movement of graphite blocks, and bending of adjacent pressure tubes after the tube rupture are calculated for the whole volume of the core. Calculations also cover additional loads on adjacent fuel channels in the rupture zone, the reactor shell, and the upper and lower plates. The impossibility of an induced pressure tube rupture is confirmed. (authors)

Soloviev, Sergei L. [MINATOM, Moscow (Russian Federation); Gabaraev, Boris A.; Novoselsky, Oleg Yu.; Filinov, Vladimir N. [Research and Development Institute of Power Engineering, M. Krasnoselskaya ul., build. 2/8, 107140 Moscow (Russian Federation); Parafilo, Leonid M.; Kruchkov, Dmitry V. [Institute of Physics and Power Engineering, 1 Bondarenko sq., RU-249020 Obninsk Kaluga Region (Russian Federation); Melikhov, Oleg I. [Electrogorsk Research and Engineering Center, Saint Constantine st., 6, Electrogorsk, Moscow Region, 142530 (Russian Federation)

2002-07-01

150

‘Doing’ health policy analysis: methodological and conceptual reflections and challenges  

PubMed Central

The case for undertaking policy analysis has been made by a number of scholars and practitioners. However, much less attention has been given to how to do policy analysis: what research designs, theories or methods best inform it. This paper begins by looking at the health policy environment, and some of the challenges to researching this highly complex phenomenon. It focuses on research in middle and low income countries, drawing on some of the frameworks and theories, methodologies and designs that can be used in health policy analysis, giving examples from recent studies. The implications of case studies and of temporality in research design are explored. Attention is drawn to the roles of the policy researcher and the importance of reflexivity and researcher positionality in the research process. The final section explores ways of advancing the field of health policy analysis with recommendations on theory, methodology and researcher reflexivity. PMID:18701552

Walt, Gill; Shiffman, Jeremy; Schneider, Helen; Murray, Susan F; Brugha, Ruairi; Gilson, Lucy

2008-01-01

151

Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion  

SciTech Connect

Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel, and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps, outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis, and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

2012-09-30

152

RBMK Safety Analysis in Accidents Initiated by Partial Ruptures of the Circulation Circuit  

SciTech Connect

The paper gives an analysis of the current state of RBMK safety evaluation in accidents initiated by partial ruptures of the delivery part of the circulating loop. It appears from this analysis that the applicability and uncertainty of the international RELAP code for RBMK safety analysis could not be determined up to the present. At the same time, it is shown in the paper that the fuel rod cladding temperature can reach the acceptability criterion in these accidents. As a result, it has been concluded that the basis of the next stage of RBMK safety analysis should be the creation of a code oriented to the special features of the RBMK reactor. (authors)

Dostov, Anatoly I.; Kramerov, Alexander Ja. [Russian Research Center - RRC, Kurchatov Institute Kurchatov Square, 46 Moscow 123182 (Russian Federation)

2002-07-01

153

THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT  

SciTech Connect

Surplus plutonium-bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of the DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for a long term of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.
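To make the fire scenario concrete, a first-order lumped-capacitance energy balance shows how a package shell approaches the fire temperature during a facility fire. Every material and heat-transfer parameter below is an illustrative assumption, not 9975 design data; only the 1500 F fire temperature and the 86 figure come from the text above.

```python
import math

# Lumped-capacitance estimate of package shell heat-up in a facility fire:
#   m*c*dT/dt = h*A*(T_fire - T)
# All parameter values (mass, area, heat-transfer coefficient, specific
# heat) are illustrative assumptions, not 9975 design data.
T_fire = (1500.0 - 32.0) * 5.0 / 9.0   # 1500 F converted to ~815.6 C
T0 = 25.0        # initial shell temperature, C
h = 10.0         # effective heat-transfer coefficient, W/m2K (assumed)
A = 2.0          # exposed surface area, m2 (assumed)
m = 200.0        # shell mass, kg (assumed)
c = 500.0        # specific heat, J/kgK (assumed)
t = 86 * 60.0    # fire duration, s (assuming the "86" above is minutes)

# Closed-form solution: exponential approach to the fire temperature.
tau = m * c / (h * A)                       # thermal time constant, s
T_end = T_fire + (T0 - T_fire) * math.exp(-t / tau)
print(round(tau), round(T_end, 1))
```

With these assumed numbers the shell reaches only part of the way to the fire temperature within the fire duration, which is the kind of margin argument the thermal analysis formalizes with detailed models.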

Gupta, N.

2011-02-14

154

Landscape Equivalency Analysis: Methodology for Estimating Spatially Explicit Biodiversity Credits  

E-print Network

… the potential to contribute significantly to biodiversity conservation or loss. Roughly 80% of endangered … (referred to as sprawl) has been shown to be a better predictor of biodiversity loss than the rate …

Lupi, Frank

155

STATUS OF PRESSURIZED WATER REACTOR REACTIVITY ACCIDENT ANALYSIS  

Microsoft Academic Search

The construction of pressurized water reactors with large excess reactivities has made possible accidents with fast power excursions. A knowledge of the fastest power transient that a pressurized water reactor is capable of withstanding without destructive effects is an important factor in its safety aspects. Possible methods of attack are defined for three categories of …

L. Geller; J. Cagnetta; R. Dobbs

1958-01-01

156

Application of RELAP/SCDAPSIM and COCOSYS Codes for Severe Accident Analysis in RBMK-1500 Reactor  

SciTech Connect

Despite their low probability of occurrence, severe accident phenomena are investigated for all types of nuclear reactors in the world because the consequences of such an accident could be catastrophic. Most of this research is performed for the prevailing vessel-type light water reactors such as PWRs and BWRs. Less research is performed for channel-type reactors such as CANDUs and RBMKs, as they are operated in just a few countries. Up to now, the phenomena that could occur in case of a severe accident in RBMK reactors have not been analysed in detail, and little literature is available on this topic. The paper presents one of the first integrated analyses of a severe accident in the RBMK-1500 reactor. The RELAP/SCDAPSIM code is used to simulate the phenomena in the reactor core and reactor cooling system, and the COCOSYS code is used to simulate the confinement phenomena during the same accident scenario. The performed analysis provided information regarding code acceptability for severe accident analysis in RBMK reactors, assessment of the timing of the key events, i.e. core uncovery, fuel cladding rupture, etc., and assessment of hydrogen distribution in the confinement. (authors)

Urbonavicius, E.; Uspuras, E.; Rimkevicius, S.; Kaliatka, A. [Lithuanian Energy Institute, Breslaujos g. 3, LT-44403 Kaunas (Lithuania)

2006-07-01

157

Siting MSW landfills with a spatial multiple criteria analysis methodology.  

PubMed

The present work describes a spatial methodology which comprises several methods from different scientific fields such as multiple criteria analysis, geographic information systems, spatial analysis and spatial statistics. The final goal of the methodology is to evaluate the suitability of the study region in order to optimally site a landfill. The initial step is the formation of the multiple criteria problem's hierarchical structure. The methodology then utilizes spatial analysis processes to create the evaluation criteria, which are mainly based on Greek and EU legislation, but also on international practice and practical guidelines. The relative importance weights of the evaluation criteria are estimated using the analytic hierarchy process. With the aid of the simple additive weighting method, the suitability for landfill siting of the study region is finally evaluated. The resulting land suitability is reported on a grading scale of 0-10, from least to most suitable areas. The last step is a spatial clustering process, which is performed in order to reveal the most suitable areas, allowing an initial ranking and selection of candidate landfill sites. The application of the presented methodology on the island of Lemnos in the North Aegean Sea (Greece) indicated that 9.3% of the study region is suitable for landfill siting with grading values greater than 9. PMID:15946837
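The two numerical steps named above, analytic-hierarchy-process weighting and simple additive weighting, can be sketched as follows. The pairwise comparison matrix, the criteria, and the grid-cell scores are invented for illustration and are not taken from the Lemnos study.

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical criteria (e.g.
# distance to settlements, hydrogeology, road access) on Saaty's 1-9
# scale; values are illustrative, not from the paper.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# AHP: criterion weights are the normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, principal].real)
w /= w.sum()

# Consistency ratio check (random index RI = 0.58 for n = 3).
n = A.shape[0]
ci = (eigvals.real[principal] - n) / (n - 1)
cr = ci / 0.58

# Simple additive weighting: suitability of each grid cell is the
# weighted sum of its normalized criterion scores (0-10 scale).
scores = np.array([
    [9.0, 7.0, 4.0],   # hypothetical cell 1
    [3.0, 8.0, 9.0],   # hypothetical cell 2
])
suitability = scores @ w
print(w, cr, suitability)
```

A comparison matrix is usually accepted when the consistency ratio is below about 0.1, which holds for the example matrix above.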

Kontos, Themistoklis D; Komilis, Dimitrios P; Halvadakis, Constantinos P

2005-01-01

158

Two methodologies for optical analysis of contaminated engine lubricants  

NASA Astrophysics Data System (ADS)

The performance, efficiency and lifetime of modern combustion engines depend significantly on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electromagnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, extend the drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant-object optical appearance, where an a priori known periodic structure of the object is distorted by the contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray-scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient, are newly proposed. Variations in the contaminant concentration and use of different contaminants lead to changes in these parameters measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectra, transfer function, coherence function, etc.) are used for the analysis of combined object-lubricant images. Both proposed methodologies utilize the comparison of measured parameters and calculated object shape-based and statistical characteristics for fresh and contaminated lubricants. The developed methodologies are verified experimentally, showing an ability to distinguish lubricant with 0%, 3%, 7% and 10% water and coolant contamination. This proves the potential applicability of the developed methodologies for on-line measurement, monitoring and control of the engine lubricant condition.
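The statistical branch of the methodology can be illustrated with a toy computation: the cross-correlation between a known periodic object profile and its appearance through a contaminated film. The sinusoidal profile and the noise model standing in for optical distortion are assumptions for illustration, not the paper's actual image data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference line profile of a periodic object (a known grid) as seen
# through fresh lubricant; shape and noise levels are illustrative.
x = np.linspace(0, 4 * np.pi, 500)
reference = np.sin(x)

def distorted(noise_level):
    """Profile seen through a contaminated film: the known periodic
    structure degraded by random scattering, modeled here as noise."""
    return np.sin(x) + noise_level * rng.standard_normal(x.size)

def similarity(a, b):
    """Peak of the normalized cross-correlation (1.0 = identical)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full").max() / a.size

clean = similarity(reference, distorted(0.05))   # lightly contaminated
dirty = similarity(reference, distorted(0.8))    # heavily contaminated
print(clean, dirty)  # contamination lowers the correlation peak
```

The drop in the correlation peak with increasing distortion is the kind of statistical characteristic the paper compares between fresh and contaminated lubricants.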

Aghayan, Hamid; Bordatchev, Evgueni; Yang, Jun

2012-01-01

159

Intelligent signal analysis methodologies for nuclear detection, identification and attribution  

NASA Astrophysics Data System (ADS)

Detection and identification of special nuclear materials can be fully performed with a radiation detector-spectrometer. Due to several physical and computational limitations, development of fast and accurate radioisotope identifier (RIID) algorithms is essential for automated radioactive source detection and characterization. The challenge is to identify individual isotope signatures embedded in an aggregation of spectral signatures. In addition, background and isotope spectra overlap, further complicating the signal analysis. These concerns are addressed, in this thesis, through a set of intelligent methodologies recognizing signature spectra and the background spectrum and, subsequently, identifying radionuclides. Initially, a method for detection and extraction of signature patterns is accomplished by means of fuzzy logic. The fuzzy logic methodology is applied to three types of radiation signal processing applications, where it exhibits a high positive detection rate, a low false alarm rate and very short execution time, while outperforming the maximum likelihood fitting approach. In addition, an innovative Pareto-optimal multiobjective fitting of gamma-ray spectra using evolutionary computing is presented. The methodology exhibits perfect identification while performing better than single-objective fitting. Lastly, an innovative kernel-based machine learning methodology was developed for estimating the natural background spectrum in gamma-ray spectra. The novelty of the methodology lies in the fact that it implements a data-based approach and does not require any explicit physics modeling. Results show that the kernel-based method adequately estimates the gamma background, but the algorithm's performance exhibits a strong dependence on the selected kernel.
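As a rough illustration of the kernel-based, physics-free background estimation idea, the sketch below smooths a synthetic gamma-ray spectrum with a wide Gaussian kernel (Nadaraya-Watson regression). The spectrum shape, peak location and bandwidth are all invented; this is a sketch of the data-driven principle, not the thesis's specific kernel method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic gamma-ray spectrum: a smooth, slowly varying background
# plus one Gaussian photopeak; all shapes and widths are illustrative.
channels = np.arange(512)
true_background = 200 * np.exp(-channels / 300)
peak = 80 * np.exp(-0.5 * ((channels - 250) / 6) ** 2)
counts = rng.poisson(true_background + peak).astype(float)

def kernel_background(y, bandwidth=40.0):
    """Nadaraya-Watson kernel regression with a wide Gaussian kernel:
    the bandwidth is chosen much broader than a photopeak, so the
    estimate follows the slow background and averages peaks away."""
    xs = np.arange(y.size)
    diffs = xs[:, None] - xs[None, :]
    weights = np.exp(-0.5 * (diffs / bandwidth) ** 2)
    return (weights * y).sum(axis=1) / weights.sum(axis=1)

est = kernel_background(counts)
residual = counts - est   # background-subtracted spectrum: peak remains
print(est[50], est[250], residual[250])
```

As the abstract notes, the quality of such an estimate depends strongly on the chosen kernel and bandwidth: too narrow and the estimate follows the peaks, too wide and it misses real background structure.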

Alamaniotis, Miltiadis

160

Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations  

SciTech Connect

This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

1996-12-01

161

Calculation notes in support of TWRS FSAR spray leak accident analysis  

Microsoft Academic Search

This document contains the detailed calculations that support the spray leak accident analysis in the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The consequence analyses in this document form the basis for the selection of controls to mitigate or prevent spray leaks throughout TWRS. Pressurized spray leaks can occur due to a breach in containment barriers along

Hall

1996-01-01

162

Radioactivity analysis following the Fukushima Dai-ichi nuclear accident.  

PubMed

A total of 118 samples were analyzed using HPGe γ-spectrometry. (131)I, (134)Cs, (137)Cs and (136)Cs were detected in aerosol air samples that were collected 22 days after the accident, with values of 1720 µBq m⁻³, 247 µBq m⁻³, 289 µBq m⁻³ and 23 µBq m⁻³, respectively. (131)I was detected in rainwater and soil samples and was also measurable in vegetables collected between April 2 and 13, 2011, with values ranging from 0.55 Bq kg⁻¹ to 2.68 Bq kg⁻¹. No (131)I was detected in milk, drinking water, seawater or marine biota samples. PMID:23685724
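Given the 22-day delay between the accident and sampling reported above, the measured (131)I activity can be decay-corrected back toward the release date with the standard half-life law. This is illustrative arithmetic only: real airborne concentrations also depend on transport and dilution, not just radioactive decay.

```python
# Decay-correcting a measured (131)I activity concentration using the
# radioactive decay law A = A0 * 2**(-t / T_half).
# The 22-day delay and the 1720 uBq/m3 aerosol value come from the
# abstract above; the 8.02-day half-life of (131)I is a standard figure.
T_HALF_I131 = 8.02          # half-life of (131)I, days
t = 22.0                    # days between the accident and sampling
A_measured = 1720.0         # measured concentration, uBq per cubic metre

A_at_release = A_measured * 2 ** (t / T_HALF_I131)
print(round(A_at_release))  # decay-corrected concentration, uBq/m3
```

Over 22 days (almost three half-lives) the measured (131)I activity is less than a seventh of its decay-corrected value, which is why short-lived iodine dominates early measurements and then disappears from later samples.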

Tuo, Fei; Xu, Cuihua; Zhang, Jing; Zhou, Qiang; Li, Wenhong; Zhao, Li; Zhang, Qing; Zhang, Jianfeng; Su, Xu

2013-08-01

163

Methodological aspects of exhaled breath condensate collection and analysis.  

PubMed

The collection and analysis of exhaled breath condensate (EBC) may be useful for the management of patients with chronic respiratory disease at all ages. It is a promising technique due to its apparent simplicity and non-invasiveness. EBC does not disturb an ongoing respiratory inflammation. However, the methodology remains controversial, as it is not yet standardized. The current diversity of the methods used to collect and preserve EBC, the analytical pitfalls and the high degree of within-subject variability are the main issues that hamper further development into a clinically useful technique. In order to facilitate the process of standardization, a simplified schematic approach is proposed. An update of available data identified open issues on EBC methodology. These issues were then classified into three separate conditions related to their influence before, during or after the condensation process: (1) pre-condenser conditions related to subject and/or environment; (2) condenser conditions related to condenser equipment; and (3) post-condenser conditions related to preservation and/or analysis. This simplified methodological approach highlights the potential influence of the many techniques used before, during and after condensation of exhaled breath. It may also serve as a methodological checklist for a more systematic approach to EBC research and development. PMID:22522968

Rosias, Philippe

2012-06-01

164

[Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].  

PubMed

The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that see these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, highlighting aspects experienced in the use of the tool. The descriptive analytical method was used, introducing the steps of the model. To illustrate contributions and/or difficulties, cases where the tool was used in the context of service were selected. MAPA integrates theoretical approaches that have already been tried in studies of accidents by providing useful conceptual support from the data collection stage through the conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design and safety management, involved in the accident. The main challenges lie in users' grasp of the concepts, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, and in intervening to change the determinants of these events. PMID:25388176

Almeida, Ildeberto Muniz de; Vilela, Rodolfo Andrade de Gouveia; Silva, Alessandro José Nunes da; Beltran, Sandra Lorena

2014-12-01

165

Preliminary accident analysis to support a passive depressurization systems design  

SciTech Connect

The new generation of evolutionary nuclear power plants, e.g., the Westinghouse AP600 and the General Electric simplified boiling water reactor, relies on a full reactor coolant system (RCS) depressurization to allow gravity injection from an in-containment tank and thereby assure long-term core cooling. Studies performed to support the licensing process and design of both evolutionary and innovative reactors have shown that cold water injection may, under particular plant conditions, induce a large plant depressurization. Preliminary studies have been performed to support the design of a passive injection and depressurization system (PIDS) based on the idea of depressurizing the RCS by mixing cold water with the RCS hot water and inducing steam condensation in the primary system. The analyses, performed with the RELAP5/MOD3 computer code, show the response of a typical midsize pressurized water reactor plant [two loops, 600 MW (electric)] equipped with the PIDS. Different RCS injection locations including pressurizer, vessel upper head, and hot leg, and actuation at different residual reactor coolant masses have been investigated. The PIDS performance has also been verified against the following reference severe accident scenarios: (a) complete station blackout event, and (b) a small-break loss-of-coolant accident and concomitant station blackout event.

Lenti, R.; Mansani, L.; Saiu, G. [Ansaldo, Genoa (Italy). Nuclear Div.

1996-05-01

166

How Root Cause Analysis Can Improve the Value Methodology  

SciTech Connect

Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether it is high cost/poor value, poor quality, or poor reliability. Once the most probable causes for these problems have been arrived at, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

Wixson, James Robert

2002-05-01

167

How Root Cause Analysis Can Improve the Value Methodology  

SciTech Connect

Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether it is high cost/poor value, poor quality, or poor reliability. Once the most probable causes for these problems have been arrived at, better solutions for improvement can be developed in the creativity phase because the team better understands the problems associated with these functions.

Wixson, J. R.

2002-02-05

168

RELAP5 Application to Accident Analysis of the NIST Research Reactor  

SciTech Connect

Detailed safety analyses have been performed for the 20 MW D{sub 2}O moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent analysis of the primary system is determined with a RELAP5 transient analysis model that includes the reactor vessel, the pump, heat exchanger, fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results has been conducted to evaluate minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and no damage to the fuel will occur because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.
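The CHFR post-processing step described above amounts to taking the minimum, over all channels and time steps, of the ratio of critical to actual heat flux. A minimal sketch follows; the critical-heat-flux function is a made-up placeholder, NOT the Sudo-Kaminaga correlation used in the actual analysis, and all numbers are invented.

```python
# Post-processing sketch: the minimum critical heat flux ratio (MCHFR)
# over a transient is the smallest ratio of critical to actual heat
# flux across all sampled channels and time steps.

def chf_placeholder(mass_flux, subcooling):
    """Hypothetical critical heat flux in kW/m2.  Placeholder only:
    a real analysis would use a validated correlation here."""
    return 500.0 + 0.4 * mass_flux + 2.0 * subcooling

def min_chfr(history):
    """history: iterable of (heat_flux, mass_flux, subcooling) samples
    taken from the hot channel over the simulated transient."""
    return min(chf_placeholder(g, dt) / q for q, g, dt in history)

# Hot-channel samples at three instants of a hypothetical transient:
# (actual heat flux kW/m2, mass flux kg/m2s, subcooling K).
samples = [(300.0, 1200.0, 40.0), (450.0, 900.0, 25.0), (380.0, 1000.0, 30.0)]
mchfr = min_chfr(samples)
print(round(mchfr, 2))
```

The safety argument in the abstract is exactly that this minimum ratio stays comfortably above the design limit throughout both accidents, so critical heat flux is never approached.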

Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.; Diamond, D.

2012-03-18

169

Risk analysis of releases from accidents during mid-loop operation at Surry  

SciTech Connect

Studies and operating experience suggest that the risk of severe accidents during low power operation and/or shutdown (LP/S) conditions could be a significant fraction of the risk at full power operation. Two studies have begun at the Nuclear Regulatory Commission (NRC) to evaluate the severe accident progression from a risk perspective during these conditions: one at Brookhaven National Laboratory for the Surry plant, a pressurized water reactor (PWR), and the other at Sandia National Laboratories for the Grand Gulf plant, a boiling water reactor (BWR). Each of the studies consists of three linked, but distinct, components: a Level 1 probabilistic risk analysis (PRA) of the initiating events, systems analysis, and accident sequences leading to core damage; a Level 2/3 analysis of accident progression, fuel damage, releases, containment performance, source term and consequences, both off-site and on-site; and a detailed Human Reliability Analysis (HRA) of actions relevant to plant conditions during LP/S operations. This paper summarizes the approach taken for the Level 2/3 analysis at Surry and provides preliminary results on the risk of releases and consequences for one plant operating state, mid-loop operation, during shutdown.

Jo, J.; Lin, C.C.; Nimnual, S.; Mubayi, V.; Neymotin, L.

1992-11-01

170

Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the /open quotes/Maximum Credible Accident/close quotes/ concept  

SciTech Connect

The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactivities of uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.
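The endpoint used in the example, foreseeable premature cancer deaths, is conventionally estimated by multiplying a collective dose by a nominal risk coefficient. The sketch below uses an assumed collective dose and the rough ICRP figure of about 5% fatal cancers per person-sievert; neither number comes from the report.

```python
# Illustrative arithmetic for the kind of endpoint an MCA analysis
# computes: expected premature cancer deaths from a collective dose,
# using a nominal risk coefficient of roughly 5% per person-sievert.
# The collective dose below is an assumed figure, not from the report.
RISK_PER_PERSON_SV = 0.05   # fatal cancers per person-sievert (nominal)
collective_dose = 12.0      # person-Sv received by the exposed population (assumed)

expected_deaths = round(collective_dose * RISK_PER_PERSON_SV, 3)
print(expected_deaths)
```

Because the MCA scenario is deliberately conservative at every step, a fractional expected-death figure computed this way is an upper bound rather than a prediction.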

Ricci, E.; McLean, R.B.

1988-09-01

171

TRAC-P1A developmental assessment. [Thermal-hydraulic analysis of LWR accidents  

Microsoft Academic Search

The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced best-estimate predictive capability for the analysis of postulated accidents in light water reactors. TRAC-P1A provides this analysis capability for pressurized water reactors and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel

J. C. Vigil; K. A. Williams

1979-01-01

172

Towards a Methodology for Identifying Program Constraints During Requirements Analysis  

NASA Technical Reports Server (NTRS)

Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Among the problems that exist in defining requirements in large-scale software projects are synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

1997-01-01

173

Fluid-structure interaction analysis of a hypothetical core disruptive accident in LMFBRs  

Microsoft Academic Search

To ensure safety, it is necessary to assess the integrity of the reactor vessel of a liquid-metal fast breeder reactor (LMFBR) under a hypothetical core disruptive accident (HCDA). Several important problems in the fluid-structure interaction analysis of an HCDA are discussed in the present paper. Various loading models of the HCDA are compared, and the polytropic process of ideal gas (PPIG) law is recommended.

Chuang Liu; Xiong Zhang; Ming-Wan Lu

2005-01-01

174

Why-Because Analysis of the Glenbrook, NSW Rail Accident and Comparison with Hopkins's Accimap  

E-print Network

Archived in the electronic archive, Pandora, at http://pandora.nla.gov.au/tep/47325. There had been a failure of a power … "halt" (the fail-safe position, instigated by the track-circuit system failure). The Indian Pacific …

Ladkin, Peter B.

175

Source term analysis for a criticality accident in metal production line glove boxes  

Microsoft Academic Search

A recent development in criticality accident analysis is the deterministic calculation of the transport of fission products and actinides through the barriers of the physical facility. Knowledge of the redistribution of the materials inside the facility will help determine re-entry and clean-up procedures. The amount of radioactive material released to the environment is the source term for dispersion

1991-01-01

176

Analysis of the hydrogeological consequences of hypothetical hazardous accidents at nuclear reactors  

Microsoft Academic Search

The major idea of this analysis is to incorporate the different models and approaches separately developed in nuclear technology and hydroscience for predicting the influence of hazardous accidents at nuclear reactors on groundwater quality. For this purpose, the possible channels for radioactivity releases out of nuclear power plants were considered, which allowed one to estimate the boundary conditions for

V. A. MIRONENKO; V. G. RUMYNIN

1994-01-01

177

Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 1: Sections 1-9  

SciTech Connect

This report documents the methodology, computational framework, and results of facility accident analyses performed for the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. The methodology is in compliance with the most recent guidance from DOE. It considers the spectrum of accident sequences that could occur in activities covered by the WM PEIS and uses a graded approach emphasizing the risk-dominant scenarios to facilitate discrimination among the various WM PEIS alternatives. Although it allows reasonable estimates of the risk impacts associated with each alternative, the main goal of the accident analysis methodology is to allow reliable estimates of the relative risks among the alternatives. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J. [and others]

1995-04-01

178

Analysis of Reactivity Induced Accident for Control Rods Ejection with Loss of Cooling  

E-print Network

Understanding the time-dependent behavior of the neutron population in a nuclear reactor in response to either a planned or unplanned change in reactor conditions is of great importance to the safe and reliable operation of the reactor. In the present work, the point kinetics equations are solved numerically using the stiffness confinement method (SCM). The solution is applied to the kinetics equations in the presence of different types of reactivity and is compared with different analytical solutions. The method is also used to analyze reactivity-induced accidents in two reactors, the first fueled by uranium and the second by plutonium. The analysis shows how negative temperature feedback counteracts the positive reactivity added by a control rod ejection, preventing damage to the reactor. Both the power and temperature pulses following the reactivity-initiated accidents are calculated. The results are compared with previous works and satisfactory agreement is found.
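
The point kinetics equations the abstract refers to can be illustrated with a one-delayed-group solver. This sketch uses implicit Euler rather than the paper's stiffness confinement method, and the kinetics parameters are generic illustrative values, not those of the uranium- or plutonium-fueled reactors analysed.

```python
# One-delayed-group point kinetics: dn/dt = ((rho - beta)/Lam) n + lam c,
# dc/dt = (beta/Lam) n - lam c. Implicit Euler handles the stiffness; this is
# an illustrative stand-in for the stiffness confinement method of the paper.
BETA = 0.0065   # delayed-neutron fraction (generic value)
LAM_C = 0.08    # precursor decay constant, 1/s (generic value)
GEN = 1e-4      # neutron generation time, s (generic value)

def solve_point_kinetics(rho, t_end, dt=1e-4):
    """Relative power n(t_end) after a constant reactivity step rho."""
    n = 1.0
    c = BETA / (GEN * LAM_C)          # equilibrium precursor concentration
    # Implicit Euler: each step solves a 2x2 linear system for (n_new, c_new);
    # the matrix is constant because rho and dt are constant.
    a11 = 1.0 - dt * (rho - BETA) / GEN
    a12 = -dt * LAM_C
    a21 = -dt * BETA / GEN
    a22 = 1.0 + dt * LAM_C
    det = a11 * a22 - a12 * a21
    for _ in range(int(round(t_end / dt))):
        n, c = (n * a22 - a12 * c) / det, (a11 * c - a21 * n) / det
    return n
```

A positive step below prompt critical (rho < BETA) produces the familiar prompt jump followed by a slow delayed-neutron rise; a negative step decays toward a lower power level.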

Hend Mohammed El Sayed Saad; Hesham Mohammed Mohammed Mansour; Moustafa Aziz Abd El Wahab

2013-06-05

179

Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices  

SciTech Connect

This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

Not Available

1988-12-15

180

76 FR 30139 - Federal Need Analysis Methodology for the 2012-2013 Award Year  

Federal Register 2010, 2011, 2012, 2013

...DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2012-2013 Award Year AGENCY...revision of the Federal Need Analysis Methodology for the 2012-2013 award year...84.379]. Federal Need Analysis Methodology for the 2012-2013 award year;...

2011-05-24

181

Two non-probabilistic methods for uncertainty analysis in accident reconstruction.  

PubMed

There are many uncertain factors in traffic accidents, so it is necessary to study their influence in order to improve the accuracy and confidence of accident reconstruction results. It is difficult to evaluate the uncertainty of the calculated results when the expression of the reconstruction model is implicit and/or the distributions of the independent variables are unknown. Based on interval mathematics, convex models and design of experiments, two non-probabilistic methods were proposed. These methods are efficient under conditions where existing uncertainty analysis methods can hardly work, namely when the accident reconstruction model is implicit and/or the distributions of the independent variables are unknown; parameter sensitivities can also be obtained from them. An accident case is investigated with the proposed methods. Results show that the convex models method is the most conservative, and the solution of the interval analysis method is very close to those of the other methods. These two methods are a beneficial supplement to the existing uncertainty analysis methods. PMID:20207512
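
The interval-analysis idea can be illustrated with a minimal interval-arithmetic sketch. The skid-to-stop formula v = sqrt(2·mu·g·d) is a standard textbook reconstruction model used here for illustration; it is not the (implicit) model analysed in the paper, and the Interval class implements only the operations this example needs.

```python
import math

# Minimal interval arithmetic for uncertainty propagation in accident
# reconstruction: each uncertain input is a [lo, hi] interval, and the
# output interval bounds the result over all admissible input values.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __mul__(self, other):
        # Product interval: extremes occur at endpoint combinations
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def sqrt(self):
        # Monotone on non-negative intervals
        return Interval(math.sqrt(self.lo), math.sqrt(self.hi))

def skid_speed(mu, d, g=9.81):
    """Interval bound on pre-braking speed (m/s): v = sqrt(2*mu*g*d)."""
    return (Interval(2 * g, 2 * g) * mu * d).sqrt()

# Friction coefficient 0.6-0.8, skid length 20-25 m (illustrative intervals)
v = skid_speed(Interval(0.6, 0.8), Interval(20.0, 25.0))
```

The resulting interval is guaranteed to enclose the true speed whenever the inputs lie in their intervals, without assuming any probability distributions, which is the point of the non-probabilistic approach.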

Zou, Tiefang; Yu, Zhi; Cai, Ming; Liu, Jike

2010-05-20

182

SOCRAT: The System of Codes for Realistic Analysis of Severe Accidents  

SciTech Connect

A computer code for the analysis of severe accidents has long been under development in the Russian Federation. The main feature distinguishing this code from known severe accident codes such as MELCOR and ASTEC is its consistent implementation of a mechanistic approach to modeling melt progression, including beyond-design-basis accidents with severe core damage. The motivation for the development stems from new design requirements for the safety of nuclear power plants with improved economic factors, from the modernization of existing NPPs, and from the development of guidance for accident management and emergency planning. Realistic assessments of nuclear power plant safety require best-estimate codes capable of describing the melt progression processes accompanying a severe accident at the nuclear installation and the behavior of the containment under abnormal conditions (in particular, the rates of steam and hydrogen release and the relocation of molten materials to the concrete cavity after failure of the reactor vessel). The developed computer codes were used for the safety justification of NPPs with the new generation of VVER-type reactors, such as the Tianwan NPP in China and the Kudankulam NPP in India. In particular, this code system was used to justify the hydrogen safety system and to analyze core degradation and the relocation of the molten core to the core catcher, which guarantees localization of the melt and prevents ex-vessel melt progression. The system of codes, recently named SOCRAT, provides a self-consistent analysis of in-vessel processes and of the processes running in the containment, including the melt localization device. In the paper the structure of the computer code SOCRAT is presented, the functionality of its separate parts is described, and results of the verification of different models of the code are considered. (authors)

Bolshov, Leonid; Strizhov, Valery [Nuclear Safety Institute, Russian Academy of Sciences, B. Tulskaya, 52 Moscow, 115191 (Russian Federation)]

2006-07-01

183

Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots.  

PubMed

Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from 2002 to 2006 for their overall causes as well as factors specific to low hours pilots. Fifty-nine categories of pilot-related accident causation emerged, which were formed into progressively larger categories until four overall human factors groups were arrived at: 'judgement'; 'handling'; 'strategy'; 'attention'. 'Handling' accounted for a significantly higher proportion of injuries than other categories. Inexperienced pilots had considerably more accidents in all categories except 'strategy'. Approach control (path judgement, airbrake and speed handling) as well as landing flare misjudgement were chiefly responsible for the high accident rate in early solo glider pilots. PMID:19629815

Jarvis, Steve; Harris, Don

2009-08-01

184

SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES  

SciTech Connect

Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

Coutts, D

2007-04-17

185

SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident  

NASA Astrophysics Data System (ADS)

On March 11th 2011 a high-magnitude earthquake and the consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in duration and extent. After scram was initiated at all power stations affected by the earthquake, diesel generators began operating as designed until the tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to station blackout in units 1 to 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules that account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools, and we demonstrate how this work can benefit nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study of the alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.

Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

2014-06-01

186

Analysis of Severe Accident Scenarios in APR-1400 Using the MAAP4 Code  

SciTech Connect

An analysis of containment environments during a postulated severe accident was performed for a pressurized water reactor (PWR) type nuclear power plant (NPP) with an electric power of 1400 MWe. Four initiating events were examined: large-break loss-of-coolant accident (LBLOCA), small-break loss-of-coolant accident (SBLOCA), total loss of feedwater (TLOFW) and station blackout (SBO). These events were selected based on their risk dominance. The accident progression is divided into four phases in accordance with the phenomena occurring in the reactor and containment. Several scenarios were established in order to obtain the most severe conditions in each phase. More than a dozen scenarios were analyzed in the present analysis, and 10 parameters were closely examined: maximum core temperature, gas temperatures at the core exit and hot leg, pressurizer (PZR) pressure and temperature, the pressure, temperature and hydrogen concentration of each compartment of the containment building, in-containment refueling water storage tank (IRWST) level, and gas temperature in the reactor cavity. The maximum temperature and hydrogen concentration were found to vary with the initiating events and compartment locations. (authors)

Jeong, Ji Hwan [Dept of Environmental System, Cheonan College of Foreign Studies, Anseo-dong, Cheonan, Choongnam, 330-705 (Korea, Republic of)]; Na, M.G.; Kim, S.P. [Dept. of Nuclear Engineering, Chosun University, Susuk-dong, Dong-gu, Gwangju, 501-825 (Korea, Republic of)]; Park, Jong Woon [Korea Electric Power Research Institute, Moonji-dong, Yusong-gu, Taejon, 305-380 (Korea, Republic of)]

2002-07-01

187

77 FR 31600 - Federal Need Analysis Methodology for the 2013-2014 Award Year: Federal Pell Grant, Federal...  

Federal Register 2010, 2011, 2012, 2013

...DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2013-2014 Award Year: Federal...statutory ``Federal Need Analysis Methodology'' to determine a student's expected...tables used in the Federal Need Analysis Methodology EFC calculations. Section 478 of...

2012-05-29

188

Thermal-hydraulic analysis of accidents leading to local coolant flow decrease in the main circulation circuit of RBMK-1500  

Microsoft Academic Search

There are a few transient and loss-of-coolant accident conditions in RBMK-1500 reactors that lead to a local flow decrease in the fuel channels. Because a coolant flow decrease in the fuel channels (FC) leads to overheating of the fuel claddings and pressure-tube walls, mitigation measures are necessary. The accident analysis enabled the suggestion of the new early reactor scram actuation and emergency

A Kaliatka; E Uspuras

2002-01-01

189

Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
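
The sampling-and-screening machinery described above can be sketched with pure-Python Latin hypercube sampling and a Spearman rank-correlation screen, a simpler stand-in for the partial correlation and stepwise regression of the study. The three-parameter dose model below is invented for illustration; it is not the MACCS food-pathway model.

```python
import random

# Latin hypercube sampling + rank-correlation sensitivity screen, in the
# spirit of (but much simpler than) the MACCS uncertainty study.
random.seed(1)

def lhs(n_samples, n_vars):
    """Latin hypercube sample on [0,1]^n_vars: one point per stratum per variable."""
    cols = []
    for _ in range(n_vars):
        strata = [(i + random.random()) / n_samples for i in range(n_samples)]
        random.shuffle(strata)
        cols.append(strata)
    return list(zip(*cols))

def rank_corr(xs, ys):
    """Spearman rank correlation (assumes no ties, true for LHS columns)."""
    def ranks(v):
        order = sorted(range(len(v)), key=v.__getitem__)
        r = [0] * len(v)
        for rank, idx in enumerate(order):
            r[idx] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

samples = lhs(200, 3)
# Invented dose model: dominated by x0, weakly affected by x1, ignores x2.
doses = [10 * x0 + 2 * x1 + 0 * x2 for x0, x1, x2 in samples]
sens = [rank_corr([s[i] for s in samples], doses) for i in range(3)]
```

Ranking the inputs by |sens| reproduces the study's basic workflow: identify which imprecisely known variables dominate the uncertainty in the predicted consequence.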

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)]; Johnson, J.D.; Rollstin, J.A. [GRAM, Inc., Albuquerque, NM (United States)]; Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)]

1995-01-01

190

Overview of Sandia National Laboratories and Khlopin Radium Institute collaborative radiological accident consequence analysis efforts  

SciTech Connect

In January, 1995 a collaborative effort to improve radiological consequence analysis methods and tools was initiated between the V.G. Khlopin Institute (KRI) and Sandia National Laboratories (SNL). The purpose of the collaborative effort was to transfer SNL's consequence analysis methods to KRI and identify opportunities for collaborative efforts to solve mutual problems relating to the safety of radiochemical facilities. A second purpose was to improve SNL's consequence analysis methods by incorporating the radiological accident field experience of KRI scientists (e.g. the Chernobyl and Kyshtym accidents). The initial collaborative effort focused on the identification of: safety criteria that radiochemical facilities in Russia must meet; analyses/measures required to demonstrate that safety criteria have been met; and data required to complete the analyses/measures identified to demonstrate the safety basis of a facility.

Young, M.L.; Carlson, D.D. [Sandia National Labs., Albuquerque, NM (United States)]; Lazarev, L.N.; Petrov, B.F.; Romanovskiy, V.N. [V.G. Khlopin Radium Inst., St. Petersburg (Russian Federation)]

1997-05-01

191

A New Methodology for Frequency Domain Analysis of Wave Energy Converters with Periodically Varying Physical Parameters  

E-print Network

A New Methodology for Frequency Domain Analysis of Wave Energy Converters with Periodically Varying Physical Parameters … the methodology allows a large number of design iterations, including both physical design and control variables

University of Victoria

192

Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H  

SciTech Connect

This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J. [and others]

1995-04-01

193

[Analysis of radiation-hygienic and medical consequences of the Chernobyl accident].  

PubMed

More than 25 years have passed since the day of the Chernobyl accident in 1986. Fourteen subjects of the Russian Federation, with a total area of more than 50 thousand km2, where 1.5 million people now reside, were exposed to radioactive contamination. A system for the comprehensive evaluation of the radiation doses of the population affected by the Chernobyl accident, including 11 guidance documents, has now been created. Methodological support is provided for assessing the average annual, accumulated and predicted radiation doses of the population and its critical groups, as well as doses to the thyroid gland. The relevance of analysing the consequences of the Chernobyl accident is demonstrated by the events in Japan at the Fukushima-1 nuclear power plant. In 2011-2012, comprehensive maritime expeditions were carried out under the auspices of the Russian Geographical Society with the participation of relevant ministries and agencies and the leading academic institutions of Russia. In 2012, work was carried out on radiation protection of the population from the potential transboundary impact of the accident at the Japanese nuclear power plant Fukushima-1. The results provide a basis for a favorable outlook for the radiation environment in the Russian Far East and on the Pacific coast of Russia. PMID:24340594

Onishchenko, G G

2013-01-01

194

Development of an engineering methodology for thermal analysis of protected structural members in fire   

E-print Network

In order to overcome the limitations of existing methodologies for thermal analysis of protected structural members in fire, a novel CFD-based methodology has been developed. This is a generalised quasi-3D approach with ...

Liang, Hong; Welch, Stephen

195

Cross-database analysis to identify relationships between aircraft accidents and incidents  

Microsoft Academic Search

Air transportation systems are designed to ensure that aircraft accidents are rare events. To minimize these accidents, the factors causing or contributing to them must be understood and prevented. Despite many efforts by the aviation safety community, accident rates have been stable for decades. One explanation could be that direct and obvious causes that previously occurred with

Zohreh Nazeri

2007-01-01

196

A flammability and combustion model for integrated accident analysis. [Advanced light water reactors]  

SciTech Connect

A model for flammability characteristics and combustion of hydrogen and carbon monoxide mixtures is presented for application to severe accident analysis of Advanced Light Water Reactors (ALWR's). Flammability of general mixtures for thermodynamic conditions anticipated during a severe accident is quantified with a new correlation technique applied to data for several fuel and inertant mixtures and using accepted methods for combining these data. Combustion behavior is quantified by a mechanistic model consisting of a continuity and momentum balance for the burned gases, and considering an uncertainty parameter to match the idealized process to experiment. Benchmarks against experiment demonstrate the validity of this approach for a single recommended value of the flame flux multiplier parameter. The models presented here are equally applicable to analysis of current LWR's. 21 refs., 16 figs., 6 tabs.

Plys, M.G.; Astleford, R.D.; Epstein, M. (Fauske and Associates, Inc., Burr Ridge, IL (USA))

1988-01-01

197

Human and organisational factors in maritime accidents: analysis of collisions at sea using the HFACS.  

PubMed

Over the last decade, the shipping industry has implemented a number of measures aimed at improving its safety level (such as new regulations or new forms of team training). Despite this evolution, shipping accidents, and particularly collisions, remain a major concern. This paper presents a modified version of the Human Factors Analysis and Classification System, which has been adapted to the maritime context and used to analyse human and organisational factors in collisions reported by the Marine Accident and Investigation Branch (UK) and the Transportation Safety Board (Canada). The analysis shows that most collisions are due to decision errors. At the precondition level, it highlights the importance of the following factors: poor visibility and misuse of instruments (environmental factors), loss of situation awareness or deficit of attention (conditions of operators), deficits in inter-ship communications or Bridge Resource Management (personnel factors). At the leadership level, the analysis reveals the frequent planning of inappropriate operations and non-compliance with the Safety Management System (SMS). The Multiple Accident Analysis provides an important finding concerning three classes of accidents. Inter-ship communications problems and Bridge Resource Management deficiencies are closely linked to collisions occurring in restricted waters and involving pilot-carrying vessels. Another class of collisions is associated with situations of poor visibility, in open sea, and shows deficiencies at every level of the socio-technical system (technical environment, condition of operators, leadership level, and organisational level). The third class is characterised by non-compliance with the SMS. This study shows the importance of Bridge Resource Management for situations of navigation with a pilot on board in restricted waters. 
It also points out the necessity to investigate, for situations of navigation in open sea, the masters' decisions in critical conditions as well as the causes of non-compliance with SMS. PMID:23764875

Chauvin, Christine; Lardjane, Salim; Morel, Gaël; Clostermann, Jean-Pierre; Langard, Benoît

2013-10-01

198

Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States)]; Johnson, J.D. [GRAM, Inc., Albuquerque, NM (United States)]; McKay, M.D. [Los Alamos National Lab., NM (United States)]; Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)]

1995-01-01

199

A new methodology for fractured reservoir analysis and modelling  

SciTech Connect

Dynamic modelling of fractured reservoirs involves a conceptual double-porosity medium equivalent to the real medium. Geometric modelling of the discrete fracture network has become a classical approach for deriving the parameters which characterize this equivalent double-porosity medium. These models are often based on a statistical description of the fracture geometry, deduced from observations along wells and cores. Unfortunately, it is now obvious that fracturing is a highly non-stationary process. Paleo-stresses, paleo-strains and lithology variations at the scale of the reservoir govern the spatial distribution of the fracture network characteristics. We are developing a new methodology for studying fractured reservoirs, based on a deterministic prediction of these non-stationarities and constrained by sub-surface and outcrop data. It is supported by new software in which: geostatistical stratigraphic modelling is used to predict possible locations of jointed units; horizon topography maps from 3D seismic data are used for curved-space analysis, which gives information on the location and orientation of fold-related joints; fractal analysis is used to generate sub-seismic faults extrapolated from the network of major faults; geomechanical modelling and analogue modelling yield the stress and strain history, which is interpreted in terms of fracture attributes. These approaches yield 3D maps of fracture attributes at the scale of the reservoir, which are used for zoning the reservoir in terms of "fracture facies". This methodology is being applied to the Garreth El Gueffoul structure in the Ahnet desert, Algeria.

Cacas, M.C.; Letouzey, J.

1995-08-01

200

Uncertainty Analysis of Accident Notification Time and Emergency Medical Service Response Time in Work Zone Traffic Accidents  

Microsoft Academic Search

Taking into account the uncertainty caused by exogenous factors, the accident notification time (ANT) and the emergency medical service (EMS) response time are modeled as two random variables following the lognormal distribution. Their mean values and standard deviations are formulated as functions of environmental variables including crash time, road type, weekend, holiday, light condition, weather and work zone type.
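
The lognormal model sketched in the abstract can be illustrated as follows, with the log-mean and log-standard-deviation written as simple linear functions of two scenario covariates. The covariates chosen and all coefficients are invented placeholders, not the fitted values of the study.

```python
import math
import random

# Sketch of a lognormal model for accident notification time (ANT), where the
# parameters of log(ANT) depend on scenario covariates, as in the abstract.
# Covariates and coefficients below are illustrative assumptions.
random.seed(2)

def lognormal_ant(night, work_zone_type):
    """Draw one ANT (minutes); mu and sigma vary linearly with covariates."""
    mu = 1.2 + 0.4 * night + 0.2 * work_zone_type  # mean of log(ANT)
    sigma = 0.5 + 0.1 * night                      # std of log(ANT)
    return random.lognormvariate(mu, sigma)

# The mean of a lognormal is exp(mu + sigma^2/2); verify by Monte Carlo.
draws = [lognormal_ant(night=1, work_zone_type=0) for _ in range(20000)]
mc_mean = sum(draws) / len(draws)
theory = math.exp(1.6 + 0.6 ** 2 / 2)
```

The same construction applies to the EMS response time; fitting the linear coefficients to observed crash records is the estimation step the paper carries out.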

Qiang Meng; Jinxian Weng

2012-01-01

201

Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D.; Rollstin, J.A. [Gram, Inc., Albuquerque, NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

202

Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11--14)  

SciTech Connect

This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

Whitehead, D. [Sandia National Labs., Albuquerque, NM (United States); Darby, J. [Science and Engineering Associates, Inc., Albuquerque, NM (United States); Yakle, J. [Science Applications International Corp., Albuquerque, NM (United States)] [and others

1994-06-01

203

Space Shuttle Columbia Post-Accident Analysis and Investigation  

NASA Technical Reports Server (NTRS)

Although the loss of the Space Shuttle Columbia and its crew was tragic, the circumstances offered a unique opportunity to examine a multitude of components which had experienced one of the harshest environments ever encountered by engineered materials: a breakup at a velocity in excess of Mach 18 and an altitude exceeding 200,000 feet (63 km), resulting in a debris field 645 miles (1,038 km) long and 10 miles (16 km) wide. Various analytical tools were employed to ascertain the sequence of events leading to the disintegration of the Orbiter and to characterize the features of the debris. The testing and analyses all indicated that a breach in a left wing reinforced carbon/carbon composite leading edge panel was the access point for hot gases generated during re-entry to penetrate the structure of the vehicle and compromise the integrity of the materials and components in that area of the Shuttle. The analytical and elemental testing utilized such techniques as X-Ray Diffraction (XRD), Energy Dispersive X-Ray (EDX) dot mapping, Electron Micro Probe Analysis (EMPA), and X-Ray Photoelectron Spectroscopy (XPS) to characterize the deposition of intermetallics adjacent to the suspected location of the plasma breach in the leading edge of the left wing, Fig. 1.

McDanels, Steven J.

2006-01-01

204

Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots.  

PubMed

Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation, but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from 2002 to 2006 for their overall causes as well as factors specific to low-hours pilots. Fifty-nine categories of pilot-related accident causation emerged, which were formed into progressively larger categories until four overall human factors groups were arrived at: 'judgement'; 'handling'; 'strategy'; 'attention'. 'Handling' accounted for a significantly higher proportion of injuries than other categories. Inexperienced pilots had considerably more accidents in all categories except 'strategy'. Approach control (path judgement, airbrake and speed handling) as well as landing flare misjudgement were chiefly responsible for the high accident rate in early solo glider pilots. Statement of Relevance: This paper uses extant accident data to produce a taxonomy of underlying human factors causes to analyse gliding accidents and identify the specific causes associated with low-hours pilots. From this, specific, well-targeted remedial measures can be identified. PMID:20099182

Jarvis, Steve; Harris, Don

2010-02-01

205

Impact of traffic congestion on road accidents: a spatial analysis of the M25 motorway in England.  

PubMed

Traffic congestion and road accidents are two external costs of transport, and the reduction of their impacts is often one of the primary objectives for transport policy makers. The relationship between traffic congestion and road accidents, however, is not apparent and has been less studied. It is speculated that there may be an inverse relationship between traffic congestion and road accidents, and as such this poses a potential dilemma for transport policy makers. This study aims to explore the impact of traffic congestion on the frequency of road accidents using a spatial analysis approach, while controlling for other relevant factors that may affect road accidents. The M25 London orbital motorway, divided into 70 segments, was chosen to conduct this study, and relevant data on road accidents, traffic and road characteristics were collected. A robust technique has been developed to map M25 accidents onto its segments. Since existing studies have often used a proxy to measure the level of congestion, this study has employed a precise congestion measurement. A series of Poisson-based non-spatial (such as Poisson-lognormal and Poisson-gamma) and spatial (Poisson-lognormal with conditional autoregressive priors) models have been used to account for the effects of both heterogeneity and spatial correlation. The results suggest that traffic congestion has little or no impact on the frequency of road accidents on the M25 motorway. All other relevant factors have provided results consistent with existing studies. PMID:19540969
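A minimal sketch of the non-spatial core of such an analysis: a Poisson regression of segment accident counts, fitted by iteratively reweighted least squares. The 70-segment data, covariate names, and coefficients are simulated for illustration; the overdispersion and spatial (CAR-prior) extensions used in the paper are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data for 70 motorway segments: a congestion index, log
# traffic flow, and segment length as covariates of accident counts.
n = 70
congestion = rng.uniform(0, 1, n)
log_flow = rng.normal(10, 0.3, n)
length_km = rng.uniform(1, 4, n)
X = np.column_stack([np.ones(n), congestion, log_flow, np.log(length_km)])

# Simulated truth: congestion has zero effect, echoing the paper's finding.
eta = -8.0 + 0.0 * congestion + 0.8 * log_flow + 1.0 * np.log(length_km)
y = rng.poisson(np.exp(eta))

# Poisson GLM fitted by iteratively reweighted least squares (IRLS).
beta = np.zeros(X.shape[1])
for _ in range(50):
    mu = np.exp(X @ beta)
    W = mu                      # Poisson variance equals the mean
    z = X @ beta + (y - mu) / mu  # working response
    beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))

print(beta)  # beta[1], the congestion coefficient, should sit near zero
```

The fitted congestion coefficient hovering near zero is the toy analogue of the study's "little or no impact" conclusion.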

Wang, Chao; Quddus, Mohammed A; Ison, Stephen G

2009-07-01

206

Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands); Grupa, J.B. [Netherlands Energy Research Foundation (Netherlands)

1997-12-01

207

Preliminary analysis of graphite dust releasing behavior in accident for HTR  

SciTech Connect

The behavior of graphite dust is important to the safety of High Temperature Gas-cooled Reactors. This study investigated the flow of graphite dust in the helium mainstream. Analysis of the stresses acting on the graphite dust indicated that gas drag plays the dominant role. Based on this understanding, an experimental system was set up to study dust release behavior under accident conditions. Air driven by a centrifugal fan is used as the working fluid instead of helium, because helium is expensive and prone to leakage, which makes it difficult to seal. Graphite particles with the same size distribution as in an HTR are added to the experimental loop. The graphite dust release behavior during a loss-of-coolant accident will be investigated using a sonic nozzle. (authors)

Peng, W.; Yang, X. Y.; Yu, S. Y.; Wang, J. [Inst. of Nuclear and New Energy Technology, Tsinghua Univ., Beijing 100084 (China)

2012-07-01

208

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1998-04-01

209

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

210

Loss-of-coolant-accident analysis for the Ignalina RBMK-1500 Nuclear Power Plant  

SciTech Connect

A loss-of-coolant-accident (LOCA) analysis has been completed for the Ignalina nuclear power plant (INPP) located in northeastern Lithuania near the borders of Latvia and Belarus. The INPP site has two RBMK-1500 reactors; the RBMK-1500 is a boiling water, graphite-moderated, pressure tube reactor with the capability of producing up to 4800 MW(thermal). Currently, the power level of INPP is limited to 4200 MW(thermal); thus, the analysis results presented in this paper have been obtained for an initial power level of 4200 MW(thermal).

Klefbom, S. [Royal Inst. of Technology, Stockholm (Sweden); Shier, W. [Brookhaven National Laboratory, Upton, NY (United States)

1996-12-31

211

Calculation notes in support of TWRS FSAR spray leak accident analysis  

SciTech Connect

This document contains the detailed calculations that support the spray leak accident analysis in the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The consequence analyses in this document form the basis for the selection of controls to mitigate or prevent spray leaks throughout TWRS. Pressurized spray leaks can occur due to a breach in containment barriers along transfer routes during waste transfers. Spray leaks are of particular safety concern because, depending on leak dimensions and waste pressure, they can be relatively efficient generators of dispersible-sized aerosols that can transport downwind to onsite and offsite receptors. Waste is transferred between storage tanks and between processing facilities and storage tanks in TWRS through a system of buried transfer lines. Pumps for transferring waste and jumpers and valves for rerouting waste are located inside below-grade pits and structures that are normally covered. Pressurized spray leaks can emanate to the atmosphere due to breaches in waste transfer associated equipment inside these structures should the structures be uncovered at the time of the leak. Pressurized spray leaks can develop through holes or cracks in transfer piping, valve bodies or pump casings caused by such mechanisms as corrosion, erosion, thermal stress, or water hammer. Leaks through degraded valve packing, jumper gaskets, or pump seals can also result in pressurized spray releases. Mechanisms that can degrade seals, packing and gaskets include aging, radiation hardening, thermal stress, etc. Another common cause of spray leaks inside transfer enclosures is misaligned jumpers caused by human error. A spray leak inside a DST valve pit during a transfer of aging waste was selected as the bounding, representative accident for detailed analysis. Sections 2 through 5 below develop this representative accident using the DOE-STD-3009 format.
Section 2 describes the unmitigated and mitigated accident scenarios evaluated to determine the need for safety class SSCs or TSR controls. Section 3 develops the source terms associated with the unmitigated and mitigated accident scenarios. Section 4 estimates the radiological and toxicological consequences for the unmitigated and mitigated scenarios. Section 5 compares the radiological and toxicological consequences against the TWRS evaluation guidelines. Section 6 extrapolates from the representative accident case to other represented spray leak sites to assess the conservatism in using the representative case to define controls for other postulated spray leak sites throughout TWRS. Section 7 discusses the sensitivities of the consequence analyses to the key parameters and assumptions used in the analyses. Conclusions are drawn in Section 8. The analyses herein pertain to spray leaks initiated due to internal mechanisms (e.g., corrosion, erosion, thermal stress, etc.). External initiators of spray leaks (e.g., excavation accidents) and natural phenomena initiators (e.g., seismic events) are to be covered in separate accident analyses.

Hall, B.W.

1996-09-25

212

Impact of the demerit point system on road traffic accident mortality in Spain  

Microsoft Academic Search

Background: To assess the effect of the Demerit Point System (DPS), introduced in Spain on 1 July 2006, on the number of fatalities due to road traffic accidents, using a methodology that controls for the seasonal variation and trend in the data series. Methods: Time-series analysis by ARIMA models of 29 113 fatalities in road traffic accidents (at the accident scene or within
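The intervention-analysis idea can be sketched on simulated monthly fatality data: a trend, seasonal dummies, and a step term switching on at July 2006. For simplicity the sketch fits by ordinary least squares rather than modelling the errors as ARIMA; every series value here is invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical monthly road-fatality series, 2000-2009 (120 months),
# with seasonality, a downward trend, and a step drop at month index 78
# (July 2006), when the demerit point system came into force.
t = np.arange(120)
month = t % 12
step = (t >= 78).astype(float)  # DPS in force
true = 300 - 0.5 * t + 20 * np.sin(2 * np.pi * month / 12) - 40 * step
y = true + rng.normal(0, 10, 120)

# Regression with trend, the intervention step, and monthly dummies.
# (A full analysis would model the errors as ARIMA; this sketch uses OLS.)
season = np.eye(12)[month]  # one-hot month dummies
X = np.column_stack([t, step, season])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

effect = beta[1]
print(effect)  # estimated change in monthly fatalities after the DPS
```

Controlling for trend and season is what lets the step coefficient isolate the intervention, rather than crediting the DPS with the pre-existing decline.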

José Pulido; Pablo Lardelli; Luis de la Fuente; Víctor M Flores; Fernando Vallejo; Enrique Regidor

2010-01-01

213

SAS4A: A computer model for the analysis of hypothetical core disruptive accidents in liquid metal reactors  

SciTech Connect

To ensure that the public health and safety are protected under any accident conditions in a Liquid Metal Fast Breeder Reactor (LMFBR), many accidents are analyzed for their potential consequences. The SAS4A code system, described in this paper, provides such an analysis capability, including the ability to analyze low probability events such as the Hypothetical Core Disruptive Accidents (HCDAs). The SAS4A code system has been designed to simulate all the events that occur in a LMFBR core during the initiating phase of a Hypothetical Core Disruptive Accident. During such postulated accident scenarios as the Loss-of-Flow and Transient Overpower events, a large number of interrelated physical phenomena occur during a relatively short time. These phenomena include transient heat transfer and hydrodynamic events, coolant boiling and fuel and cladding melting and relocation. Due to the strong neutronic feedback present in a nuclear reactor, these events can significantly influence the reactor power. The SAS4A code system is used in the safety analysis of nuclear reactors, in order to estimate the energetic potential of very low probability accidents. The results of SAS4A simulations are also used by reactor designers in order to build safer reactors and eliminate the possibility of any accident which could endanger the public safety.

Tentner, A.M.; Birgersson, G.; Cahalan, J.E.; Dunn, F.E.; Kalimullah; Miles, K.J.

1987-01-01

214

Investigation of Human Factors in UAV Accidents Based on Analysis of Statistical Data  

Microsoft Academic Search

Human errors are held responsible for over 65% of accidents in more than one hundred years of manned aviation history. To evaluate the role of human factors related to accidents of unmanned aerial vehicles (UAVs), a sample data of 56 US Army UAV accidents was used in this study, out of which 32 were related to accidents of Hunter UAV

Manzoor M. Nasir; Qin Shi-Yin

2011-01-01

215

Contemporary Analysis of Variability in Road Traffic Accidents in Lagos State, Nigeria  

Microsoft Academic Search

The purpose of this study is to examine the spatial and temporal pattern of road traffic accidents in Lagos State, Nigeria, and to suggest preventive and corrective safety measures for reducing road traffic accidents in the study area. Road traffic accidents exert a huge burden on Nigeria's economy and health care services and current accident prevention interventions are sporadic, uncoordinated

Augustus O. Atubi; Patience C. Onokala

2009-01-01

216

Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool  

SciTech Connect

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR.

Madni, I.K. [Brookhaven National Lab., Upton, NY (United States); Eltawila, F. [Nuclear Regulatory Commission, Washington, DC (United States)

1994-01-01

217

Verification of fire and explosion accident analysis codes (facility design and preliminary results)  

SciTech Connect

For several years, the US Nuclear Regulatory Commission has sponsored the development of methods for improving capabilities to analyze the effects of postulated accidents in nuclear facilities; the accidents of interest are those that could occur during nuclear materials handling. At the Los Alamos National Laboratory, this program has resulted in three computer codes: FIRAC, EXPAC, and TORAC. These codes are designed to predict the effects of fires, explosions, and tornadoes in nuclear facilities. Particular emphasis is placed on the movement of airborne radioactive material through the gaseous effluent treatment system of a nuclear installation. The design, construction, and calibration of an experimental ventilation system to verify the fire and explosion accident analysis codes are described. The facility features a large industrial heater and several aerosol smoke generators that are used to simulate fires. Both injected thermal energy and aerosol mass can be controlled using this equipment. Explosions are simulated with H2/O2 balloons and small explosive charges. Experimental measurements of temperature, energy, aerosol release rates, smoke concentration, and mass accumulation on HEPA filters can be made. Volumetric flow rate and differential pressures also are monitored. The initial experiments involve varying parameters such as thermal and aerosol rate and ventilation flow rate. FIRAC prediction results are presented. 10 figs.

Gregory, W.S.; Nichols, B.D.; Talbott, D.V.; Smith, P.R.; Fenton, D.L.

1985-01-01

218

Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool  

SciTech Connect

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor (LWR) nuclear power plants and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories. Brookhaven National Laboratory (BNL) has a program with the NRC called MELCOR Verification, Benchmarking, and Applications, the aim of which is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both boiling water reactors and pressurized water reactors. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). A summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR is presented.

Madni, I.K. [Brookhaven National Lab., Upton, NY (United States)

1995-11-01

219

Routes to failure: Analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system  

Microsoft Academic Search

The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents; however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring

Wen-Chin Li; Don Harris; Chung-San Yu

2008-01-01

220

Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design  

NASA Astrophysics Data System (ADS)

Previous studies of the safety and environmental aspects of the HYLIFE-II inertial fusion energy power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work, computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) have been applied for simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here we consider a severe loss of coolant accident (LOCA) in conjunction with simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the confinement) and of the two barriers surrounding the chamber (inner shielding and confinement building itself). Even though confinement failure would be a very unlikely event, it would be needed in order to produce significant off-site doses. The CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics and fusion product transport and release. The results of these calculations show that the estimated off-site dose is less than 5 mSv (0.5 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.
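As a rough illustration of the long-term decay-heat temperature transients that codes like CHEMCON compute, the lumped-capacitance sketch below integrates an assumed decay-power law against a passive heat-loss term. Every parameter value is illustrative, not a HYLIFE-II figure, and the single-node model stands in for the real multi-region wall/blanket/shield calculation.

```python
# Toy lumped-capacitance transient: a structure heated by decaying
# afterheat and cooled passively. All parameters are invented.
C = 5.0e7      # heat capacity of the structure, J/K
h = 2.0e3      # effective heat-loss coefficient, W/K
T_amb = 300.0  # ambient temperature, K
P0 = 1.0e6     # decay power at shutdown, W

dt = 60.0      # time step, s
T = 600.0      # initial structure temperature, K
history = []
for step in range(10_000):               # roughly seven days
    t = step * dt
    P = P0 * (1.0 + t / 10.0) ** -0.2    # assumed decay-heat power law
    T += dt * (P - h * (T - T_amb)) / C  # explicit Euler energy balance
    history.append(T)

print(max(history))  # peak temperature stays bounded as the afterheat decays
```

The qualitative point survives the simplification: whether the peak stays below a damage limit depends on the race between the decaying source term and the passive loss path.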

Reyes, S.; Latkowski, J. F.; Gomez del Rio, J.; Sanz, J.

2001-05-01

221

Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.  

PubMed

The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. PMID:23973170

Underwood, Peter; Waterson, Patrick

2014-07-01

222

Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident  

SciTech Connect

An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.

Aldrich, D.C.; Blond, R.M.

1980-01-01

223

A comparative analysis of methodologies for database schema integration  

Microsoft Academic Search

One of the fundamental principles of the database approach is that a database allows a nonredundant, unified representation of all data managed in an organization. This is achieved only when methodologies are available to support integration across organizational and application boundaries. Methodologies for database design usually perform the design activity by separately producing several schemas, representing parts of the application,

Carlo Batini; Maurizio Lenzerini; Shamkant B. Navathe

1986-01-01

224

Modeling methodology for supply chain synthesis and disruption analysis  

NASA Astrophysics Data System (ADS)

The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer-driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of a synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
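The reachability idea can be illustrated with a breadth-first search over a toy disruption-state graph; the state names and transitions below are invented, and a real analysis would operate on the full state space of the hierarchical model.

```python
from collections import deque

# Toy supply-chain state graph: nodes are system states, edges are
# possible transitions. Names are illustrative only.
edges = {
    "normal": ["supplier_delay"],
    "supplier_delay": ["inventory_buffer", "stockout"],
    "inventory_buffer": ["normal"],
    "stockout": ["lost_sales"],
    "lost_sales": [],
}

def reachable(start, target):
    """Breadth-first search: can `target` be reached from `start`?"""
    seen, frontier = {start}, deque([start])
    while frontier:
        node = frontier.popleft()
        if node == target:
            return True
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return False

print(reachable("normal", "lost_sales"))  # True: the disruption can propagate
```

Asking whether a bad state is reachable from normal operation, and along which paths, is exactly the question reachability analysis answers for disruption propagation.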

Wu, Teresa; Blackhurst, Jennifer

2004-11-01

225

The Analysis of PWR SBO Accident with RELAP5 Based on Linux  

NASA Astrophysics Data System (ADS)

RELAP5 is a relatively advanced light water reactor transient hydraulic and thermal analysis code, and it is of significance to the safe operation of a nuclear reactor system when safety analysis and operating simulation of the system are performed with RELAP5. A RELAP5 operating mode based on the Linux operating system is presented in this paper, which uses the Linux system's powerful text-processing capabilities to extract valid data directly from the RELAP5 output files, and takes advantage of the system's scripting capabilities to improve the plotting functions of RELAP5. When run under the Linux system, the precision of the calculated results is preserved and the computation time is shortened. In this work, a PWR Station Blackout (SBO) accident was computed with RELAP5 under both Linux and Windows. Comparison and analysis of the accident response curves of the main parameters, such as reactor power and the average temperature and pressure of the primary loop, show that operating analysis of the nuclear reactor system with RELAP5 under Linux is safe and reliable.
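The output post-processing described above can be sketched as extracting a time/pressure table from a plant-transient printout. The column layout shown is hypothetical; the actual RELAP5 output format differs, so the parsing logic is illustrative only.

```python
import re

# Hypothetical fragment of a transient printout; the real RELAP5
# output format differs, so this layout and regex are illustrative.
output = """\
 time (s)   p-loop (Pa)   tavg (K)
  0.000     1.550e+07     565.2
 10.000     1.480e+07     571.8
 20.000     1.350e+07     583.4
"""

rows = []
for line in output.splitlines()[1:]:  # skip the header row
    fields = re.split(r"\s+", line.strip())
    if len(fields) == 3:
        rows.append(tuple(float(f) for f in fields))

# Extract the pressure history for plotting the SBO response curve.
times = [r[0] for r in rows]
pressures = [r[1] for r in rows]
print(times)      # [0.0, 10.0, 20.0]
print(pressures)  # [15500000.0, 14800000.0, 13500000.0]
```

This is the kind of filtering the abstract attributes to Linux text tools; a shell pipeline of grep and awk would accomplish the same extraction.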

Xia, Zhimin; Zhang, Dafa

226

Analysis of dose distribution for heavily exposed workers in the first criticality accident in Japan.  

PubMed

The first criticality accident in Japan occurred in a uranium processing plant in Tokai-mura on September 30, 1999. The accident, which occurred while a large amount of enriched uranyl nitrate solution was being loaded into a tank, led to a chain reaction that continued for 20 h. Two workers who were pouring the uranium solution into the tank at the time were heterogeneously exposed to neutrons and gamma rays produced by nuclear fission. Analysis of dose distributions was essential for the understanding of the clinical course observed in the skin and organs of these workers. We developed a numerical simulation system, which consists of mathematical human models and Monte Carlo radiation transport programs, for analyzing dose distributions in various postures and applied the system to the dose analysis for the two workers. This analysis revealed the extreme heterogeneity of the doses from neutrons and gamma rays in the skin and body, which depended on the positions and postures of the workers. The detailed dose analysis presented here using color maps is indispensable for an understanding of the biological effects of high-dose exposure to a mixed field of neutrons and gamma rays as well as for the development of emergency treatments for victims of radiation exposure. PMID:12643798
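A one-dimensional toy Monte Carlo conveys how depth-dose heterogeneity is tallied from sampled particle free paths; real analyses such as the one above track full 3-D phantom geometry, scattering physics, and coupled neutron/gamma transport. The attenuation coefficient here is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D Monte Carlo: particles enter a slab, free paths are sampled
# from an exponential distribution, and first collisions are tallied
# into depth bins to form a crude depth-dose profile.
mu = 0.2           # assumed attenuation coefficient, 1/cm
depth_cm = 30.0    # slab thickness
n_particles = 100_000

depths = rng.exponential(1.0 / mu, n_particles)  # sampled free paths
hits = depths[depths < depth_cm]                 # collisions inside the slab
dose, edges = np.histogram(hits, bins=30, range=(0, depth_cm))

# The first-collision tally falls off roughly exponentially with depth,
# a one-dimensional shadow of the dose heterogeneity discussed above.
print(dose[:3])
```

Even this toy shows why doses to a worker's near side and far side can differ by orders of magnitude, which is the heterogeneity the full simulation maps in 3-D.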

Endo, Akira; Yamaguchi, Yasuhiro

2003-04-01

227

A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test  

NASA Astrophysics Data System (ADS)

According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Yet the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method of ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT its crew conducted is discussed. The risk analysis methodology consists of three different approaches whose integration constitutes the overall framework. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the discrepancies between the two test procedures.
The second approach is a conceptual risk assessment framework for analyzing the causal factors of the mismatches identified in the previous step, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the developed conceptual framework and to analyze the impact of different decision making biases on negative pressure test results. Consistent with the findings of previous studies, the analysis of the developed conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. Notably, the organizational factors captured in the introduced conceptual framework are not specific to the scope of the NPT. Most of them have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas operations as well as high-risk operations in other industries. In addition, the proposed rational decision making model introduces a quantitative structure for analyzing the results of a conducted NPT. This model provides a structure and some parametrically derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test. Moreover, it enables analysts to assess different decision making biases involved in the process of interpreting a conducted negative pressure test as well as the root organizational factors of those biases.
In general, although the proposed integrated research methodology in this dissertation is developed for the risk assessment of human and organizational factors contributions in negative pressure test misinterpretation, it can be generalized and be potentially useful for other well control situations, both offshore and onshore; e.g. fracking. In addition, this methodology can be applied for the analysis
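The cut-off point idea behind such a decision model can be illustrated with a toy Bayesian accept/reject rule: compute the probability that the well is sealed given the observed pressure build-up during the test, and accept the test only if that probability clears a threshold derived from misclassification costs. All distributions, parameter values, and costs below are invented for illustration; they are not the dissertation's formulas:

```python
import math

def posterior_integrity(reading, prior=0.9,
                        mu_ok=0.0, sigma_ok=50.0,
                        mu_leak=500.0, sigma_leak=200.0):
    """Posterior probability that the well is sealed, given an observed
    pressure build-up (psi) during the negative pressure test.
    Gaussian likelihoods and all numbers are illustrative assumptions."""
    def norm_pdf(x, mu, s):
        return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
    l_ok = norm_pdf(reading, mu_ok, sigma_ok)
    l_leak = norm_pdf(reading, mu_leak, sigma_leak)
    return prior * l_ok / (prior * l_ok + (1 - prior) * l_leak)

def accept_test(reading, cost_false_accept=100.0, cost_false_reject=1.0):
    """Accept the NPT only if the posterior clears a cost-derived cut-off:
    a false accept (declaring a leaking well sound) is costed far higher
    than a false reject."""
    cutoff = cost_false_accept / (cost_false_accept + cost_false_reject)
    return posterior_integrity(reading) >= cutoff
```

With these invented numbers, a near-zero build-up is accepted and a 500 psi build-up is rejected; the cut-off value moves as the assumed costs change, which is the lever the rational decision model formalizes.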

Tabibzadeh, Maryam

228

Methods for Detector Placement and Analysis of Criticality Accident Alarm Systems  

SciTech Connect

Determining the optimum placement to minimize the number of detectors for a criticality accident alarm system (CAAS) in a large manufacturing facility is a complex problem. There is typically a target for the number of detectors that can be used over a given zone of the facility. A study to optimize detector placement typically begins with some initial guess at the placement of the detectors and is followed by either predictive calculations of accidents at specific locations or adjoint calculations based on preferred detector locations. Within an area of a facility, there may be a large number of potential criticality accident sites. For any given placement of the detectors, the list of accident sites can be reduced to a smaller number of locations at which accidents may be difficult to detect. Developing the initial detector placement and determining the list of difficult accident locations are both based on the practitioner's experience. Simulations following fission particles released from an accident location are called 'forward calculations.' These can be used to answer the question of where an alarm would be triggered by an accident at a specified location. Conversely, 'adjoint calculations' start at a detector site, use the detector response function as a source, and essentially run in reverse. These can be used to answer the question of where an accident would be detected by a specified detector location. If the number of accidents, P, is much less than the number of detectors, Q, then forward simulations may be more convenient and less time-consuming. If Q is large or the detectors are not yet placed, then a mesh tally of the dose observed by a detector at any location must be computed over the entire zone. If Q is much less than P, then adjoint calculations may be more efficient.
Adjoint calculations employing a mesh tally can be even more advantageous because they do not rely on a list of specific difficult-to-detect accident sites, which may not have included every possible accident location. Analog calculations (no biasing) simply follow particles naturally. For sparse buildings and line-of-sight calculations, analog Monte Carlo (MC) may be adequate. For buildings with internal walls or large amounts of heavy equipment (dense geometry), variance reduction may be required. Calculations employing the CADIS method use a deterministic calculation to create an importance map and a matching biased source distribution that optimize the final MC to quickly calculate one specific tally. Calculations employing the FW-CADIS method use two deterministic calculations (one forward and one adjoint) to create an importance map and a matching biased source distribution that are designed to make the MC calculate a mesh tally with more uniform uncertainties in both high-dose and low-dose areas. Depending on the geometry of the problem, the number of detectors, and the number of accident sites, different approaches to CAAS placement studies can be taken. These are summarized in Table I. SCALE 6.1 contains the MAVRIC sequence, which can be used to perform any of the forward-based approaches outlined in Table I. For analog calculations, MAVRIC simply calls the Monaco MC code. For CADIS and FW-CADIS, MAVRIC uses the Denovo discrete ordinates (SN) deterministic code to generate the importance map and biased source used by Monaco. An adjoint capability is currently being added to Monaco and should be available in the next release of SCALE. An adjoint-based approach could be performed with Denovo alone - although fine meshes, large amounts of memory, and long computation times may be required to obtain accurate solutions. Coarse-mesh SN simulations could be employed for adjoint-based scoping studies until the adjoint capability in Monaco is complete. 
CAAS placement studies, especially those dealing with mesh tallies, require some extra utilities to aid in the analysis. Detectors must receive a minimum dose rate in order to alarm; therefore, a simple yes/no plot could be more useful to the analyst.
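The bookkeeping behind a forward-based placement study reduces to a coverage check: given forward-calculated doses at each candidate detector for each postulated accident, flag the accident sites that would not trip any detector, and choose the cheaper calculation direction from the counts P and Q. A small sketch (the dose numbers and threshold are invented; the transport results themselves would come from a code such as MAVRIC/Monaco):

```python
def uncovered_accidents(dose, threshold):
    """dose[i][j]: dose rate at detector j from accident i (forward results).
    Returns the indices of accident sites that would not trip any detector."""
    return [i for i, row in enumerate(dose)
            if all(d < threshold for d in row)]

def cheaper_direction(n_accidents, n_detectors):
    """Rule of thumb from the discussion above: forward runs scale with the
    number of accident sites P, adjoint runs with the number of detectors Q."""
    return "forward" if n_accidents <= n_detectors else "adjoint"

dose = [[0.5, 0.02],   # accident 0: detector 0 alarms
        [0.01, 0.03]]  # accident 1: no detector reaches the 0.1 threshold
```

A yes/no alarm plot of the kind mentioned above is exactly this thresholding applied over a mesh tally instead of a short list of sites.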

Peplow, Douglas E. [ORNL] [ORNL; Wetzel, Larry [Babcock & Wilcox Nuclear Operations Group Inc.] [Babcock & Wilcox Nuclear Operations Group Inc.

2012-01-01

229

A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?  

NASA Technical Reports Server (NTRS)

In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports, and compares the method of analysis proposed therein to the methods used today.

Holloway, C. M.; Johnson, C. W.

2007-01-01

230

The development of a more risk-sensitive and flexible airport safety area strategy: Part II. Accident location analysis and airport risk assessment case studies  

Microsoft Academic Search

This two-part paper presents the development of an improved airport risk assessment methodology aimed at assessing risks related to aircraft accidents at and in the vicinity of airports and managing Airport Safety Areas (ASAs) as a risk mitigation measure. The improved methodology is more quantitative, risk-sensitive, flexible and transparent than standard risk assessment approaches. As such, it contributes to the

D. K. Y. Wong; D. E. Pitfield; R. E. Caves; A. J. Appleyard

2009-01-01

231

Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident  

SciTech Connect

The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)
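The run-count side of a SUSA-type uncertainty analysis is usually sized with Wilks' order-statistics formula, which is independent of the number of uncertain parameters (378 here or otherwise). A one-line sketch of the first-order, one-sided case:

```python
import math

def wilks_sample_size(gamma=0.95, beta=0.95):
    """Smallest number of code runs n such that the largest observed output
    bounds the gamma-quantile with confidence beta (first-order, one-sided
    Wilks formula): the smallest n with 1 - gamma**n >= beta."""
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))
```

`wilks_sample_size()` returns 59, the familiar 95%/95% figure; tightening the confidence level to 99% raises it to 90. Whether a given study uses this first-order bound or a higher-order variant is an assumption here, not stated in the abstract.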

Pasichnyk, I.; Perin, Y.; Velkov, K. [Gesellschaft für Anlagen- und Reaktorsicherheit - GRS mbH, Boltzmannstrasse 14, 85748 Garching bei Muenchen (Germany)

2013-07-01

232

NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug  

NASA Technical Reports Server (NTRS)

A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately.
The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a cleavage-type failure. For the accident case, the predicted failure load for the right rear lug from the PFA is greater than 1.98 times the limit load of the lugs.

Raju, Ivatury S; Glaessgen, Edward H.; Mason, Brian H; Krishnamurthy, Thiagarajan; Davila, Carlos G

2005-01-01

233

Fire accident analysis modeling in support of non-reactor nuclear facilities at Sandia National Laboratories  

SciTech Connect

The Department of Energy (DOE) requires that fire hazard analyses (FHAs) be conducted for all nuclear and new facilities, with results for the latter incorporated into Title I design. For those facilities requiring safety analysis documentation, the FHA shall be documented in the Safety Analysis Reports (SARs). This paper provides an overview of the methodologies and codes being used to support FHAs at Sandia facilities, with emphasis on SARs.

Restrepo, L.F.

1993-06-01

234

Bayesian data analysis of severe fatal accident risk in the oil chain.  

PubMed

We analyze the risk of severe accidents causing five or more fatalities for nine different activities covering the entire oil chain. Included are exploration and extraction, transport by different modes, refining, and final end use in power plants, heating, or gas stations. The risks are quantified separately for OECD and non-OECD countries, and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model yielding analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto) as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data, which results in high uncertainties, in particular for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation. In addition, it inherently delivers a measure of uncertainty. This approach provides a framework that comprehensively covers risk throughout the oil chain, allowing the allocation of risk in sustainability assessments. It also permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis. PMID:22642363
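The two ingredients named above can be sketched with conjugate pieces: a Gamma-Poisson update for accident frequency, which naturally pools counts from several data sets, and the Generalized Pareto survival function for severity above a threshold. This single-level sketch omits the paper's hierarchical structure, and the prior parameters are arbitrary placeholders:

```python
def posterior_rate(counts, exposures, alpha0=0.5, beta0=0.0):
    """Gamma-Poisson update for an accident frequency (events per unit of
    exposure, e.g. per GW-year). Summing counts and exposures across data
    sets is the simplest form of pooling; alpha0/beta0 form a vague prior."""
    alpha = alpha0 + sum(counts)
    beta = beta0 + sum(exposures)
    return alpha, beta, alpha / beta  # posterior shape, rate, mean

def gpd_exceedance(x, xi, sigma):
    """Generalized Pareto survival function P(X > x) for an excess x over
    the threshold, with shape xi > 0 and scale sigma (heavy-tailed case)."""
    return (1.0 + xi * x / sigma) ** (-1.0 / xi)
```

Multiplying the posterior frequency by a GPD exceedance probability gives the rate of accidents above a chosen severity, which is how frequency and severity pieces combine in models of this family.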

Eckle, Petrissa; Burgherr, Peter

2013-01-01

235

Development of integrated core disruptive accident analysis code for FBR - ASTERIA-FBR  

SciTech Connect

Evaluating the consequences of severe accidents is the most important safety licensing issue for the reactor core of a liquid metal cooled fast breeder reactor (LMFBR), since the LMFBR core is not in an optimum condition from the viewpoint of reactivity. This characteristic might induce a super-prompt criticality due to core geometry change during a core disruptive accident (CDA). Previous CDA analysis codes were modeled in separate phases depending on the mechanism driving the super-prompt criticality, and the subsequent events were calculated by connecting different codes. This scheme, however, introduces uncertainty and/or arbitrariness into the calculation results. To resolve these issues and obtain consistent calculation results without arbitrariness, JNES is developing the ASTERIA-FBR code for the purpose of providing a cross-check analysis code, which is another required scheme in the safety licensing procedure to confirm the validity of the evaluation results prepared by applicants, for the planned high-performance core of Monju. ASTERIA-FBR consists of three major calculation modules: CONCORD, dynamic-GMVP, and FEMAXI-FBR. CONCORD is a three-dimensional thermal-hydraulics calculation module with a multi-phase, multi-component, multi-velocity-field model. Dynamic-GMVP is a space-time neutronics calculation module. FEMAXI-FBR calculates the fuel pellet deformation behavior and fuel pin failure behavior. This paper describes the needs behind ASTERIA-FBR development, outlines the major modules, and summarizes the model validation status. (authors)

Ishizu, T.; Endo, H.; Tatewaki, I.; Yamamoto, T. [Japan Nuclear Energy Safety Organization JNES, Toranomon Towers Office, 4-1-28, Toranomon, Minato-ku, Tokyo (Japan); Shirakawa, N. [Inst. of Applied Energy IAE, Shimbashi SY Bldg., 14-2 Nishi-Shimbashi 1-Chome, Minato-ku, Tokyo (Japan)

2012-07-01

236

MELCOR Analysis of Steam Generator Tube Creep Rupture in Station Blackout Severe Accident  

SciTech Connect

A pressurized water reactor steam generator tube rupture (SGTR) is of concern because it represents a bypass of the containment for radioactive materials to the environment. In a station blackout accident, tube integrity could be threatened by creep rupture, particularly if cracks are present in the tube walls. Methods are developed herein to improve assessment capabilities for SGTR by using the severe-accident code MELCOR. Best-estimate assumptions based on recent research and computational fluid dynamics calculations are applied in the MELCOR analysis to simulate two-dimensional natural circulation and to determine the relative creep-rupture timing in the reactor coolant pressure boundary components. A new method is developed to estimate the steam generator (SG) hottest tube wall temperature and the tube critical crack size for the SG tubes to fail first. The critical crack size for SG tubes to fail first is estimated to be 20% of the wall thickness larger than by a previous analysis. Sensitivity studies show that the failure sequence would change if some assumptions are modified. In particular, the uncertainty in the countercurrent flow limit model could reverse the failure sequence of the SG tubes and surge line.
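Creep-rupture timing of the kind assessed here is commonly estimated by combining a Larson-Miller correlation with the life-fraction damage rule; a sketch under invented material constants (the actual MELCOR analysis uses its own creep models and tube material data, so every number below is a placeholder):

```python
def rupture_time_hours(T_kelvin, lmp, C=20.0):
    """Invert the Larson-Miller relation LMP = T * (C + log10(t_r)) for the
    creep-rupture time t_r. C and the LMP value depend on the tube material
    and stress; the values used in the tests are illustrative only."""
    return 10.0 ** (lmp / T_kelvin - C)

def creep_damage(history, lmp, C=20.0):
    """Life-fraction rule: accumulate dt / t_r over a (dt_hours, T_kelvin)
    temperature history; rupture is predicted when the index reaches 1.0."""
    return sum(dt / rupture_time_hours(T, lmp, C) for dt, T in history)
```

Under this rule, the hottest tube accumulates damage fastest, which is why estimating the hottest-tube wall temperature (as done in the study) controls the predicted failure sequence.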

Liao, Y.; Vierow, K. [Purdue University (United States)

2005-12-15

237

Process hazards analysis (PrHA) program, bridging accident analyses and operational safety  

SciTech Connect

Recently the Final Safety Analysis Report (FSAR) for the Plutonium Facility at Los Alamos National Laboratory, Technical Area 55 (TA-55), was revised and submitted to the U.S. Department of Energy (DOE). As a part of this effort, over seventy Process Hazards Analyses (PrHAs) were written and/or revised over the six years prior to the FSAR revision. TA-55 is a research, development, and production nuclear facility that primarily supports U.S. defense and space programs. Nuclear fuels and materials research; material recovery, refining, and analyses; and the casting, machining, and fabrication of plutonium components are some of the activities conducted at TA-55. These operations involve a wide variety of industrial, chemical, and nuclear hazards. Operational personnel work as a team with safety analysts to prepare the PrHA. PrHAs describe the process; identify the hazards; and analyze the hazards, including determining hazard scenarios, their likelihood, and their consequences. In addition, the interaction of the process with facility systems, structures, and operation-specific protective features is part of the PrHA. This information is rolled up to determine bounding accidents and mitigating systems and structures. Further detailed accident analysis is performed for the bounding accidents and included in the FSAR. The FSAR is part of the Documented Safety Analysis (DSA) that defines the safety envelope for all facility operations in order to protect the worker, the public, and the environment. The DSA is in compliance with the U.S. Code of Federal Regulations, 10 CFR 830, Nuclear Safety Management, and is approved by DOE. The DSA sets forth the bounding conditions necessary for the safe operation of the facility and is essentially a 'license to operate.' Safety of day-to-day operations is based on Hazard Control Plans (HCPs). Hazards are initially identified in the PrHA for the specific operation and act as input to the HCP.
Specific protective features important to worker safety are incorporated so that workers can readily identify the safety parameters of their work. System safety tools such as Preliminary Hazard Analysis, What-If Analysis, and Hazard and Operability Analysis, as well as other techniques as necessary, provide the groundwork for determining bounding conditions for facility safety, operational safety, and day-to-day worker safety.

Richardson, J. A. (Jeanne A.); McKernan, S. A. (Stuart A.); Vigil, M. J. (Michael J.)

2003-01-01

238

Methodologies for analysis of patterning in the mouse RPE sheet  

PubMed Central

Purpose Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. 
To validate the software, human- and computer-analyzed results were compared. Whether tallied manually or automatically with software, the resulting cell measurements were in close agreement. We compared normal with diseased RPE cells during aging with quantitative cell size and shape metrics. Subtle differences between the RPE sheet characteristics of young and old mice were identified. The IRBP-/- mouse RPE sheet did not differ from C57BL/6J (wild type, WT), suggesting that IRBP does not play a direct role in maintaining the health of the RPE cell, while the slow loss of photoreceptor (PhR) cells previously established in this knockout does support a role in the maintenance of PhR cells. Rd8 mice exhibited several measurable changes in patterns of RPE cells compared to WT, suggesting a slow degeneration of the RPE sheet that had not been previously noticed in rd8. Conclusions An optimized dissection method and a series of programs were used to establish a rapid and hands-off analysis. The software-aided, high-sampling-size approach performed as well as trained human scorers, but was considerably faster and easier. This method allows tens to hundreds of thousands of cells to be analyzed, each with 23 metrics. With this combination of dissection and image analysis of the RPE sheet, we can now analyze cell-to-cell interactions of immediate neighbors. In the future, we may be able to observe interactions of second, third, or higher ring neighbors and analyze tension in sheets, which might be expected to deviate from normal near large bumps in the RPE sheet caused by drusen or when large frank holes in the RPE sheet are observed in geographic atrophy. This method and software can be readily applied to other aspects of vision science, neuroscience, and epithelial biology where patterns may exist in a sheet or surface of cells.
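Per-cell shape descriptors of the kind tallied by such a pipeline can be computed directly from traced cell boundaries. A sketch of three of them (area, perimeter, and form factor) using the shoelace formula, with a unit square as a check case; the paper's software records 23 metrics per cell, which are not reproduced here:

```python
import math

def cell_metrics(vertices):
    """Area, perimeter, and form factor (4*pi*A/P^2, equal to 1.0 for a
    circle) of a cell outline given as (x, y) vertices in order."""
    n = len(vertices)
    area = 0.0
    perim = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1   # shoelace term
        perim += math.hypot(x2 - x1, y2 - y1)
    area = abs(area) / 2.0
    return area, perim, 4.0 * math.pi * area / perim ** 2

square = [(0, 0), (1, 0), (1, 1), (0, 1)]
```

Form factor is a standard circularity measure: regular hexagonal RPE cells score close to 1, while stretched or irregular cells score lower, which is how subtle aging changes become quantifiable.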

Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

2015-01-01

239

Narrative text analysis of accident reports with tractors, self-propelled harvesting machinery and materials handling machinery in Austrian agriculture from 2008 to 2010 - a comparison.  

PubMed

The aim of this study was the identification of accident scenarios and causes by analysing existing accident reports of recognized agricultural occupational accidents with tractors, self-propelled harvesting machinery and materials handling machinery from 2008 to 2010. As a result of a literature-based evaluation of past accident analyses, narrative text analysis was chosen as an appropriate method. A narrative analysis of the text fields of accident reports that farmers used to report accidents to insurers was conducted to obtain detailed information about the scenarios and causes of accidents. This narrative analysis of reports was conducted for the first time and yielded initial insights for identifying antecedents of accidents and potential opportunities for technically based intervention. A literature and internet search was done to discuss and confirm the findings. The narrative text analysis showed that in more than one third of the accidents with tractors and materials handling machinery the vehicle rolled or tipped over. The most relevant accident scenarios with harvesting machinery were being trapped and falling down. The direct comparison of the analysed machinery categories showed that more than 10% of the accidents in each category were caused by technical faults, slippery or muddy terrain, and incorrect or inappropriate operation of the vehicle. Accidents with tractors, harvesting machinery and materials handling machinery showed similarities in terms of causes, circumstances and consequences. Certain technical and communicative measures for accident prevention could be used for all three machinery categories. Nevertheless, some individual solutions for accident prevention, suited to each specific machine type, would be necessary. PMID:24738521
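A crude computational analogue of this narrative coding is keyword matching of report text against scenario categories. The categories below mirror those named in the abstract (rollover, being trapped, falling), but the keyword lists are invented for illustration, not taken from the study:

```python
# Scenario -> trigger keywords; lists are illustrative, not the study's codes.
SCENARIOS = {
    "rollover": ("rolled", "tipped", "overturn"),
    "trapped":  ("trapped", "caught", "pinned"),
    "fall":     ("fell", "falling", "fall from"),
}

def classify_report(text):
    """Tag a free-text accident report with every scenario whose keywords
    appear in it; returns a sorted list so results are deterministic."""
    text = text.lower()
    return sorted(s for s, kws in SCENARIOS.items()
                  if any(k in text for k in kws))
```

Real narrative coding is done by trained readers precisely because free text defeats simple keyword rules (negation, synonyms, dialect); the sketch only shows the shape of the scenario-tagging task.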

Mayrhofer, Hannes; Quendler, Elisabeth; Boxberger, Josef

2014-01-01

240

Safety analysis of beyond design basis accidents in RBMK-1500 reactors  

Microsoft Academic Search

At present, the design basis accidents for the RBMK-1500 are rather thoroughly investigated. The performed analyses helped to develop and implement a number of safety modifications. Further plant safety enhancement requires developing emergency procedures that would enable beyond design basis accident management by preventing core damage or mitigating the consequences of severe accidents. This paper presents results of the Ignalina NPP Level 1 and

E. Ušpuras; A. Kaliatka; J. Augutis; S. Rimkevičius; E. Urbonavičius; V. Kopustinskas

2007-01-01

241

Analysis of Kuosheng Large-Break Loss-of-Coolant Accident with MELCOR 1.8.4  

SciTech Connect

The MELCOR code, developed by Sandia National Laboratories, is capable of simulating the severe accident phenomena of light water reactor nuclear power plants (NPPs). A specific large-break loss-of-coolant accident (LOCA) for the Kuosheng NPP is simulated with the MELCOR 1.8.4 code. This accident is induced by a double-ended guillotine break of one of the recirculation pipes concurrent with complete failure of the emergency core cooling system. The MELCOR input deck for the Kuosheng NPP is established based on the design data of the Kuosheng NPP and the MELCOR users' guides. The initial steady-state conditions are generated with a self-initialization algorithm developed by the authors. The effect of the MELCOR 1.8.4-provided initialization process is demonstrated. The main severe accident phenomena and the corresponding fission product release fractions associated with the large-break LOCA sequences are simulated. MELCOR 1.8.4 predicts a longer time interval between core collapse and vessel failure and a higher source term. This MELCOR 1.8.4 input deck will be applied to the probabilistic risk assessment, severe accident analysis, and severe accident management studies of the Kuosheng NPP in the near future.

Wang, T.-C.; Wang, S.-J.; Chien, C.-S

2000-09-15

242

A Common Methodology for Safety and Reliability Analysis for Space Reactor Missions  

SciTech Connect

The thesis of this paper is that the methodology of probabilistic risk management (PRM) has the capability to integrate both safety and reliability analyses for space nuclear missions. Practiced within a decision analysis framework, the concept of risk and the overall methodology of PRM are not dependent on whether the outcome affects mission success or mission safety. This paper presents the methodology by means of simplified examples.

Frank, Michael V. [Safety Factor Associates, Inc., 1410 Vanessa Circle, Encinitas, CA , 92024 (United States)

2006-01-20

243

Analysis of the SL-1 Accident Using RELAP5-3D  

SciTech Connect

On January 3, 1961, at the National Reactor Testing Station in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people and destroying the reactor core. The SL-1 reactor, a 3 MW{sub t} boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are known only with a high level of uncertainty).
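The reactivity-insertion phase of such an excursion is often introduced through one-delayed-group point kinetics before bringing in a coupled code like RELAP5-3D. A toy explicit-Euler sketch with generic kinetics constants (these are not SL-1 values, and the model has no thermal feedback, so it only shows the prompt-supercritical growth phase):

```python
def point_kinetics(rho, beta=0.0065, Lambda=1e-4, lam=0.08,
                   dt=1e-4, t_end=0.5):
    """Relative power after a step reactivity insertion rho, from
    one-delayed-group point kinetics integrated by explicit Euler.
    beta: delayed fraction, Lambda: generation time (s), lam: precursor
    decay constant (1/s). All constants are generic placeholders."""
    n = 1.0
    c = beta / (Lambda * lam)   # equilibrium precursor level for n = 1
    t = 0.0
    while t < t_end:
        dn = ((rho - beta) / Lambda * n + lam * c) * dt
        dc = (beta / Lambda * n - lam * c) * dt
        n += dn
        c += dc
        t += dt
    return n
```

With rho above beta (super-prompt critical, as in SL-1), power grows roughly as exp(((rho - beta)/Lambda) t) until feedback and fuel disassembly, which the toy model omits, terminate the excursion.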

Francisco, A.D. and Tomlinson, E. T.

2007-11-08

244

Spectral Reflectance Methodology in Comparison to Traditional Soil Analysis  

Microsoft Academic Search

Traditional soil analyses are expensive, time-consuming, and may also produce environmental pollutants. The objective of this study was to develop and evaluate a methodology to measure soil attributes using spectral reflectance (SR) as an alternative to traditional methods. Tropical Brazilian soils were sampled over a 196-ha area divided into grids. Samples (n = 184) were obtained from the

Marcos Rafael Nanni; José Alexandre M. Demattê

2006-01-01

245

SUPASIM: a flotation plant design and analysis methodology  

Microsoft Academic Search

A methodology was developed in the mid-1980s to predict plant performance from standard laboratory flotation tests. The technique is based on a simple, empirical kinetics model, and is tailored for use by the practical metallurgist. To date the performance of more than 20 flotation plants has been predicted, encompassing copper, lead, zinc, nickel, phosphate, pyrite, graphite, cassiterite, platinum and various slags.

M. P. Hay; C. M. Rule

2003-01-01

246

Grounded Theory and Educational Ethnography: A Methodological Analysis and Critique.  

ERIC Educational Resources Information Center

This paper analyzes and evaluates the methodological approach developed by B. G. Glaser and A. L. Strauss in THE DISCOVERY OF GROUNDED THEORY (Chicago: Aldine, 1967). Smith and Pohland's major intent is to raise Glaser and Strauss' most significant concepts and issues, analyze them in the context of seven of their own studies, and in conclusion…

Smith, Louis M.; Pohland, Paul A.

247

Survivors' Perceptions of Recovery following Air Medical Transport Accidents.  

PubMed

Objective: Air medical transport (AMT) teams play an essential role in the care of the critically ill and injured. Their work, however, is not without risk. Since the inception of the industry, numerous AMT accidents have been reported. The objective of this research is to gain a better understanding of the post-accident sequelae for professionals who have survived AMT accidents. The hope is that this understanding will empower the industry to better support survivors and plan for the contingencies of post-accident recovery. Methods: Qualitative methods were used to explore the experience of flight crew members who have survived an AMT accident. "Accident" was defined using criteria established by the National Transportation Safety Board. Traditional focus group methodology explored the survivors' experiences following the accident. Results: Seven survivors participated in the focus group. Content analysis revealed themes in four major domains that described the experience of survivors: Physical, Psychological, Relational and Financial. Across the themes survivors reported that industry and company response varied greatly, ranging from generous support, understanding and action to make safety improvements, to little response or action and lack of attention to survivor needs. Conclusion: Planning for AMT post-accident response was identified to be lacking in scope and quality. More focused efforts are needed to assist and support the survivors as they regain both their personal and professional lives following the accident. This planning should include all stakeholders in safe transport: the individual crew member, air medical transport companies, and the industry at large. PMID:24932568

Jaynes, Cathy L; Valdez, Anna; Hamilton, Megan; Haugen, Krista; Henry, Colin; Jones, Pat; Werman, Howard A; White, Lynn J

2015-01-01

248

Analysis of accidents during the mid-loop operating state at a PWR  

SciTech Connect

Studies suggest that the risk of severe accidents during low power operation and/or shutdown conditions could be a significant fraction of the risk at full power operation. The Nuclear Regulatory Commission has begun two risk studies to evaluate the progression of severe accidents during these conditions: one for the Surry plant, a pressurized water reactor (PWR), and the other for the Grand Gulf plant, a boiling water reactor (BWR). This paper summarizes the approach taken for the Level 2/3 analysis at Surry for one plant operating state (POS) during shutdown. The current efforts are focused on evaluating the risk when the reactor is at mid-loop; this particular POS was selected because of the reduced water inventory and the possible isolation of the loops. The Level 2/3 analyses are conditional on core damage having occurred. Initial results indicate that the conditional consequences can indeed be significant; the defense-in-depth philosophy governing the safety of nuclear power plants is to some extent circumvented because the containment provides only a vapor barrier with no capability for pressure holding during this POS at Surry. However, the natural decay of the radionuclide inventory provides some mitigation. There are essentially no predicted offsite prompt fatalities even for the most severe releases.

Jo, J.; Lin, C.C.; Mufayi, V.; Neymotin, L.; Nimnual, S.

1992-01-01

250

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1998-04-01

251

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

252

Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors  

SciTech Connect

The accident that occurred on board the offshore platform Piper Alpha in July 1988 killed 167 people and cost billions of dollars in property damage. It was caused by a massive fire, which was not the result of an unpredictable 'act of God' but of an accumulation of errors and questionable decisions. Most of them were rooted in the organization, its structure, procedures, and culture. This paper analyzes the accident scenario using the risk analysis framework, determines which human decisions and actions influenced the occurrence of the basic events, and then identifies the organizational roots of these decisions and actions. These organizational factors are generalizable to other industries and engineering systems. They include flaws in the design guidelines and design practices (e.g., tight physical couplings or insufficient redundancies), misguided priorities in the management of the tradeoff between productivity and safety, mistakes in the management of the personnel on board, and errors of judgement in the process by which financial pressures are applied on the production sector (i.e., the oil companies' definition of profit centers) resulting in deficiencies in inspection and maintenance operations. This analytical approach allows identification of risk management measures that go beyond the purely technical (e.g., add redundancies to a safety system) and also include improvements of management practices. 18 refs., 4 figs.

Pate-Cornell, M.E. (Stanford Univ., CA (United States))

1993-04-01

253

A rational design change methodology based on experimental and analytical modal analysis  

SciTech Connect

A design methodology that integrates analytical modeling and experimental characterization is presented. This methodology represents a powerful tool for making rational design decisions and changes. An example of its implementation in the design, analysis, and testing of a precision machine tool support structure is given.

Weinacht, D.J.; Bennett, J.G.

1993-08-01

254

A Scalable Soft Spot Analysis Methodology for Compound Noise Effects in Nano-meter Circuits  

E-print Network

Circuits using nano-meter technologies are becoming increasingly vulnerable to signal interference and transient soft errors. This paper presents a scalable soft spot analysis methodology to study the vulnerability of digital ICs exposed to nano-meter noise and transient soft errors.

California at San Diego, University of

255

Cost-Effectiveness Analysis of Prenatal Diagnosis: Methodological Issues and Concerns  

Microsoft Academic Search

With increasing concerns regarding rapidly expanding health care costs, cost-effectiveness analysis (CEA) provides a methodology to assess whether marginal gains from new technology are worth the increased costs. In the arena of prenatal diagnosis, particular methodological and ethical concerns include whether the effects of such testing on individuals other than the patient are included, how termination of pregnancy is included

Aaron B. Caughey

2005-01-01

256

Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis  

ERIC Educational Resources Information Center

This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

Kover, Sara T.; Atwood, Amy K.

2013-01-01

257

Calculational methodology and associated uncertainties: Sensitivity and uncertainty analysis of reactor performance parameters  

Microsoft Academic Search

This chapter considers the calculational methodology and associated uncertainties both for the design of large LMFBR's and the analysis of critical assemblies (fast critical experiments) as performed by several groups within the US. Discusses cross-section processing; calculational methodology for the design problem; core physics computations; design-oriented approximations; benchmark analyses; and determination of calculational corrections and associated uncertainties for a critical

E. Kujawski; C. R. Weisbin

1982-01-01

258

PWR loss-of-coolant accident analysis capability of the WRAP-EM system  

SciTech Connect

The modular computational system known as the Water Reactor Analysis Package (WRAP) has been extended to provide the computational tools required to perform a complete analysis of loss-of-coolant accidents (LOCAs) in pressurized water reactors (PWR). The new system is known as the WRAP-EM (Evaluation Model) system and will be used by NRC to interpret and evaluate reactor vendor EM methods and computed results. The system for PWR-EM analysis is comprised of several computer codes which have been developed to analyze a particular phase of a LOCA. These codes include GAPCON for calculation of initial fuel conditions, WRAP (the previously developed SRL analog of RELAP4/MOD5) for analysis of the system blowdown and refill, the FLOOD option in WRAP for analysis of the reflood phase, and FRAP for the calculation of the behavior of the hot fuel pin. In addition, a PWR steady-state initialization procedure has been developed to provide the initial operating state of the reactor system. The PWR-EM system is operational and is being evaluated to determine the adequacy and consistency of the physical models employed for EM analysis.

Gregory, M.V.; Beranek, F.

1980-08-01

259

The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents  

NASA Technical Reports Server (NTRS)

In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities, is actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model comprises causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human related causal factors. The preliminary results from the baseline LOCAF model are also presented.

Ancel, Ersin; Shih, Ann T.

2012-01-01
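
The record above describes expressing interdependent causal factors in a Bayesian belief network. As an illustrative aside, the basic mechanics of such a model, priors on causal factors combined through a conditional probability table to yield an accident likelihood, can be sketched in a toy two-factor version (the structure and every probability below are invented for illustration and are not taken from the LOCAF model):

```python
# Toy two-cause Bayesian network for a loss-of-control (LOC) outcome.
# All numbers are invented placeholders, not LOCAF values.
P_HUMAN = {True: 0.02, False: 0.98}    # human-factors causal condition present?
P_SYSTEM = {True: 0.01, False: 0.99}   # system component failure present?
P_LOC = {  # conditional probability table: (human, system) -> P(LOC)
    (True, True): 0.50,
    (True, False): 0.05,
    (False, True): 0.03,
    (False, False): 0.0001,
}

def p_loc():
    # Marginalize over all causal-factor states (exact enumeration).
    return sum(P_HUMAN[h] * P_SYSTEM[s] * P_LOC[(h, s)]
               for h in (True, False) for s in (True, False))
```

Evaluating a hypothetical safety product would then amount to reducing one of the prior or conditional entries and recomputing `p_loc()`.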

260

Full-Envelope Launch Abort System Performance Analysis Methodology  

NASA Technical Reports Server (NTRS)

The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.

Aubuchon, Vanessa V.

2014-01-01
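
The methodological contrast the record above draws, discrete abort-initiation conditions versus dispersing the initiation condition along with all other parameters, can be sketched as follows (the trajectory model, parameter names, and distributions are all hypothetical stand-ins, not the actual LAS simulation):

```python
import random

def simulate_abort(init_time, thrust_factor, wind_bias):
    # Stand-in for an abort trajectory simulation: returns a single
    # nonnegative performance metric for one dispersed case.
    return abs(init_time * thrust_factor - wind_bias)

def standard_mc(n, init_times):
    # Standard method: abort initiation held at a few discrete conditions,
    # with only the other parameters dispersed. Performance between the
    # chosen conditions is never sampled.
    return {t: [simulate_abort(t, random.gauss(1.0, 0.05),
                               random.gauss(0.0, 0.1)) for _ in range(n)]
            for t in init_times}

def full_envelope_mc(n, t_min, t_max):
    # Full-envelope method: initiation time dispersed together with all
    # other parameters, covering the whole ascent window in one sample set.
    return [simulate_abort(random.uniform(t_min, t_max),
                           random.gauss(1.0, 0.05),
                           random.gauss(0.0, 0.1)) for _ in range(n)]
```

In the full-envelope version, performance "pinch-points" show up as clusters of poor metric values at initiation times that no discrete condition would have probed.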

261

Analysis of Radionuclide Releases from the Fukushima Dai-Ichi Nuclear Power Plant Accident Part I  

NASA Astrophysics Data System (ADS)

Part I of this publication deals with the analysis of fission product releases following the Fukushima Dai-ichi accident. Reactor core damages are assessed relying on radionuclide detections performed by the CTBTO radionuclide network, especially at the particulate station located at Takasaki, 210 km away from the nuclear power plant. On the basis of a comparison between the reactor core inventory at the time of reactor shutdowns and the fission product activities measured in air at Takasaki, especially 95Nb and 103Ru, it was possible to show that the reactor cores were exposed to high temperature for a prolonged time. This diagnosis was confirmed by the presence of 113Sn in air at Takasaki. The 133Xe assessed release at the time of reactor shutdown (8 × 1018 Bq) turned out to be on the order of 80% of the amount deduced from the reactor core inventories. This strongly suggests a broad meltdown of reactor cores.

Le Petit, G.; Douysset, G.; Ducros, G.; Gross, P.; Achim, P.; Monfort, M.; Raymond, P.; Pontillon, Y.; Jutier, C.; Blanchard, X.; Taffary, T.; Moulin, C.

2014-03-01

262

PTSD symptom severity and psychiatric comorbidity in recent motor vehicle accident victims: a latent class analysis.  

PubMed

We conducted a latent class analysis (LCA) on 249 recent motor vehicle accident (MVA) victims to examine subgroups that differed in posttraumatic stress disorder (PTSD) symptom severity, current major depressive disorder and alcohol/other drug use disorders (MDD/AoDs), gender, and interpersonal trauma history 6-weeks post-MVA. A 4-class model best fit the data with a resilient class displaying asymptomatic PTSD symptom levels/low levels of comorbid disorders; a mild psychopathology class displaying mild PTSD symptom severity and current MDD; a moderate psychopathology class displaying severe PTSD symptom severity and current MDD/AoDs; and a severe psychopathology class displaying extreme PTSD symptom severity and current MDD. Classes also differed with respect to gender composition and history of interpersonal trauma experience. These findings may aid in the development of targeted interventions for recent MVA victims through the identification of subgroups distinguished by different patterns of psychiatric problems experienced 6-weeks post-MVA. PMID:25124501

Hruska, Bryce; Irish, Leah A; Pacella, Maria L; Sledjeski, Eve M; Delahanty, Douglas L

2014-10-01

263

Modeling & analysis of criticality-induced severe accidents during refueling for the Advanced Neutron Source Reactor  

SciTech Connect

This paper describes work done at the Oak Ridge National Laboratory (ORNL) for evaluating the potential and resulting consequences of a hypothetical criticality accident during refueling of the 330-MW Advanced Neutron Source (ANS) research reactor. The development of an analytical capability is described. Modeling and problem formulation were conducted using concepts of reactor neutronic theory for determining power level escalation, coupled with ORIGEN and MELCOR code simulations for radionuclide buildup and containment transport. Gaussian plume transport modeling was done for determining off-site radiological consequences. Nuances associated with modeling this blast-type scenario are described. Analysis results for ANS containment response under a variety of postulated scenarios and containment failure modes are presented. It is demonstrated that individuals at the reactor site boundary will not receive doses beyond regulatory limits for any of the containment configurations studied.

Georgevich, V.; Kim, S.H.; Taleyarkhan, R.P.; Jackson, S.

1992-10-01

264

78 FR 29353 - Federal Need Analysis Methodology for the 2014-15 Award Year-Federal Pell Grant, Federal Perkins...  

Federal Register 2010, 2011, 2012, 2013

...DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2014-15 Award Year-- Federal...the statutory Federal Need Analysis Methodology that determines a student's expected...Department uses in the Federal Need Analysis Methodology to determine the EFC. Section 478...

2013-05-20

265

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

1995-01-01

266

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

1995-01-01
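
The Gaussian plume model (GPM) referenced in these records is the standard textbook formulation for atmospheric dispersion from an elevated point source. A minimal sketch of the concentration calculation, with the dispersion coefficients supplied directly rather than derived from atmospheric stability classes as a production consequence code would do:

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration.

    q: emission rate (e.g., Bq/s or g/s), u: wind speed (m/s),
    y: crosswind offset (m), z: receptor height (m), h: effective
    release height (m). sigma_y and sigma_z are the dispersion
    coefficients (m), which in practice depend on downwind distance
    and stability class; here the caller provides them.
    """
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    # Image-source term models total reflection at the ground.
    vertical = (math.exp(-(z - h)**2 / (2.0 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2.0 * sigma_z**2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

For a ground-level release and receptor (z = h = 0) the reflection term doubles the concentration, giving the familiar centerline result q / (pi * u * sigma_y * sigma_z).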

267

Assessment of ISLOCA risk-methodology and application to a combustion engineering plant  

SciTech Connect

Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Combustion Engineering plant.

Kelly, D.L.; Auflick, J.L.; Haney, L.N. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

1992-04-01

268

Power supply noise analysis methodology for deep-submicron VLSI chip design  

Microsoft Academic Search

This paper describes a new design methodology to analyze the on-chip power supply noise for high-performance microprocessors. Based on an integrated package-level and chip-level power bus model, and a simulated switching circuit model for each functional block, this methodology offers the most complete and accurate analysis of Vdd distribution for the entire chip. The analysis results not only provide designers with the inductive ΔI noise and the

Howard H. Chen; David D. Ling

1997-01-01

269

Regulatory analysis for the resolution of Generic Issue 82, ''Beyond design basis accidents in spent fuel pools''  

Microsoft Academic Search

Generic Issue 82, ''Beyond Design Basis Accidents in Spent Fuel Pools,'' addresses the concerns with the use of high density storage racks for the storage of spent fuel, and is applicable to all Light Water Reactor spent fuel pools. This report presents the regulatory analysis for Generic Issue 82. It includes (1) a summary of the issue, (2) a summary

Throm

1989-01-01

270

A comparative analysis of accident risks in fossil, hydro, and nuclear energy chains  

SciTech Connect

This study presents a comparative assessment of severe accident risks in the energy sector, based on the historical experience of fossil (coal, oil, natural gas, and LPG (Liquefied Petroleum Gas)) and hydro chains contained in the comprehensive Energy-related Severe Accident Database (ENSAD), as well as Probabilistic Safety Assessment (PSA) for the nuclear chain. Full energy chains were considered because accidents can take place at every stage of the chain. Comparative analyses for the years 1969-2000 included a total of 1870 severe (≥ 5 fatalities) accidents, amounting to 81,258 fatalities. Although 79.1% of all accidents and 88.9% of associated fatalities occurred in less developed, non-OECD countries, industrialized OECD countries dominated insured losses (78.0%), reflecting their substantially higher insurance density and stricter safety regulations. Aggregated indicators and frequency-consequence (F-N) curves showed that energy-related accident risks in non-OECD countries are distinctly higher than in OECD countries. Hydropower in non-OECD countries and upstream stages within fossil energy chains are most accident-prone. Expected fatality rates are lowest for Western hydropower and nuclear power plants; however, the maximum credible consequences can be very large. Total economic damages due to severe accidents are substantial, but small when compared with natural disasters. Similarly, external costs associated with severe accidents are generally much smaller than monetized damages caused by air pollution.

Burgherr, P.; Hirschberg, S. [Paul Scherrer Institute, Villigen (Switzerland)

2008-07-01
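
A frequency-consequence (F-N) curve of the kind used in the comparison above gives, for each severity level N, the frequency of accidents with at least N fatalities. A minimal sketch over illustrative data (the counts and exposure period below are invented, not values from ENSAD):

```python
def fn_curve(fatality_counts, exposure_years):
    """Complementary cumulative frequency curve from per-accident
    fatality counts observed over a given number of chain-years.
    Returns (N, F) pairs: F is the frequency per year of accidents
    with >= N fatalities, evaluated at each observed severity level."""
    levels = sorted(set(fatality_counts))
    return [(n, sum(1 for f in fatality_counts if f >= n) / exposure_years)
            for n in levels]
```

Plotted on log-log axes, curves lying higher and extending further right indicate energy chains that are both more accident-prone and capable of larger maximum consequences.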

271

Consequence analysis of a hypothetical contained criticality accident in the Hanford Critical Mass Laboratory  

SciTech Connect

The original hazards summary report (i.e., SAR) for the CML addressed the consequences of a hypothetical accidental critical excursion occurring with the experimental assembly room open. That report indicated that the public would receive insignificant radiation exposure regardless of the type of atmospheric condition, while plant personnel could possibly receive exposures greater than the annual exposure limits for radiation workers, when a strong inversion existed. This analysis investigates the consequences of a hypothetical criticality accident occurring with the experimental assembly room sealed. Due to the containment capabilities designed and built into the critical assembly room, the consequences are greatly reduced below those presented in HW-66266. Despite the incorporation of many extremely conservative assumptions to simplify the analysis, the radiation doses predicted for personnel 100 meters or more distant from the CML are found to be smaller than the annual radiation dose limit for members of the public in uncontrolled areas during routine, nonaccident operations. Therefore, the results of this analysis demonstrate that the occurrence of a hypothetical critical excursion within the sealed experimental assembly room at the Hanford Critical Mass Laboratory presents only a small, acceptable risk to personnel and facilities in the area and no additional safety systems or controls are needed for the continued safe operation of the CML. 11 references, 4 tables. (ACR)

Gore, B.F.; Strenge, D.L.; Mishima, J.

1984-12-01

272

Analysis of fission product revaporization in a BWR Reactor Coolant System during a station blackout accident  

SciTech Connect

This paper presents an analysis of fission product revaporization from the Reactor Coolant System (RCS) following the Reactor Pressure Vessel (RPV) failure. The station blackout accident in a BWR Mark I Power Plant was considered. The TRAPMELT3 models for vaporization, chemisorption, and the decay heating of RCS structures and gases were used and extended beyond the RPV failure in the analysis. The RCS flow models based on the density-difference or pressure-difference between the RCS and containment pedestal region were developed to estimate the RCS outflow which carries the revaporized fission product to the containment. A computer code called REVAP was developed for the analysis. The REVAP code was incorporated with the MARCH, TRAPMELT3 and NAUA codes from the Source Term Code Package (STCP) to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors in determining the magnitude of revaporization and subsequent release of the volatile fission product into the environment. 6 refs., 8 figs.

Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

1988-01-01

273

Interpretation methodology and analysis of in-flight lightning data  

NASA Technical Reports Server (NTRS)

A methodology is presented whereby electromagnetic measurements of in-flight lightning stroke data can be understood and extended to other aircraft. Recent measurements made on the NASA F106B aircraft indicate that sophisticated numerical techniques and new developments in corona modeling are required to fully understand the data. Thus the problem is nontrivial, and successful interpretation can lead to a significant understanding of the lightning/aircraft interaction event. This is of particular importance because of the problem of lightning-induced transient upset of new-technology low-level microcircuitry, which is being used in increasing quantities in modern and future avionics. In-flight lightning data are analyzed and lightning environments incident upon the F106B are determined.

Rudolph, T.; Perala, R. A.

1982-01-01

274

Loss of control air at Browns Ferry Unit One: accident sequence analysis  

Microsoft Academic Search

This study describes the predicted response of the Browns Ferry Nuclear Plant to a postulated complete failure of plant control air. The failure of plant control air cascades to include the loss of drywell control air at Units 1 and 2. Nevertheless, this is a benign accident unless compounded by simultaneous failures in the turbine-driven high pressure injection systems. Accident

R. M. Harrington; S. A. Hodge

1986-01-01

275

Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology  

NASA Technical Reports Server (NTRS)

The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods of degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.

Atkins, H. L.

1997-01-01

276

[Traffic accidents in childhood--an analysis of 4100 cases (author's transl)].  

PubMed

Every hour a child dies as a result of an accident. Most accidents to children occur between 10.00 and 12.00 in the morning and between 15.00 and 17.00 in the afternoon. Children aged 2-6 are particularly liable to accidents; they make up 50% of all accidents to children. 20% of all accidents occur in traffic, and in this group especially, the most deaths and multiple injuries are seen. An inquiry among 1000 children revealed that their knowledge of traffic behavior is particularly poor; according to the school standards of assessment, one third of all children would not reach the level of their class in this special field. 52% of all children in cars and 36% of all children on bicycles are at risk of losing their lives. PMID:406530

Willital, G H; Meier, H

1977-04-29

277

Adapting Job Analysis Methodology to Improve Evaluation Practice  

ERIC Educational Resources Information Center

This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

Jenkins, Susan M.; Curtin, Patrick

2006-01-01

278

Differences in rural and urban driver-injury severities in accidents involving large-trucks: an exploratory analysis.  

PubMed

This study explores the differences between urban and rural driver injuries (both passenger-vehicle and large-truck driver injuries) in accidents that involve large trucks (in excess of 10,000 pounds). Using 4 years of California accident data, and considering four driver-injury severity categories (no injury, complaint of pain, visible injury, and severe/fatal injury), a multinomial logit analysis of the data was conducted. Significant differences with respect to various risk factors including driver, vehicle, environmental, road geometry and traffic characteristics were found to exist between urban and rural models. For example, in rural accidents involving tractor-trailer combinations, the probability of drivers' injuries being severe/fatal increased about 26% relative to accidents involving single-unit trucks. In urban areas, this same probability increased nearly 700%. In accidents where alcohol or drug use was identified as being the primary cause of the accident, the probability of severe/fatal injury increased roughly 250% in rural areas and nearly 800% in urban areas. While many of the same variables were found to be significant in both rural and urban models (although often with quite different impacts), there were 13 variables that significantly influenced driver-injury severity in rural but not urban areas, and 17 variables that significantly influenced driver-injury severity in urban but not rural areas. We speculate that the significant differences between rural and urban injury severities may be at least partially attributable to the different perceptual, cognitive and response demands placed on drivers in rural versus urban areas. PMID:15935320

Khorashadi, Ahmad; Niemeier, Debbie; Shankar, Venky; Mannering, Fred

2005-09-01
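The multinomial logit model in the record above assigns each accident a probability over the four severity categories via P(i | x) = exp(β_i·x) / Σ_j exp(β_j·x). A minimal sketch of that form, with made-up coefficients and features (not the fitted California model):

```python
import math

# Sketch of the multinomial logit form used for driver-injury severity:
# P(category i | x) = exp(beta_i . x) / sum_j exp(beta_j . x).
# Categories follow the paper; the coefficients below are illustrative only.

CATEGORIES = ["no injury", "complaint of pain", "visible injury", "severe/fatal"]

def mnl_probabilities(x, betas):
    """x: feature vector; betas: one coefficient vector per category."""
    utilities = [sum(b * xi for b, xi in zip(beta, x)) for beta in betas]
    m = max(utilities)                      # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical features: [intercept, rural indicator, tractor-trailer indicator]
betas = [[0.0, 0.0, 0.0], [-0.5, 0.2, 0.1], [-1.0, 0.3, 0.4], [-2.0, 0.5, 0.9]]
probs = mnl_probabilities([1.0, 1.0, 1.0], betas)
assert abs(sum(probs) - 1.0) < 1e-12
```

Fitting the β vectors by maximum likelihood on the accident data is what the study's estimation step does; the sketch only shows how fitted coefficients map features to severity probabilities.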

279

Overview of the Aerothermodynamics Analysis Conducted in Support of the STS-107 Accident Investigation  

NASA Technical Reports Server (NTRS)

A graphic presentation of the aerothermodynamics analysis conducted in support of the STS-107 accident investigation. Investigation efforts were conducted as part of an integrated AATS team (Aero, Aerothermal, Thermal, Stress) directed by OVEWG. Graphics presented are: STS-107 Entry trajectory and timeline (1st off-nominal event to Post-LOS); Indications from OI telemetry data; Aero/aerothermo/thermal analysis process; Selected STS-107 side fuselage/OMS pod off-nominal temperatures; Leading edge structural subsystem; Relevant forensics evidence; External aerothermal environments; STS-107 Pre-entry EOM3 heating profile; Surface heating and temperatures; Orbiter wing leading edge damage survey; Internal aerothermal environments; Orbiter wing CAD model; Aerodynamic flight reconstruction; Chronology of aerodynamic/aerothermodynamic contributions; Acreage TPS tile damage; Larger OML perturbations; Missing RCC panel(s); Localized damage to RCC panel/missing T-seal; RCC breach with flow ingestion; and Aero-aerothermal closure. NAIT served as the interface between the CAIB and NASA investigation teams, and CAIB requests for study were addressed.

Campbell, Charles H.

2004-01-01

280

Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter  

NASA Technical Reports Server (NTRS)

Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

2010-01-01

281

Homicide or accident off the coast of Florida: trauma analysis of mutilated human remains.  

PubMed

In the many years Dr. William R. Maples served as a forensic anthropologist, he saw diverse sources of trauma presented in the victims of violent crime, accident and suicide in the state of Florida. In 1996 the District 18 Medical Examiner's Office of Florida requested the assistance of Dr. Maples in the analysis of human remains recovered by the U.S. Coast Guard. The deceased was in an advanced state of decomposition characterized by skin slippage and discoloration. The torso bore multiple lacerations, including nearly parallel lacerations in the skin of the back. Specimens were carefully macerated and the fractures reconstructed. The skeletal trauma was caused by a device capable of delivering robust cuts and blunt trauma in linear paths, as is consistent with propeller trauma. Unusual in this case were blows to the ventral and dorsal surfaces of the body. Based on the anthropological analysis and interviews with the family of the deceased, the F.B.I. proceeded with the case as a homicide investigation. PMID:10432605

Stubblefield, P R

1999-07-01

282

Progressive Failure Analysis Methodology for Laminated Composite Structures  

NASA Technical Reports Server (NTRS)

A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data, except in structural applications where interlaminar stresses are important, since these may cause failure mechanisms such as debonding or delamination.

Sleight, David W.

1999-01-01
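The degrade-after-failure loop described in the record above can be sketched for the simplest of the listed criteria, maximum strain: any ply whose strain exceeds its allowable is flagged and its stiffness knocked down. The allowable and degradation factor below are illustrative, not values from the paper:

```python
# Sketch of one progressive-failure step: check each ply against a maximum-strain
# criterion and degrade the stiffness of failed plies. The allowable strain and
# the degradation factor are invented for illustration.

def step(plies, allowable=0.010, degradation=0.1):
    """plies: list of dicts with 'strain' and 'modulus'; degrade failed plies."""
    for ply in plies:
        if abs(ply["strain"]) > allowable and not ply.get("failed"):
            ply["failed"] = True
            ply["modulus"] *= degradation   # knock down stiffness after failure
    return plies

laminate = [{"strain": 0.004, "modulus": 140e9},
            {"strain": 0.012, "modulus": 140e9}]
step(laminate)   # second ply fails and loses 90% of its modulus
```

In the real analysis this step alternates with a nonlinear stress recomputation until no new plies fail or the structure collapses.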

283

Partition-Recomposition Methodology for Accurate Electromagnetic Analysis of SiP Passive Circuitry  

Microsoft Academic Search

A partition-recomposition methodology based on an internal-port concept associated with auxiliary sources is proposed for electromagnetic (EM) analysis of SiP (system-in-package) off-chip and on-chip passive circuitry, toward optimized passive and active frequency- and time-domain co-simulation. Attributes of internal ports and associated de-embedding procedures for local ground references are discussed. Application of the proposed methodology to off-chip SiP multi-conductors and

Sidina Wane; Damienne Bajon

2007-01-01

284

A systems-level methodology for the analysis of inland waterway infrastructure disruptions  

Microsoft Academic Search

Inland waterways are vital to the Nation’s economic strength, and to the health and welfare of the American people. This paper describes an end-to-end systems analysis methodology for inland waterways that incorporates infrastructure interdependencies and cascading impacts due to loss of a key asset. The methodology leverages available Federal, State, and private-sector data, methods, tools, expertise, and existing studies to

Steve Folga; Tim Allison; Yazmin Seda-Sanabria; Enrique Matheu; Tim Milam; Rich Ryan; Jim Peerenboom

2009-01-01

285

Methodologies and techniques for analysis of network flow data  

SciTech Connect

Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

Bobyshev, A.; Grigoriev, M.; /Fermilab

2004-12-01

286

Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications  

PubMed Central

Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. The concentrations of volatile organic compounds (VOCs) in breath are assessed with acceptable accuracy by means of analytical techniques offering high sensitivity, accuracy, and precision, low response time, and a low detection limit, which are desirable characteristics for the detection of VOCs in human breath. “Breath fingerprinting”, indicative of a specific clinical status, relies on the use of multivariate statistical methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037

Lourenço, Célia; Turner, Claire

2014-01-01

287

Transient analysis for thermal margin with COASISO during a severe accident  

SciTech Connect

As an IVR-EVC (in-vessel retention through external vessel cooling) design concept, external cooling of the reactor vessel was suggested to protect the lower head from being overheated due to relocated material from the core during a severe accident. The COASISO (Corium Attack Syndrome Immunization Structure Outside the vessel) adopts an external vessel cooling strategy of flooding the reactor vessel inside the thermal insulator. Its advantage is the quick response time, so that the initial heat removal mechanism of the EVC is nucleate boiling from the downward-facing lower head. The efficiency of the COASISO may be estimated by the thermal margin, defined as the ratio of the actual heat flux from the reactor vessel to the critical heat flux (CHF). In this study the thermal margin for a large power reactor such as the APR1400 (Advanced Power Reactor 1400 MWe) was determined by means of transient analysis for the local condition of the coolant and temperature distributions within the reactor vessel. The heat split fraction in the oxide pool and the metal layer focusing effect were considered during calculation of the angular thermal load at the inner wall of the lower head. The temperature distributions in the reactor vessel resulted in the actual heat flux on the outer wall. The local quality was obtained by solving the simplified transient energy equation. The unheated section of the reactor vessel decreases the thermal margin through two-dimensional conduction heat transfer. The peak temperature of the reactor vessel was estimated in the film boiling region where the thermal margin was equal to unity. Sensitivity analyses were performed for the time of corium relocation after the reactor trip, the coolant flow rate, and the initial subcooled condition of the coolant. There is no vessel failure predicted at the worst EVC condition when the stratification is not taken into account between the metal layer and the oxidic pool.
The present predictive tool may be implemented in the severe accident analysis code like MAAP4 for the external vessel cooling with the COASISO. (authors)

Kim, Chan S.; Chu, Ho S.; Suh, Kune Y.; Park, Goon C.; Lee, Un C. [Seoul National University, San 56-1, Sillim-Dong, Kwanak-Gu, Seoul, 151-742 (Korea, Republic of); Yoon, Ho J. [Purdue University, West Lafayette, IN 47907 (United States)

2002-07-01
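The thermal-margin bookkeeping defined in the record above (margin = actual outer-wall heat flux / CHF, with film boiling entered where the margin reaches unity) reduces to a simple ratio check. A sketch with invented heat-flux numbers:

```python
# Sketch of the thermal-margin definition from the abstract: the margin is the
# ratio of actual outer-wall heat flux to the critical heat flux (CHF), and
# film boiling is entered where the margin reaches unity. Values are invented.

def thermal_margin(q_actual, q_chf):
    return q_actual / q_chf

def regime(margin):
    return "film boiling" if margin >= 1.0 else "nucleate boiling"

# Hypothetical angular profile along the lower head: (actual flux, CHF) in W/m^2
profile = [(0.4e6, 1.2e6), (0.9e6, 1.1e6), (1.3e6, 1.25e6)]
margins = [thermal_margin(q, chf) for q, chf in profile]
regimes = [regime(m) for m in margins]
```

In the study itself both fluxes vary with angular position and time; the sketch only captures the final pointwise comparison.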

288

The Role of Materials Degradation and Analysis in the Space Shuttle Columbia Accident Investigation  

NASA Technical Reports Server (NTRS)

The efforts following the loss of the Space Shuttle Columbia included debris recovery, reconstruction, and analysis. The debris was subjected to myriad quantitative and semiquantitative chemical analysis techniques, ranging from examination via the scanning electron microscope (SEM) with energy dispersive spectrometer (EDS) to X-Ray diffraction (XRD) and electron probe micro-analysis (EPMA). The results from the work with the debris helped the investigators determine the location where a breach likely occurred in the leading edge of the left wing during lift off of the Orbiter from the Kennedy Space Center. Likewise, the information evidenced by the debris was also crucial in ascertaining the path of impinging plasma flow once it had breached the wing. After the Columbia Accident Investigation Board (CAIB) issued its findings, the major portion of the investigation was concluded. However, additional work remained to be done on many pieces of debris from portions of the Orbiter which were not directly related to the initial impact during ascent. This subsequent work was not only performed in the laboratory, but was also performed with portable equipment, including examination via portable X-Ray fluorescence (XRF) and Fourier transform infrared spectroscopy (FTIR). Likewise, acetate and silicon-rubber replicas of various fracture surfaces were obtained for later macroscopic and fractographic examination. This paper will detail the efforts and findings from the initial investigation, as well as present results obtained by the later examination and analysis of debris from the Orbiter including its windows, bulkhead structures, and other components which had not been examined during the primary investigation.

McDanels, Steven J.

2006-01-01

289

Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology  

NASA Technical Reports Server (NTRS)

The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

Knight, Norman F.

1998-01-01

290

Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis  

NASA Technical Reports Server (NTRS)

This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

Babcock, P.; Schor, A.; Rosch, G.

1998-01-01

291

A Rayleigh-Ritz analysis methodology for cutouts in composite structures  

NASA Technical Reports Server (NTRS)

A new Rayleigh-Ritz stress analysis methodology that was developed for composite panels containing cutouts is described. The procedure, which makes use of a general assumed displacement field, accommodates circular and elliptical cutouts in biaxially loaded rectangular composite panels. Symmetric integral padups around the cutout can be included in the analysis. Benchmark results are presented to demonstrate the accuracy of the technique. Strength predictions based on the average stress criterion are generated and compared with experimental data. Finally, the stress analysis methodology is integrated into a design procedure for sizing integral padups around circular cutouts, and a sample problem is solved to illustrate its use.

Russell, Steven G.

1991-01-01

292

Behavior of an heterogeneous annular FBR core during an unprotected loss of flow accident: Analysis of the primary phase with SAS-SFR  

SciTech Connect

In the framework of a substantial improvement in FBR core safety connected to the development of a new Gen IV reactor type, heterogeneous cores with innovative features have been carefully analyzed in France since 2009. At EDF R and D, the main goal is to understand whether a strong reduction of the Na-void worth - possibly attaining a negative value - allows a significant improvement of the core behavior during an unprotected loss of flow accident. Also, the physical behavior of such a core is of interest, before and beyond the (possible) onset of Na boiling. Hence, a cutting-edge heterogeneous design, featuring an annular shape, Na-plena with a B{sub 4}C plate and a stepwise modulation of fissile core heights, was developed at EDF by means of the SDDS methodology, with a total Na-void worth of -1 $. The behavior of such a core during the primary phase of a severe accident, initiated by an unprotected loss of flow, is analyzed by means of the SAS-SFR code. This study is carried out at KIT and EDF, in the framework of a scientific collaboration on innovative FBR severe accident analyses. The results show that the reduction of the Na-void worth is very effective, but is not sufficient alone to avoid Na boiling and, hence, to prevent the core from entering into the primary phase of a severe accident. Nevertheless, the grace time up to boiling onset is greatly enhanced in comparison to a more traditional homogeneous core design, and only an extremely low fraction of the fuel (<0.1%) enters into melting at the end of this phase. A sensitivity analysis shows that, due to the inherent neutronic characteristics of such a core, the gagging scheme plays a major role in the core behavior: indeed, an improved 4-zone gagging scheme, associated with an enhanced control rod drive line expansion feed-back effect, finally prevents the core from entering into sodium boiling.
This major conclusion highlights both the progress already accomplished and the need for more detailed future analyses, particularly concerning: the neutronic burn-up scheme, the modeling of the diagrid effect and the control rod drive line expansion feed-backs, as well as the primary/secondary systems thermal-hydraulics behavior. (authors)

Massara, S.; Schmitt, D.; Bretault, A.; Lemasson, D.; Darmet, G.; Verwaerde, D. [EDF R and D, 1, Avenue du General de Gaulle, 92141 Clamart (France); Struwe, D.; Pfrang, W.; Ponomarev, A. [Karlsruher Institut fuer Technologie KIT, Institut fuer Neutronenphysik und Reaktortechnik INR, Hermann-von-Helmholtz-Platz 1, Gebaude 521, 76344 Eggenstein-Leopoldshafen (Germany)

2012-07-01

293

A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children  

ERIC Educational Resources Information Center

Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

2012-01-01

294

Acid rain research: a review and analysis of methodology  

Microsoft Academic Search

The acidic deposition phenomenon, when implicated as a factor potentially responsible for crop and forest yield losses and the destruction of aquatic life, has gained increasing attention. The widespread fear that acid rain is having or may have devastating effects has prompted international debates and legislative proposals. An analysis of research on the effects of acid rain, however, reveals serious questions

1983-01-01

295

Cultural Performance Analysis Spheres: An Integrated Ethnographic Methodology  

Microsoft Academic Search

This article describes a process-oriented method called cultural performance analysis spheres. This method is designed for researchers to use at three stages of ethnographic studies on public-cultural performances: (1) as preparation before going into the field, (2) by a participant observer during the performance, and (3) as a guide for communicating scholarly discourse that must necessarily give linear form to

Kristin Bervig Valentine; Gordon Matsumoto

2001-01-01

296

Robust Methodology for Fractal Analysis of the Retinal Vasculature  

Microsoft Academic Search

We have developed a robust method to perform retinal vascular fractal analysis from digital retina images. The technique preprocesses the green channel retina images with Gabor wavelet transforms to enhance the retinal images. Fourier Fractal dimension is computed on these preprocessed images and does not require any segmentation of the vessels. This novel technique requires human input only at a

Mohd Zulfaezal Che Azemin; Dinesh Kant Kumar; Tien Y. Wong; Ryo Kawasaki; Paul Mitchell; Jie Jin Wang

2011-01-01

297

Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications  

NASA Technical Reports Server (NTRS)

An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.

Taylor, Arthur C., III; Hou, Gene W.

1996-01-01
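The incremental ("delta" or "correction") form described in the record above replaces a direct solve of A·x = b with repeated solves of M·Δx = r, where M is an easily inverted approximation to A and r = b − A·x is the current residual. A toy sketch in which M is simply the diagonal of A (a Jacobi-like stand-in for the spatially split approximate factorization):

```python
# Sketch of the incremental ("delta") iterative form: instead of solving A*x = b
# directly, repeatedly solve M*dx = r with an approximate operator M and
# residual r = b - A*x. Here M = diag(A) stands in for the approximate
# factorization used in the actual CFD codes.

def incremental_solve(A, b, iterations=200):
    n = len(b)
    x = [0.0] * n
    for _ in range(iterations):
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        dx = [r[i] / A[i][i] for i in range(n)]     # solve M*dx = r, M = diag(A)
        x = [x[i] + dx[i] for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = incremental_solve(A, b)   # converges to the solution of A x = b
```

The key property the abstract highlights survives even in this toy: a crude M only slows convergence of the corrections, whereas iterating on the standard (non-incremental) form with an ill-conditioned A can fail outright.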

298

Improved finite element methodology for integrated thermal structural analysis  

NASA Technical Reports Server (NTRS)

An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

Dechaumphai, P.; Thornton, E. A.

1982-01-01

299

Statistical theory and methodology for remote sensing data analysis  

NASA Technical Reports Server (NTRS)

A model is developed for the evaluation of acreages (proportions) of different crop types over a geographical area using a classification approach, and methods for estimating the crop acreages are given. In estimating the acreage of a specific crop type such as wheat, it is suggested to treat the problem as a two-crop problem: wheat vs. non-wheat, since this simplifies the estimation problem considerably. The error analysis and the sample-size problem are investigated for the two-crop approach. Certain numerical results for sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large-area crop acreage inventory, a sampling scheme is suggested for acquiring sample data, and the problem of crop acreage estimation and the error analysis is discussed.

Odell, P. L.

1974-01-01
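One standard way to estimate a two-crop proportion from classifier output (a plausible reading of the approach in the record above, not necessarily the paper's exact estimator): if the classifier detects wheat with probability p_d and mislabels non-wheat as wheat with probability p_fa, the observed wheat fraction q satisfies q = p·p_d + (1 − p)·p_fa, which inverts to the estimate below:

```python
# Sketch of a corrected two-crop (wheat vs. non-wheat) proportion estimate.
# Assumption (not taken from the paper): q = p*p_d + (1 - p)*p_fa, where p_d is
# the detection rate and p_fa the false-alarm rate of the classifier.

def wheat_proportion(q, p_d, p_fa):
    """Invert the observed classified fraction q to the true proportion p."""
    return (q - p_fa) / (p_d - p_fa)

# Hypothetical rates: 90% detection, 10% false alarm, 38% of pixels called wheat
p_hat = wheat_proportion(q=0.38, p_d=0.9, p_fa=0.1)   # ≈ 0.35
```

The error analysis in the paper then asks how many sample pixels are needed for p_hat to reach a target precision.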

300

Methodology of a Cladistic Analysis: How to Construct Cladograms  

NSDL National Science Digital Library

This outline details the six steps necessary for completing a cladistic analysis. The final step is to build the cladogram, following the rules that: all taxa go on the endpoints of the cladogram (never at the nodes), all cladogram nodes must have a list of synapomorphies which are common to all taxa above the nodes, and all synapomorphies appear on the cladogram only once unless the characteristic was derived separately by evolutionary parallelism. The site then explains how to test your cladogram.
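The node-labeling rule in the outline above (every node lists the synapomorphies common to all taxa above it) can be checked mechanically: a candidate clade's synapomorphies are the derived characters shared by every member and absent outside it. A toy sketch with invented taxa and characters:

```python
# Sketch of the node-labeling step of a cladistic analysis: for a candidate
# clade, list the synapomorphies (derived characters shared by every member and
# absent outside the clade). The taxa and characters are a made-up toy example.

def synapomorphies(clade, taxa):
    """taxa: {name: set of derived characters}; clade: subset of the names."""
    inside = set.intersection(*(taxa[t] for t in clade))
    outside = set().union(*(taxa[t] for t in taxa if t not in clade))
    return inside - outside

taxa = {
    "lamprey": set(),
    "trout":   {"jaws"},
    "lizard":  {"jaws", "four limbs"},
    "mouse":   {"jaws", "four limbs", "fur"},
}
node = synapomorphies({"lizard", "mouse"}, taxa)   # {"four limbs"}
```

A clade with an empty result has no supporting synapomorphies and, by the rules in the outline, should not appear as a node on the cladogram.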

301

Traffic Analysis and Road Accidents: A Case Study of Hyderabad using GIS  

NASA Astrophysics Data System (ADS)

Globalization has impacted many developing countries across the world, and India is one that has benefited the most. Increased economic activity raised consumption levels across the country, creating scope for growth in travel and transportation. The increase in vehicles over the last 10 years has put great pressure on the existing roads, ultimately resulting in road accidents. It is estimated that since 2001 two-wheeler vehicles have increased by 202 percent and four-wheeler vehicles by 286 percent, with no road expansion. Motor vehicle crashes are a common cause of death, disability and demand for emergency medical care. Globally, more than 1 million people die each year from traffic crashes and about 20-50 million are injured or permanently disabled. Road accidents in Hyderabad have shown an increasing trend over the past few years. GIS helps in locating accident hotspots and in analyzing the trend of road accidents in Hyderabad.
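The hotspot-location step the abstract attributes to GIS can be approximated, at its simplest, by binning accident coordinates into grid cells and ranking the cells by count. The coordinates below are made up for illustration, not Hyderabad data:

```python
# Minimal sketch of grid-based hotspot detection (hypothetical coordinates):
# bin accident locations into square cells and rank cells by accident count.
from collections import Counter

def hotspots(points, cell_size, top_n=3):
    """Return the top_n (cell, count) pairs for a set of (x, y) points."""
    cells = Counter((int(x // cell_size), int(y // cell_size)) for x, y in points)
    return cells.most_common(top_n)

accidents = [(0.1, 0.2), (0.3, 0.4), (0.2, 0.1),  # cluster in cell (0, 0)
             (5.1, 5.2), (5.3, 5.4),              # cluster in cell (5, 5)
             (9.9, 0.1)]                           # isolated point

print(hotspots(accidents, cell_size=1.0))
# [((0, 0), 3), ((5, 5), 2), ((9, 0), 1)]
```

Real GIS hotspot analyses typically use kernel density estimation or spatial clustering rather than fixed grids, but the binning above conveys the core idea.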

Bhagyaiah, M.; Shrinagesh, B.

2014-06-01

302

An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.  

SciTech Connect

This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
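The statement that "probabilistic branching is integral to the object model definition" can be illustrated with a minimal event-tree enumeration, where each scenario's likelihood is the product of its branch probabilities. The events and numbers below are hypothetical, not from the OBEST runway-incursion model:

```python
# Hedged sketch of the core idea behind OBEST-style probabilistic branching
# (illustrative only, not the OBEST code): each branch carries a probability,
# and every enumerated scenario accumulates the product of its branches.
from itertools import product

# Hypothetical runway-incursion events with (outcome, probability) branches
events = {
    "tower_alert":  [("issued", 0.95), ("missed", 0.05)],
    "pilot_action": [("go_around", 0.90), ("continues", 0.10)],
}

def enumerate_scenarios(events):
    """Enumerate all branch combinations with their likelihoods."""
    names = list(events)
    scenarios = []
    for combo in product(*(events[n] for n in names)):
        outcome = {n: o for n, (o, _) in zip(names, combo)}
        likelihood = 1.0
        for _, p in combo:
            likelihood *= p
        scenarios.append((outcome, likelihood))
    return scenarios

for outcome, p in enumerate_scenarios(events):
    print(outcome, round(p, 4))
# Four scenarios whose probabilities sum to 1
```

OBEST's advantage, per the abstract, is that event *ordering* need not be fixed in advance; the fixed-order enumeration here is only the simplest special case.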

Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

2003-09-01

303

Preclosure radiological safety analysis for accident conditions of the potential Yucca Mountain Repository: Underground facilities; Yucca Mountain Site Characterization Project  

SciTech Connect

This preliminary preclosure radiological safety analysis assesses the scenarios, probabilities, and potential radiological consequences associated with postulated accidents in the underground facility of the potential Yucca Mountain repository. The analysis follows a probabilistic-risk-assessment approach. Twenty-one event trees resulting in 129 accident scenarios are developed. Most of the scenarios have estimated annual probabilities ranging from 10^-11/yr to 10^-5/yr. The study identifies 33 scenarios that could result in offsite doses over 50 mrem and that have annual probabilities greater than 10^-9/yr. The largest offsite dose is calculated to be 220 mrem, which is less than the 500 mrem value used to define items important to safety in 10 CFR 60. The study does not include an estimate of uncertainties; therefore, conclusions or decisions made as a result of this report should be made with caution.
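The screening criterion described (annual probability above 10^-9/yr combined with an offsite dose above 50 mrem) reduces to a simple filter. The scenario records below are invented for illustration, not taken from the report's 129 scenarios:

```python
# Illustrative screening in the spirit of the study (hypothetical data):
# keep scenarios whose annual probability exceeds 1e-9/yr and whose
# offsite dose exceeds 50 mrem.
scenarios = [
    {"id": "drop-cask",  "prob_per_yr": 2e-7,  "dose_mrem": 220},
    {"id": "fire-drift", "prob_per_yr": 5e-10, "dose_mrem": 300},  # too rare
    {"id": "rockfall",   "prob_per_yr": 1e-6,  "dose_mrem": 12},   # low dose
]

screened = [s for s in scenarios
            if s["prob_per_yr"] > 1e-9 and s["dose_mrem"] > 50]
print([s["id"] for s in screened])  # ['drop-cask']
```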

Ma, C.W.; Sit, R.C.; Zavoshy, S.J.; Jardine, L.J. [Bechtel National, Inc., San Francisco, CA (United States); Laub, T.W. [Sandia National Labs., Albuquerque, NM (United States)

1992-06-01

304

Sleep, watchkeeping and accidents: a content analysis of incident at sea reports  

Microsoft Academic Search

The unique profession of seafaring involves rest and sleep in a 24-h-a-day work environment that usually involves time-zone crossings, noise, heat, cold and motion. Sleep under such conditions is often difficult to obtain, and sleep loss is often related to fatigue and contributes to accidents. This study aims to determine how accident investigators report sleep in Incident at Sea reports.

Richard Phillips

2000-01-01

305

Analysis of Severe Accident Management Strategy for a BWR4 Nuclear Power Plant  

Microsoft Academic Search

The Chinshan nuclear power plant (NPP) is a Mark-I boiling water reactor (BWR) NPP located in northern Taiwan. The Chinshan NPP severe accident management guidelines (SAMGs), based on the BWR Owners Group Emergency Procedure Guidelines/Severe Accident Guidelines, were developed at the end of 2003. The MAAP4 code has been used as a tool to validate the SAMG

T.-C. Wang; S.-J. Wang; J.-T Teng

2005-01-01

306

Injuries of the pelvic ring in road traffic accidents: a medical and technical analysis  

Microsoft Academic Search

Between 1985 and 1995, 9380 traffic accidents occurring in the area of Hannover, Germany, were analysed; 12,428 individuals had been injured and 387 (3.1%) had sustained a pelvic-ring injury (AIS pelvis > 2). In 131 cases (34%), the injuries were further classified (Pennal and Tile) and a technical reconstruction of the accident was made: 52% were type A, 27% type B and 21% type C

M. Richter; D. Otte; A. Gänsslen; H. Bartram; T. Pohlemann

2001-01-01

307

Immunoassay Methods and their Applications in Pharmaceutical Analysis: Basic Methodology and Recent Advances  

PubMed Central

Immunoassays are bioanalytical methods in which the quantitation of the analyte depends on the reaction of an antigen (analyte) and an antibody. Immunoassays have been widely used in many important areas of pharmaceutical analysis such as diagnosis of diseases, therapeutic drug monitoring, and clinical pharmacokinetic and bioequivalence studies in drug discovery and the pharmaceutical industry. The importance and widespread use of immunoassay methods in pharmaceutical analysis are attributed to their inherent specificity, high throughput, and high sensitivity for the analysis of a wide range of analytes in biological samples. Recently, marked improvements have been achieved in the field of immunoassay development for the purposes of pharmaceutical analysis. These improvements involved the preparation of unique immunoanalytical reagents, the analysis of new categories of compounds, methodology, and instrumentation. The basic methodologies and recent advances in immunoassay methods applied in different fields of pharmaceutical analysis are reviewed here. PMID:23674985

Darwish, Ibrahim A.

2006-01-01

308

Methodology for Establishment of Integrated Flood Analysis System  

NASA Astrophysics Data System (ADS)

Flood risk management efforts face considerable uncertainty in flood hazard delineation as a consequence of changing climatic conditions, including shifts in precipitation, soil moisture, and land use. These changes can confound efforts to characterize flood impacts over decadal time scales and thus raise questions about the true benefits and drawbacks of alternative flood management projects, including those of a structural and non-structural nature. Here we report an integrated flood analysis system designed to bring climate change information into a flood risk context and characterize flood hazards in both rural and urban areas. A distributed rainfall-runoff model, the one-dimensional (1D) NWS-FLDWAV model, the 1D Storm Water Management Model (SWMM) and the two-dimensional (2D) BreZo model are coupled. The distributed model, using multi-directional flow allocation and real-time updating, is used for rainfall-runoff analysis in ungauged watersheds, and its outputs are taken as boundary conditions for the FLDWAV model, which is employed for 1D river hydraulic routing and for predicting the overflow discharge at levees that are overtopped. In addition, SWMM is chosen to analyze storm sewer flow in urban areas, and BreZo is used to estimate the inundation zones, depths and velocities on the land surface due to surcharge flow in the sewer system or overflow at levees. The overflow from FLDWAV and the surcharged flow from SWMM become point sources in BreZo. Applications in Korea and California are presented.

Kim, B.; Sanders, B. F.; Kim, K.; Han, K.; Famiglietti, J. S.

2012-12-01

309

An activity-based methodology for operations cost analysis  

NASA Technical Reports Server (NTRS)

This report describes an activity-based cost estimation method, proposed for the Space Exploration Initiative (SEI), as an alternative to NASA's traditional mass-based cost estimation method. A case study demonstrates how the activity-based cost estimation technique can be used to identify the operations that have a significant impact on costs over the life cycle of the SEI. The case study yielded an operations cost of $101 billion for the 20-year span of the lunar surface operations for the Option 5a program architecture. In addition, the results indicated that the support and training costs for the missions were the greatest contributors to the annual cost estimates. A cost-sensitivity analysis of the cultural and architectural drivers determined that the length of training and the amount of support associated with the ground support personnel for mission activities are the most significant cost contributors.
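Activity-based costing of this kind sums, over activities, a cost-driver quantity multiplied by a rate per driver unit. A minimal sketch with hypothetical activities and rates (not the report's SEI figures):

```python
# Minimal activity-based costing sketch (hypothetical quantities and rates):
# annual cost = sum over activities of (driver quantity) x (rate per unit).
activities = {
    # activity: (driver quantity per year, cost per driver unit in $M)
    "crew_training":  (40, 2.5),   # 40 training-weeks at $2.5M each
    "ground_support": (120, 1.0),  # 120 staff-months at $1.0M each
    "logistics":      (6, 15.0),   # 6 resupply flights at $15M each
}

annual_cost = sum(qty * rate for qty, rate in activities.values())
print(f"annual operations cost: ${annual_cost:.0f}M")  # $310M
```

Mass-based estimation, by contrast, would regress cost against vehicle or payload mass; the activity formulation is what lets the case study attribute cost to support and training drivers.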

Korsmeyer, David; Bilby, Curt; Frizzell, R. A.

1991-01-01

310

UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY  

SciTech Connect

It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution and several researchers are making headway in this problem. However, the inability to easily determine the magnitude of the building’s effective thermal mass, and how the heating ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
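The modal idea carried over from structural mechanics can be seen on an idealized two-mass thermal network: writing dT/dt = A·T, the eigenvalues of A are the decay rates of the energy-transfer modes. The rate constants below are hypothetical, and the model is far simpler than the paper's building model:

```python
# Sketch of thermal modal analysis on a two-mass network (idealized, not the
# paper's model): two insulated masses exchanging heat, dT/dt = A*T.
import math

k1, k2 = 0.5, 0.2          # hypothetical exchange rates (1/hr)
A = [[-k1, k1],
     [k2, -k2]]            # insulated pair: heat only moves between masses

# 2x2 eigenvalues from trace and determinant
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr * tr - 4 * det)
lam_fast, lam_slow = (tr - disc) / 2, (tr + disc) / 2

# Slow mode ~0: total stored energy is conserved when the pair is insulated.
# Fast mode ~-(k1 + k2): rate at which the masses equilibrate internally.
print(lam_slow, lam_fast)
```

Identifying such decay rates from measured temperature data, as the paper does for a real commercial building, is what reveals the effective thermal mass available to the HVAC system.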

John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

2013-07-01

311

Improved finite element methodology for integrated thermal structural analysis  

NASA Technical Reports Server (NTRS)

An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analyses is presented. New thermal finite elements which yield exact nodal and element temperature for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal-structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

Dechaumphai, P.; Thornton, E. A.

1982-01-01

312

APT Blanket System Loss-of-Flow Accident (LOFA) Analysis Based on Initial Conceptual Design - Case 1: with Beam Shutdown and Active RHR  

SciTech Connect

This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report.

Hamm, L.L.

1998-10-07

313

APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip  

SciTech Connect

This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

Hamm, L.L.

1998-10-07

314

Methodology for object-oriented real-time systems analysis and design: Software engineering  

NASA Technical Reports Server (NTRS)

Successful application of software engineering methodologies requires an integrated analysis and design life cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the systems-analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
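Approach (2) can be sketched as an analysis object whose states and transition rules carry over unchanged into the software object. The 'valve' entity and its events below are hypothetical, not the article's example:

```python
# Illustrative sketch (not the article's notation) of a "real-time
# systems-analysis object": an entity defined by states and state-transition
# rules, which maps directly onto a software object at implementation time.

class AnalysisObject:
    def __init__(self, name, initial, transitions):
        self.name = name
        self.state = initial
        # transitions: {(current_state, event): next_state}
        self.transitions = transitions

    def handle(self, event):
        """Apply a transition rule if one matches; otherwise hold state."""
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        return self.state

valve = AnalysisObject("valve", "closed", {
    ("closed", "open_cmd"):   "opening",
    ("opening", "limit_hit"): "open",
    ("open", "close_cmd"):    "closed",
})

for ev in ["open_cmd", "limit_hit", "close_cmd"]:
    print(ev, "->", valve.handle(ev))
# open_cmd -> opening; limit_hit -> open; close_cmd -> closed
```

The seamlessness claimed by the second approach is visible here: the state set and transition table written during analysis *are* the software object's definition, with no restructuring at the design boundary.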

Schoeffler, James D.

1991-01-01

315

Methodological and computational considerations for multiple correlation analysis.  

PubMed

The squared multiple correlation coefficient has been widely employed to assess the goodness-of-fit of linear regression models in many applications. Although there are numerous published sources that present inferential issues and computing algorithms for multinormal correlation models, the statistical procedure for testing substantive significance by specifying the nonzero-effect null hypothesis has received little attention. This article emphasizes the importance of determining whether the squared multiple correlation coefficient is small or large in comparison with some prescribed standard and develops corresponding Excel worksheets that facilitate the implementation of various aspects of the suggested significance tests. In view of the extensive accessibility of Microsoft Excel software and the ultimate convenience of general-purpose statistical packages, the associated computer routines for interval estimation, power calculation, and sample-size determination are also provided for completeness. The statistical methods and available programs of multiple correlation analysis described in this article purport to enhance pedagogical presentation in academic curricula and practical application in psychological research. PMID:18183885
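For the two-predictor case, the squared multiple correlation has a closed form in the pairwise correlations, and the conventional F statistic for the zero-effect null follows directly. The sketch below (plain Python with made-up data, not the article's Excel worksheets) computes both; testing a nonzero-effect null as the article advocates additionally requires the noncentral F distribution, which is omitted here:

```python
# Sketch: squared multiple correlation R^2 for two predictors from pairwise
# Pearson correlations, plus the usual F statistic for the zero-effect null.
import math

def pearson_r(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def r_squared_two_predictors(y, x1, x2):
    """R^2 = (r_y1^2 + r_y2^2 - 2*r_y1*r_y2*r_12) / (1 - r_12^2)."""
    ry1, ry2, r12 = pearson_r(y, x1), pearson_r(y, x2), pearson_r(x1, x2)
    return (ry1**2 + ry2**2 - 2 * ry1 * ry2 * r12) / (1 - r12**2)

# Hypothetical sample (n = 6)
y  = [2.1, 2.9, 4.2, 4.8, 6.1, 7.0]
x1 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x2 = [1.2, 0.8, 1.9, 1.4, 2.3, 2.1]

R2 = r_squared_two_predictors(y, x1, x2)
n, k = len(y), 2
F = (R2 / k) / ((1 - R2) / (n - k - 1))  # F(k, n-k-1) under the zero null
print(round(R2, 4), round(F, 2))
```

A p-value for either null would be read off the (central or noncentral) F distribution, e.g. via a statistics package.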

Shieh, Gwowen; Kung, Cmen-Feng

2007-11-01

316

Acid rain research: a review and analysis of methodology  

SciTech Connect

The acidic deposition phenomenon, implicated as a factor potentially responsible for crop and forest yield losses and destruction of aquatic life, has gained increasing attention. The widespread fear that acid rain is having, or may have, devastating effects has prompted international debates and legislative proposals. An analysis of research on the effects of acid rain, however, reveals serious questions concerning the applicability and validity of the conclusions of much of the work, and thus conclusive estimates of impacts are lacking. In order to establish cause-effect relationships between rain acidity and the response of a receptor, controlled studies are necessary to verify observations in the field, since there are many natural processes that produce and consume acidity and because numerous other environmental variables affect ecosystem response. Only when the response of an entire system is understood (i.e., interactions between plants, soil, soil microbes, and groundwater) can economic impacts be assessed and tolerance thresholds established for the wet deposition of acids. 14 references, 5 figures, 1 table.

Irving, P.M.

1983-01-01

317

Landscape equivalency analysis: methodology for estimating spatially explicit biodiversity credits.  

PubMed

We propose a biodiversity credit system for trading endangered species habitat designed to minimize and reverse the negative effects of habitat loss and fragmentation, the leading cause of species endangerment in the United States. Given the increasing demand for land, approaches that explicitly balance economic goals against conservation goals are required. The Endangered Species Act balances these conflicts based on the cost to replace habitat. Conservation banking is a means to manage this balance, and we argue for its use to mitigate the effects of habitat fragmentation. Mitigating the effects of land development on biodiversity requires decisions that recognize regional ecological effects resulting from local economic decisions. We propose Landscape Equivalency Analysis (LEA), a landscape-scale approach similar to habitat equivalency analysis (HEA), as an accounting system to calculate conservation banking credits so that habitat trades do not exacerbate regional ecological effects of local decisions. Credits purchased by public agencies or NGOs for purposes other than mitigating a take create a net investment in natural capital leading to habitat defragmentation. Credits calculated by LEA use metapopulation genetic theory to estimate sustainability criteria against which all trades are judged. The approach is rooted in well-accepted ecological, evolutionary, and economic theory, which helps compensate for the degree of uncertainty regarding the effects of habitat loss and fragmentation on endangered species. LEA requires application of greater scientific rigor than typically applied to endangered species management on private lands but provides an objective, conceptually sound basis for achieving the often conflicting goals of economic efficiency and long-term ecological sustainability. PMID:16132443

Bruggeman, Douglas J; Jones, Michael L; Lupi, Frank; Scribner, Kim T

2005-10-01

318

Retrospection of Chernobyl nuclear accident for decision analysis concerning remedial actions in Ukraine  

SciTech Connect

The efficacy of decisions concerning remedial actions is considered for cases in which off-site radiological monitoring in the early and (or) intermediate phases was absent or uninformative. There are examples of such situations in the former Soviet Union where many people have been exposed: releases of radioactive materials from 'Krasnoyarsk-26' into the Enisey River, releases of radioactive materials from 'Chelyabinsk-65' (the Kyshtym accident), nuclear tests at the Semipalatinsk Test Site, the Chernobyl nuclear accident, etc. If monitoring in the early and (or) intermediate phases is absent, decisions concerning remedial actions are usually developed on the basis of permanent monitoring. However, decisions of this kind may be essentially erroneous. For these cases it is proposed to reconstruct retrospectively the radiological data of the early and intermediate phases of the nuclear accident and to design decisions concerning remedial actions on the basis of both the retrospective data and the permanent monitoring data. In this report the problem is considered using the example of the Chernobyl accident in Ukraine, where off-site radiological monitoring in the early and intermediate phases was unsatisfactory. In particular, pasture-cow-milk monitoring had not been performed. All official dose estimations had been made on the basis of measurements of 137Cs in the body (40 measurements within 135 days and 55 measurements within 229 days after the Chernobyl accident). For the retrospection of radiological data of the Chernobyl accident, a dynamic model has been developed, with a structure similar to those of the Pathway and Farmland models. Parameters of the model have been identified for the agricultural conditions of Russia and Ukraine. By means of this model, the dynamics of 20 radionuclides in pathways and the dynamics of doses have been estimated for the early, intermediate and late phases of the Chernobyl accident.
The main results are as follows: - During the first year after the Chernobyl accident, 75-93% of the committed effective dose had been formed; - During the first year, 85-90% of the damage from radiation exposure had been formed, while during the next 50 years (the late phase of the accident) only 10-15% of the damage will be formed; - Remedial actions in Ukraine (agricultural remedial actions being the most effective) are intended to reduce the damage from consumption of products contaminated in the late phase of the accident, i.e., agricultural remedial actions address only about 10% of the total damage from radiation exposure; - Medical countermeasures can reduce radiation exposure damage by an order of magnitude more than agricultural countermeasures; - Thus, retrospection of the nuclear accident has essentially changed the type of remedial actions and offered a chance to increase the effectiveness of spending by an order of magnitude. This example illustrates that, to optimize remedial actions, retrospective nuclear accident data should be used in all cases where monitoring in the early and (or) intermediate phases is unsatisfactory. (author)

Georgievskiy, Vladimir [Russian Research Center 'Kurchatov Insitute', Kurchatov Sq., 1, 123182 Moscow (Russian Federation)

2007-07-01

319

Analysis of loss-of-coolant and loss-of-flow accidents in the first wall cooling system of NET/ITER  

NASA Astrophysics Data System (ADS)

This paper presents the thermal-hydraulic analysis of potential accidents in the first wall cooling system of the Next European Torus or the International Thermonuclear Experimental Reactor. Three ex-vessel loss-of-coolant accidents, two in-vessel loss-of-coolant accidents, and three loss-of-flow accidents have been analyzed using the thermal-hydraulic system analysis code RELAP5/MOD3. The analyses deal with the transient thermal-hydraulic behavior inside the cooling systems and the temperature development inside the nuclear components during these accidents. The analysis of the different accident scenarios has been performed without operation of emergency cooling systems. The results of the analyses indicate that a loss of forced coolant flow through the first wall rapidly causes dryout in the first wall cooling pipes. Following dryout, melting in the first wall starts within about 130 s in case of ongoing plasma burning. In case of large break LOCAs and ongoing plasma burning, melting in the first wall starts about 90 s after accident initiation.

Komen, E. M. J.; Koning, H.

1994-03-01

320

WASTE-ACC: A computer model for analysis of waste management accidents  

SciTech Connect

In support of the U.S. Department of Energy's (DOE's) Waste Management Programmatic Environmental Impact Statement, Argonne National Laboratory has developed WASTE-ACC, a computational framework and integrated PC-based database system, to assess atmospheric releases from facility accidents. WASTE-ACC facilitates the many calculations for the accident analyses necessitated by the numerous combinations of waste types, waste management process technologies, facility locations, and site consolidation strategies in the waste management alternatives across the DOE complex. WASTE-ACC is a comprehensive tool that can effectively test future DOE waste management alternatives and assumptions. The computational framework can access several relational databases to calculate atmospheric releases. The databases contain throughput volumes, waste profiles, treatment process parameters, and accident data such as frequencies of initiators, conditional probabilities of subsequent events, and source term release parameters of the various waste forms under accident stresses. This report describes the computational framework and supporting databases used to conduct accident analyses and to develop source terms to assess potential health impacts that may affect on-site workers and off-site members of the public under various DOE waste management alternatives.
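The "source term release parameters ... under accident stresses" feed calculations of the general five-factor form used in DOE accident analysis (release = MAR x DR x ARF x RF x LPF), paired with a scenario frequency built from an initiator frequency and conditional probabilities. The numbers below are invented, and whether WASTE-ACC uses exactly this factorization is an assumption:

```python
# Sketch of the kind of calculation such a tool automates (hypothetical
# values; the five-factor source-term form follows DOE-HDBK-3010):
#   release   = MAR x DR x ARF x RF x LPF
#   frequency = initiator frequency x conditional probabilities

def source_term(mar_g, dr, arf, rf, lpf):
    """Respirable release (g): material-at-risk x damage ratio x airborne
    release fraction x respirable fraction x leak path factor."""
    return mar_g * dr * arf * rf * lpf

def scenario_frequency(initiator_per_yr, conditional_probs):
    """Scenario frequency (1/yr) along one event-tree branch."""
    freq = initiator_per_yr
    for p in conditional_probs:
        freq *= p
    return freq

rel = source_term(mar_g=1000.0, dr=0.1, arf=1e-3, rf=0.5, lpf=0.01)
frq = scenario_frequency(1e-2, [0.1, 0.5])  # e.g. fire given spill, etc.
print(rel, frq)  # ~5e-4 g released, at ~5e-4 per year
```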

Nabelssi, B.K.; Folga, S.; Kohout, E.J.; Mueller, C.J.; Roglans-Ribas, J.

1996-12-01

321

Analysis of main steam isolation valve leakage in design basis accidents using MELCOR 1.8.6 and RADTRAD.  

SciTech Connect

Analyses were performed using MELCOR and RADTRAD to investigate main steam isolation valve (MSIV) leakage behavior under design basis accident (DBA) loss-of-coolant accident (LOCA) conditions that are presumed to have led to a significant core melt accident. Doses to the control room, site boundary and low population zone (LPZ) are examined using both the approaches described in current regulatory guidelines and analyses based on a best-estimate source term and system response. At issue is the current practice of using containment airborne aerosol concentrations as a surrogate for the in-vessel aerosol concentration that exists in the near vicinity of the MSIVs. This study finds that the current practice of using the AST-based containment aerosol concentrations for assessing MSIV leakage is non-conservative and conceptually in error. A methodology is proposed that scales the containment aerosol concentration to the expected vessel concentration in order to preserve the simplified use of the AST in assessing containment performance under assumed DBA conditions. This correction is required during the first two hours of the accident, while the gap and early in-vessel source terms are present. It is general practice to assume that at ~2 hr, recovery actions to reflood the core will have been successful and that further core damage can be avoided. The analyses performed in this study determine that, after two hours, assuming vessel reflooding has taken place, the containment aerosol concentration can then conservatively be used as the effective source to the leaking MSIVs. Recommendations are provided concerning typical aerosol removal coefficients that can be used in the RADTRAD code to predict source attenuation in the steam lines, and on robust methods of predicting MSIV leakage flows based on measured MSIV leakage performance.

Salay, Michael (United States Nuclear Regulatory Commission, Washington, D.C.); Kalinich, Donald A.; Gauntt, Randall O.; Radel, Tracy E.

2008-10-01

322

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

323

Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models  

SciTech Connect

The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways NUREG-1829 information can be used to estimate LOCA frequencies were investigated and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.
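One standard way to turn "historical data from the initiating events database" into a frequency estimate is a Bayesian update of a Poisson occurrence rate with the Jeffreys prior, whose posterior mean is (n + 0.5)/T. The counts below are hypothetical, and this illustrates the general approach rather than the paper's specific two methods:

```python
# Sketch of a plant-type initiating-event frequency estimate (hypothetical
# counts): with a Jeffreys prior on a Poisson rate, the posterior mean
# given n events observed over T reactor-years of exposure is (n + 0.5)/T.

def jeffreys_rate(n_events, exposure_years):
    """Posterior-mean occurrence rate (events per reactor-year)."""
    return (n_events + 0.5) / exposure_years

# e.g. one SLOCA-class event observed over 3000 reactor-critical-years
print(jeffreys_rate(1, 3000.0))  # 0.0005
```

The +0.5 term is what keeps the estimate nonzero for events, such as large LOCAs, that have never been observed in the operating history.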

S. A. Eide; D. M. Rasmuson; C. L. Atwood

2008-09-01

324

What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?  

PubMed

Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source, it should be possible to combine the available information sets, allowing data from one source to compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey, and the second was insurance claims documents, consisting predominantly of claims forms completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provides more reliable and detailed pre-crash information than survey variables alone. However, driving-related distraction appears to be more difficult to capture.
In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such as in-depth accident investigations and pre-crash data recordings. PMID:23314359
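The compensation idea described above can be sketched as a toy record-linkage step. The field names, records, and deterministic matching rule below are hypothetical; the study used probabilistic linkage and a richer case analysis.

```python
# Sketch: combining two accident data sources so one compensates for
# gaps in the other. Records and field names are invented for illustration.

def link_records(survey, claims, keys=("date", "location")):
    """Pair records that agree on the linkage keys (a simplified,
    deterministic stand-in for probabilistic record linkage)."""
    pairs = []
    for s in survey:
        for c in claims:
            if all(s[k] == c[k] for k in keys):
                pairs.append((s, c))
    return pairs

def merged_evidence(pair, topic):
    """Take evidence on a topic (e.g. 'low_vigilance') from whichever
    source has it."""
    s, c = pair
    return s.get(topic) or c.get(topic)

survey = [{"date": "2006-05-01", "location": "E4", "low_vigilance": None}]
claims = [{"date": "2006-05-01", "location": "E4", "low_vigilance": "drowsy"}]

pairs = link_records(survey, claims)
evidence = merged_evidence(pairs[0], "low_vigilance")  # the claim fills the survey gap
```

Here the survey record is silent on low vigilance, and the linked insurance claim supplies it, which is exactly the compensation effect the abstract reports.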

Tivesten, Emma; Wiberg, Henrik

2013-03-01

325

Development of a new methodology for stability analysis in BWR NPP  

SciTech Connect

In this work, a new methodology to reproduce power oscillations in BWR NPPs is presented. The methodology comprises modal analysis techniques, signal analysis techniques, and simulation with the coupled code RELAP5/PARCSv2.7. Macroscopic cross sections are obtained using the SIMTAB methodology, which is fed with CASMO-4/SIMULATE-3 data. The input files for the neutronic and thermalhydraulic codes are obtained automatically, and the thermalhydraulic-to-neutronic representation (mapping) used is based on the fundamental, first, and second harmonic shapes of the reactor power, calculated with the VALKIN code (developed at UPV). This mapping was chosen in order not to condition the oscillation pattern. To introduce power oscillations into the simulation, a new capability has been implemented in the coupled code to generate density perturbations (both for the whole core and for chosen axial levels) according to the power mode shapes. The purpose of the methodology is to reproduce the driving mechanism of the out-of-phase oscillations that have appeared in BWR-type reactors. In this work, the methodology is applied to the Record 9 point, collected in the NEA benchmark of Ringhals 1 NPP. A set of different perturbations is induced in the first active axial level and the resulting LPRM signals are analyzed. (authors)
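A minimal sketch of the mode-shape mapping idea: a per-channel density perturbation that follows a first-harmonic power shape, so that half the core oscillates out of phase with the other half. The shape values, amplitude, and frequency below are invented, not taken from the Ringhals benchmark.

```python
import math

def density_perturbation(mode_shape, amplitude, t, freq_hz):
    """Per-channel density perturbation proportional to a power mode
    shape: channels with opposite-sign shape values are perturbed out
    of phase, mimicking an out-of-phase (first azimuthal mode) driver."""
    osc = amplitude * math.sin(2.0 * math.pi * freq_hz * t)
    return [phi * osc for phi in mode_shape]

# hypothetical first-harmonic shape over 4 radial channels: antisymmetric halves
mode1 = [1.0, 0.5, -0.5, -1.0]
pert = density_perturbation(mode1, amplitude=0.01, t=0.5, freq_hz=0.5)
```

Because the shape is antisymmetric, the perturbation applied to one core half is the negative of the other half's at every instant, which is what keeps the induced oscillation from being conditioned toward the in-phase pattern.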

Garcia-Fenoll, M.; Abarca, A.; Barrachina, T.; Miro, R.; Verdu, G. [Inst. for Industrial, Radiophysical and Environmental Safety ISIRYM, Universitat Politecnica de Valencia, Cami de Vera s/n, 46021 Valencia (Spain)

2012-07-01

326

The effects of aircraft certification rules on general aviation accidents  

NASA Astrophysics Data System (ADS)

The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed methods methodology which involves both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. 
The Chi-Square test indicated that there was no significant difference in the number of accidents among the different certification categories when either Controlled Flight into Terrain or Structural Failure was listed as the cause. However, there was a significant difference in the frequency of accidents with regard to Loss of Control and Engine Failure accidents. The results of the ANCOVA test indicated that there was no significant difference in the accident rate with regard to Loss of Control, Controlled Flight into Terrain, or Structural Failure accidents. There was, however, a significant difference in Engine Failure accidents between Experimental-Amateur Built and the other categories. The text mining analysis of the narrative causes of Loss of Control accidents indicated that only the Civil Air Regulations 3 category airplanes had clusters of words associated with visual flight into instrument meteorological conditions. Civil Air Regulations 3 airplanes were designed and manufactured prior to the 1960s and in most cases have not been retrofitted to take advantage of newer technologies that could help prevent Loss of Control accidents. The study indicated that General Aviation aircraft certification rules do not have a statistically significant effect on aircraft accidents except for Loss of Control and Engine Failure. According to the literature, government oversight could have become an obstacle in the implementation of safety-enhancing equipment that could reduce Loss of Control accidents. Oversight should focus on ensuring that Experimental-Amateur Built aircraft owners perform a functional test that could prevent some of the Engine Failure accidents.
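The chi-square comparison of accident frequencies across certification categories can be illustrated with a hand-rolled statistic. The counts and the equal-frequency null hypothesis below are made up; the study's actual contingency tables and expected counts differ.

```python
# Hypothetical Loss of Control accident counts for the four categories
# examined: Part 23, CAR 3, Light Sport, Experimental-Amateur Built.

def chi_square(observed, expected):
    """Pearson chi-square statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [120, 150, 40, 90]
total = sum(observed)
# expected counts under a simple null of equal frequency per category
expected = [total / len(observed)] * len(observed)

stat = chi_square(observed, expected)  # compare against chi-square(df=3) critical value
```

With 3 degrees of freedom the 0.05 critical value is about 7.81, so a statistic this large would reject the null; in practice one would use a library routine (e.g. a contingency-table test) rather than this manual form.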

Anderson, Carolina Lenz

327

ELECTRICAL SIMULATION METHODOLOGY DEDICATED TO EMC DIGITAL CIRCUITS EMISSIONS ANALYSIS ON PCB  

E-print Network

ELECTRICAL SIMULATION METHODOLOGY DEDICATED TO EMC DIGITAL CIRCUITS EMISSIONS ANALYSIS ON PCB. Simulation tools are efficient and helpful for circuit designers separately in many domains (RF, low-signal circuits, electromagnetics, power circuits); nevertheless, they have to take into account…

Paris-Sud XI, Université de

328

Comparative Analysis of TopDown and Bottomup Methodologies for MultiAgent System Design  

E-print Network

Comparative Analysis of Top-Down and Bottom-up Methodologies for Multi-Agent System Design. ABSTRACT: Traditionally, top-down and bottom-up design approaches have competed with each other. General Terms: Design, Algorithms. Keywords: MAS design, top-down, bottom-up.

Galstyan, Aram

329

A comparison of segmental and wrist-to-ankle methodologies of bioimpedance analysis  

Microsoft Academic Search

The common approach of bioelectrical impedance analysis to estimate body water uses a wrist-to-ankle methodology which, although not indicated by theory, has the advantage of ease of application particularly for clinical studies involving patients with debilitating diseases. A number of authors have suggested the use of a segmented protocol in which the impedances of the trunk and limbs are measured
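Both protocols rest on the standard impedance index H²/Z for estimating total body water; the segmental protocol sums the series impedances of trunk and limbs instead of using a single wrist-to-ankle value. The coefficient and impedance values below are assumed for illustration, not the study's data.

```python
# Sketch of the two bioimpedance protocols compared in the study.
# k, heights, and impedances are illustrative values only.

def tbw_wrist_to_ankle(height_cm, z_ohm, k=0.59):
    """Whole-body water estimate from the impedance index H^2 / Z."""
    return k * height_cm ** 2 / z_ohm

def tbw_segmental(height_cm, z_arm, z_trunk, z_leg, k=0.59):
    """Segmental protocol: the body is modeled as segments in series,
    so the segment impedances add."""
    return k * height_cm ** 2 / (z_arm + z_trunk + z_leg)

whole = tbw_wrist_to_ankle(175.0, 500.0)
seg = tbw_segmental(175.0, 230.0, 45.0, 225.0)
```

In this toy case the segment impedances sum to the wrist-to-ankle value, so the two estimates coincide; the study's point is that in real subjects the trunk's low impedance makes the single-path estimate insensitive to trunk water.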

B. J. Thomas; B. H. Cornish; L. C. Ward; M. A. Patterson

1998-01-01

330

A Methodology for Empirical Analysis of Brain Connectivity through Graph Mining  

E-print Network

A Methodology for Empirical Analysis of Brain Connectivity through Graph Mining. Jiang Bian, Josh M. …, Brain Imaging Research Center, Psychiatric Research Institute, University of Arkansas for Medical Sciences. …structural and functional brain connectivity networks, and has helped researchers conceive the effects of neurological…

Xie, Mengjun

331

Validation of a methodology for fuel management analysis of Laguna Verde nuclear power plant  

Microsoft Academic Search

This paper shows the validation of the fuel management methodology based on the state of the art lattice physics code HELIOS and the CM-PRESTO code, for the fuel management analysis of the Laguna Verde nuclear power plant (LVNPP). The validation of these codes is performed with data from the first five operating cycles of LVNPP Unit 1. HELIOS calculations were

J. L François; J. L Esquivel; C Cortés; J Esquivias; C Mart??n del Campo

2001-01-01

332

Mapping the geogenic radon potential: methodology and spatial analysis for central Hungary  

E-print Network

Mapping the geogenic radon potential: methodology and spatial analysis for central Hungary. Available online 2013. Keywords: Hungarian geogenic radon potential; soil gas radon; soil gas permeability; spatial modeling. Abstract: A detailed geogenic radon potential (GRP) mapping based on field soil gas…

Horváth, Ákos

333

A cost–benefit analysis methodology for assessing product adoption by older user populations  

Microsoft Academic Search

A methodology for assessing perceptions of a product through cost–benefit analysis is demonstrated in an experimental study involving use of one of two products for supporting interaction with telephone voice menu systems. The method emphasizes ratings of constructs related to costs and benefits considered relevant to decisions regarding product adoption, and the use of the analytic hierarchy process technique to
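The analytic hierarchy process step mentioned above can be sketched with the common column-normalization approximation to the principal-eigenvector priority weights. The criteria and pairwise judgments below are hypothetical, not the study's constructs.

```python
# Minimal AHP weight sketch: normalize each column of the pairwise
# comparison matrix, then average across rows. Judgments are invented.

def ahp_weights(matrix):
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    # row means of the normalized matrix approximate the eigenvector weights
    return [sum(normalized[i]) / n for i in range(n)]

# hypothetical criteria: learning cost vs. task benefit vs. price
# matrix[i][j] = how strongly criterion i is preferred to criterion j
pairwise = [
    [1.0, 1.0 / 3.0, 2.0],
    [3.0, 1.0,       4.0],
    [0.5, 0.25,      1.0],
]
weights = ahp_weights(pairwise)  # weights sum to 1; "task benefit" dominates here
```

A full AHP application would also compute a consistency ratio on the judgment matrix before trusting the weights.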

Joseph Sharit; Sara J. Czaja; Dolores Perdomo; Chin Chin Lee

2004-01-01

334

Using Exergy Analysis Methodology to Assess the Heating Efficiency of an Electric Heat Pump  

E-print Network

Using exergy analysis methodology, the authors propose that assessment of heating efficiency should consider not only the COP (coefficient of performance) of the electric power heat pump set (EPHPS, or HP set), but also the exergy loss at the heat exchanger of the HP...
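A sketch of the second-law view the abstract argues for, using the usual exergy-efficiency form COP × (1 − T₀/T_supply): a high COP can still correspond to a modest exergy efficiency once the quality of the delivered heat is accounted for. The COP and temperatures are illustrative assumptions, not values from the paper.

```python
# Illustrative exergy (second-law) efficiency for heat-pump heating.

def exergy_efficiency(cop, t_supply_k, t_ambient_k):
    """COP scaled by the Carnot factor of the delivered heat:
    eta_ex = COP * (1 - T0 / T_supply), temperatures in kelvin."""
    return cop * (1.0 - t_ambient_k / t_supply_k)

# assumed values: COP 3.5, 45 degC supply water, 0 degC ambient
eta = exergy_efficiency(cop=3.5, t_supply_k=318.15, t_ambient_k=273.15)
```

Here a COP of 3.5 yields an exergy efficiency below 0.5, illustrating why the authors argue that COP alone overstates heating performance and that exchanger exergy losses matter.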

Ao, Y.; Duanmu, L.; Shen, S.

2006-01-01

335

Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*  

PubMed Central

In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

Rieder, Karen A.

1984-01-01

336

Applications of Tutoring Systems in Specialized Subject Areas: An Analysis of Skills, Methodologies, and Results.  

ERIC Educational Resources Information Center

This article reviews how tutoring systems have been applied across specialized subject areas (e.g., music, horticulture, health and safety, social interactions). It summarizes findings, provides an analysis of skills learned within each tutoring system, identifies the respective methodologies, and reports relevant findings, implications, and…

Heron, Timothy E.; Welsch, Richard G.; Goddard, Yvonne L.

2003-01-01

337

Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design  

ERIC Educational Resources Information Center

Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

Tajino, Akira; James, Robert; Kijima, Kyoichi

2005-01-01

338

Methodology for analysis of detention basins for control of urban runoff quality. Final report  

Microsoft Academic Search

The report describes an analysis methodology and presents graphs and example computations to guide planning-level evaluations and design decisions on two techniques for urban runoff quality control. The control techniques addressed, recharge or infiltration devices, and wet pond detention devices, were shown to be the most consistently effective methods of pollutant reduction of any of the Best Management Practices (BMP)

E. D. Driscoll; D. DiToro; D. Gaboury; P. Shelley

1986-01-01

339

Analysis of fMRI data sampled from Large Populations: Statistical and Methodological Issues  

E-print Network

Analysis of fMRI data sampled from Large Populations: Statistical and Methodological Issues. …the variability that is present in their datasets across subjects. In particular, this high degree of variability dramatically… Analysing the variability that arises between different datasets is not straightforward, since…

Paris-Sud XI, Université de

340

Measuring Team Shared Understanding Using the Analysis-Constructed Shared Mental Model Methodology  

ERIC Educational Resources Information Center

Teams are an essential part of successful performance in learning and work environments. Analysis-constructed shared mental model (ACSMM) methodology is a set of techniques where individual mental models are elicited and sharedness is determined not by the individuals who provided their mental models but by an analytical procedure. This method…

Johnson, Tristan E.; O'Connor, Debra L.

2008-01-01

341

Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys  

Cancer.gov

Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys. Cristine Delnevo, PhD, MPH, UMDNJ-School of Public Health. Why is methods research in Tobacco Surveillance important? Measuring individual behavior over time is crucial…

342

Success story in software engineering using NIAM (Natural language Information Analysis Methodology)  

SciTech Connect

To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

Eaton, S.M.; Eaton, D.S.

1995-10-01

343

A Comprehensive Analysis of the X-15 Flight 3-65 Accident  

NASA Technical Reports Server (NTRS)

The November 15, 1967, loss of X-15 Flight 3-65-97 (hereafter referred to as Flight 3-65) was a unique incident in that it was the first and only aerospace flight accident involving loss of crew on a vehicle with an adaptive flight control system (AFCS). In addition, Flight 3-65 remains the only instance of a single-pilot departure from controlled flight of a manned entry vehicle in a hypersonic flight regime. To mitigate risk to emerging aerospace systems, the NASA Engineering and Safety Center (NESC) proposed a comprehensive review of this accident. The goal of the assessment was to resolve lingering questions regarding the failure modes of the aircraft systems (including the AFCS) and thoroughly analyze the interactions among the human agents and autonomous systems that contributed to the loss of the pilot and aircraft. This document contains the outcome of the accident review.

Dennehy, Cornelius J.; Orr, Jeb S.; Barshi, Immanuel; Statler, Irving C.

2014-01-01

344

Launch Vehicle Fire Accident Preliminary Analysis of a Liquid-Metal Cooled Thermionic Nuclear Reactor: TOPAZ-II  

NASA Astrophysics Data System (ADS)

In this paper, a launch vehicle propellant fire accident analysis of the TOPAZ-II reactor has been performed with TATRHG(A), a thermionic reactor core analysis code developed by the author. When a rocket explodes on a launch pad, its payload (TOPAZ-II) can be subjected to a severe thermal environment from the resulting fireball. The extreme temperatures associated with propellant fires can create a destructive environment in or near the fireball. Different kinds of propellant, liquid and solid, which lead to different fire temperatures, are considered. Preliminary analysis shows that solid propellant fires can melt the entire toxic beryllium radial reflector.

Hu, G.; Zhao, S.; Ruan, K.

2012-01-01

345

A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)  

NASA Technical Reports Server (NTRS)

Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings presented over ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure and other high-level classifications in longitudinal studies of accident reports. Our results suggest that the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.

Johnson, C. W.; Holloway, C. M.

2007-01-01

346

Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes  

SciTech Connect

As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked by the Advanced Neutron Source dynamic model (ANSDM) and PRSDYN models. RELAP5 is a one-dimensional, two-phase transient code, developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations were included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant-accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

1993-12-01

347

Traffic accident in Cuiabá-MT: an analysis through the data mining technology.  

PubMed

Road traffic accidents (ATT) are non-intentional events of important magnitude worldwide, mainly in urban centers. This article aims to analyze data related to victims of ATT recorded by the Justice Secretariat and Public Security (SEJUSP) in hospital morbidity and mortality incidence in the city of Cuiabá-MT during 2006, using data mining technology. An observational, retrospective and exploratory study of the secondary databases was carried out. The three databases selected were linked using the probabilistic method, through the free software RecLink. One hundred and thirty-nine (139) real pairs of victims of ATT were obtained. To this linked database the data mining technology was applied with the software WEKA using the Apriori algorithm. The result generated the 10 best rules, six of which were retained according to the parameters established; these indicated useful and comprehensible knowledge for characterizing the victims of accidents in Cuiabá. Finally, the associative rules showed peculiarities of the road traffic accident victims in Cuiabá and highlight the need for preventive measures against collision accidents involving males. PMID:20841739
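The Apriori-style rule metrics behind the "10 best rules" can be sketched on toy transactions. The attributes below are invented stand-ins for the victim records; the study used WEKA's Apriori implementation on the linked database.

```python
# Tiny association-rule sketch: support and confidence of a rule
# {male, collision} => {injury} over hypothetical victim records.

transactions = [
    {"male", "collision", "injury"},
    {"male", "collision", "injury"},
    {"female", "fall"},
    {"male", "collision"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent | antecedent) estimated from the transactions."""
    return support(antecedent | consequent) / support(antecedent)

conf = confidence({"male", "collision"}, {"injury"})
```

Apriori itself prunes candidate itemsets by minimum support before computing confidences; this sketch only shows the two metrics by which the resulting rules are ranked.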

Galvão, Noemi Dreyer; de Fátima Marin, Heimar

2010-01-01

348

Independent assessment of MELCOR as a severe accident thermal-hydraulic\\/source term analysis tool  

Microsoft Academic Search

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to

I. K. Madni; F. Eltawila

1994-01-01

349

Analysis of the Effect of Speed Limit Increases on Accident-Injury Severities  

Microsoft Academic Search

The influence of speed limits on roadway safety has been a subject of continuous debate in the State of Indiana and nationwide. In Indiana, highway-related accidents result in about 900 fatalities and forty thousand injuries annually and place an incredible social and economic burden on the state. Still, speed limits posted on highways and other roads are routinely exceeded as

Nataliya V. Malyshkina; Fred L. Mannering

2008-01-01

350

Designing and evaluating a human factors investigation tool (HFIT) for accident analysis  

Microsoft Academic Search

In an attempt to improve the investigation of the human factors causes of accidents in the UK offshore oil and gas industry, a Human Factors Investigation Tool (HFIT) was developed with the sponsorship of the UK Regulator, the Health and Safety Executive, and four exploration-related companies. The tool was developed on a theoretical basis with reference to existing tools and

R. Gordon; R. Flin; K. Mearns

2005-01-01

351

Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.  

PubMed Central

A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357
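The life-table calculation used in the study can be sketched as a product of interval survival probabilities, from which cumulative incidence follows. The gestational intervals and counts below are invented for illustration, not the TMI registry data.

```python
# Life-table sketch of cumulative fetal-loss incidence.
# intervals: (number at risk entering the interval, losses in the interval)

def life_table_incidence(intervals):
    """Cumulative incidence 1 - prod(1 - q_i), where q_i is the
    conditional probability of loss in interval i."""
    survival = 1.0
    for at_risk, events in intervals:
        survival *= 1.0 - events / at_risk
    return 1.0 - survival

# hypothetical gestational intervals (weeks 5-8, 9-12, 13-16)
incidence = life_table_incidence([(200, 12), (185, 9), (170, 6)])
```

The life-table form matters here because pregnancies entered observation at different gestational ages; conditioning each interval on those still at risk avoids the bias of a crude proportion.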

Goldhaber, M K; Staub, S L; Tokuhata, G K

1983-01-01

352

An Analysis of Incident/Accident Reports from the Texas Secondary School Science Safety Survey, 2001  

ERIC Educational Resources Information Center

This study investigated safety in Texas secondary school science laboratory, classroom, and field settings. The Texas Education Agency (TEA) drew a random representative sample consisting of 199 secondary public schools in Texas. Eighty-one teachers completed Incident/Accident Reports. The reports were optional, anonymous, and open-ended; thus,…

Stephenson, Amanda L.; West, Sandra S.; Westerlund, Julie F.; Nelson, Nancy C.

2003-01-01

353

MELCOR Analysis of Steam Generator Tube Creep Rupture in Station Blackout Severe Accident  

Microsoft Academic Search

A pressurized water reactor steam generator tube rupture (SGTR) is of concern because it represents a bypass of the containment for radioactive materials to the environment. In a station blackout accident, tube integrity could be threatened by creep rupture, particularly if cracks are present in the tube walls. Methods are developed herein to improve assessment capabilities for SGTR by using

Y. Liao; K. Vierow

2005-01-01

354

Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications  

NASA Technical Reports Server (NTRS)

In this study involving advanced fluid flow codes, an incremental iterative formulation (also known as the delta or correction form) together with the well-known spatially-split approximate factorization algorithm, is presented for solving the very large sparse systems of linear equations which are associated with aerodynamic sensitivity analysis. For smaller 2D problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods are needed for larger 2D and future 3D applications, however, because direct methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioning of the coefficient matrix; this problem can be overcome when these equations are cast in the incremental form. These and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two sample airfoil problems: (1) subsonic low Reynolds number laminar flow; and (2) transonic high Reynolds number turbulent flow.
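The incremental (delta or correction) form can be sketched on a tiny linear system: each step solves M·Δx = b − A·x with an approximate operator M, then updates x. Here M is simply the diagonal of A (a Jacobi-style stand-in for the spatially-split approximate factorization used in the paper), and the 2×2 system is illustrative only.

```python
# Incremental iterative solution of A x = b: the linear solve is done
# on the correction dx, driven by the residual r = b - A x.

def incremental_solve(A, b, M_inv, x0, iters=50):
    x = list(x0)
    n = len(b)
    for _ in range(iters):
        # residual of the current iterate
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        # approximate solve: dx = M^{-1} r, with M ~ A (here, diag(A))
        dx = [M_inv[i] * r[i] for i in range(n)]
        x = [x[i] + dx[i] for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [9.0, 7.0]
M_inv = [1.0 / 4.0, 1.0 / 3.0]   # inverse of diag(A)
x = incremental_solve(A, b, M_inv, [0.0, 0.0])  # converges to (20/11, 19/11)
```

The key property the abstract highlights carries over even in this toy: as the residual goes to zero the iterate satisfies the exact system regardless of how crude M is, which is why the incremental form tolerates ill-conditioning better than iterating on the standard form.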

Taylor, Arthur C., III; Hou, Gene W.

1993-01-01

355

Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard  

SciTech Connect

The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.
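Crash-frequency evaluations of this kind are commonly written as a four-factor product, F = N · P · f(x, y) · A_eff, as in the DOE aircraft crash standard that grew out of this work. The sketch below uses that general form with purely illustrative numbers, not data from this report.

```python
# Four-factor crash-frequency sketch (DOE-STD-3014-style form).
# All inputs are assumed values for illustration.

def crash_frequency(n_ops, p_crash, f_xy, a_eff_km2):
    """Expected crashes per year onto a facility.
    n_ops:     flight operations per year near the site
    p_crash:   crash probability per operation
    f_xy:      crash-location probability density at the site (1/km^2)
    a_eff_km2: effective facility area, including skid/fly-in effects (km^2)
    """
    return n_ops * p_crash * f_xy * a_eff_km2

freq = crash_frequency(n_ops=50_000, p_crash=1.0e-7,
                       f_xy=0.02, a_eff_km2=0.01)
```

The data development team's contribution corresponds to the first three factors (operation counts, per-operation crash rates, and location distributions); the effective area comes from the facility's geometry.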

Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

1996-08-01

356

Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129  

NASA Astrophysics Data System (ADS)

Among the various radioactive nuclides emitted in the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident, Iodine-131 displayed high radioactivity just after the accident. Moreover, if taken into the human body, Iodine-131 concentrates in the thyroid and may cause thyroid cancer. Recognition of the risk of an Iodine-131 dose originated from the experience of the Chernobyl accident, based on epidemiological study [1]. It is thus important to investigate the detailed deposition distribution of I-131 in order to evaluate the radiation dose due to I-131 and monitor the influence on human health. However, I-131 decays so rapidly (half-life = 8.02 d) that it cannot be detected several months after an accident. By the time the risk of I-131 was recognized in the Chernobyl case, several years had passed since the accident. Reconstruction of the I-131 distribution from the Cs-137 distribution was not successful because iodine and cesium behave differently, having different chemical properties. The long-lived radioactive isotope I-129 (half-life = 1.57E+7 yr), which like I-131 is a fission product, is an ideal proxy for I-131 because the two are chemically identical. Several studies tried to quantify I-129 in the 1990s, but the analytical techniques, especially AMS (Accelerator Mass Spectrometry), were not yet well developed and available AMS facilities were limited. Moreover, because of the lack of sufficient data on I-131 just after the accident, the isotopic ratio I-129/I-131 of the Chernobyl-derived iodine could not be estimated precisely [2]; calculated estimates of the ratio were scattered. At the FDNPP accident, on the other hand, the detailed I-131 distribution is being successfully reconstructed through systematic I-129 measurements by our group. 
We measured soil samples selected from a collection taken from every 2 km (or 5 km, in the more distant area) meshed region around FDNPP, conducted by the Japanese Ministry of Science and Education in June 2011. So far more than 500 samples have been measured and their I-129 deposition amounts determined by AMS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. The measurement error from AMS is less than 5%, typically 3%. The overall uncertainty is estimated at less than 30%, including the uncertainty in the nominal value of the standard reference material used, in the I-129/I-131 ratio estimation, and in the "representativeness" of the analyzed sample for its region. The isotopic ratio I-129/I-131 from the reactor was estimated [3] (to be 22.3 ± 6.3 as of March 11, 2011) from a series of samples collected by a group from The University of Tokyo on April 20, 2011, for which I-131 was determined by gamma-ray spectrometry with good precision. Complementarily, we investigated the depth profile in soil of the accident-derived I-129 and its migration speed after deposition, and found that more than 90% of the I-129 was concentrated within the top 5 cm layer and that the downward migration speed was less than 1 cm/yr [4]. From the set of I-129 data, the corresponding I-131 values were calculated, and the distribution map is being constructed. Various fine structures of the distribution have come into sight. [1] Y. Nikiforov and D. R. Gnepp, 1994, Cancer, Vol. 47, pp. 748-766. [2] T. Straume et al., 1996, Health Physics, Vol. 71, pp. 733-740. [3] Y. Miyake, H. Matsuzaki et al., 2012, Geochem. J., Vol. 46, pp. 327-333. [4] M. Honda, H. Matsuzaki et al., under submission.
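The conversion from a measured I-129 inventory back to the I-131 present on the accident date follows directly from the atom ratio and the I-131 decay constant. The sketch below uses the reported ratio 22.3 with an invented I-129 atom count; it ignores the (much slower) I-129 decay, which is negligible on this timescale.

```python
import math

T131_S = 8.02 * 86400.0        # I-131 half-life in seconds (8.02 d)
RATIO_129_131 = 22.3           # atoms I-129 per atom I-131 as of 2011-03-11

def i131_activity_bq(n129_atoms):
    """I-131 activity (Bq) on 2011-03-11 implied by an I-129 atom count.
    A = lambda * N with lambda = ln 2 / T_half."""
    n131 = n129_atoms / RATIO_129_131
    decay_const = math.log(2.0) / T131_S   # 1/s
    return decay_const * n131

# hypothetical measured inventory: 1e12 I-129 atoms in a soil sample
activity = i131_activity_bq(1.0e12)
```

Because I-129 is effectively stable on decadal timescales, each measured I-129 atom count maps linearly to an initial I-131 activity; the dominant uncertainty is the ±6.3 on the isotopic ratio, not the decay arithmetic.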

Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

2014-05-01

357

Thermal-hydraulic analysis for changing feedwater check valve leakage rate testing methodology  

SciTech Connect

The current design and testing requirements for the feedwater check valves (FWCVs) at the Grand Gulf Nuclear Station are established from original licensing requirements that necessitate extremely restrictive air testing with tight allowable leakage limits. As a direct result of these requirements, the original high endurance hard seats in the FWCVs were modified with elastomeric seals to provide a sealing surface capable of meeting the stringent air leakage limits. However, due to the relatively short functional life of the elastomeric seals compared to the hard seats, the overall reliability of the sealing function actually decreased. This degraded performance was exhibited by frequent seal failures and subsequent valve repairs. The original requirements were based on limited analysis and the belief that all of the high energy feedwater vaporized during the LOCA blowdown. These phenomena would have resulted in completely voided feedwater lines and thus a steam environment within the feedwater leak pathway. To challenge these criteria, a comprehensive design basis accident analysis was developed using the RELAP5/MOD3.1 thermal-hydraulic code. Realistic assumptions were used to more accurately model the post-accident fluid conditions within the feedwater system. The results of this analysis demonstrated that no leak path exists through the feedwater lines during the reactor blowdown phase and that sufficient subcooled water remains in various portions of the feedwater piping to form liquid water loop seals that effectively isolate this leak path. These results provided the bases for changing the leak testing requirements of the FWCVs from air to water. The analysis results also established more accurate allowable leakage limits, determined the real effective margins associated with the FWCV safety functions, and led to design changes that improved the overall functional performance of the valves.

Fuller, R.; Harrell, J.

1996-12-01

358

Nuclear accidents  

NSDL National Science Digital Library

Accidents at nuclear power plants can be especially devastating to people and the environment. This article, part of a series about the future of energy, introduces students to nuclear accidents at Chernobyl, Three Mile Island, and Tokaimura. Students explore the incidents by examining possible causes, environmental impacts, and effects on life.

Project, Iowa P.

2004-01-01

359

[Heliogeophysical factors and aviation accidents].  

PubMed

It was shown by two independent methods that there is a certain correlation between the number of aviation accidents and heliogeophysical factors. The statistical and spectral analyses of time series of heliogeomagnetic factors and the number of aviation accidents in 1989-1995 showed that, of 216 accidents, 58% were related to sudden geomagnetic storms. A similar relation was revealed for aviation catastrophes (64% of 86 accidents) and emergencies (54% of 130 accidents) that coincided in time with heliogeomagnetic storms. General periodicities of the series were revealed by the method of spectral analysis, namely, cycles of 30, 42, 46, 64, 74, 83, 99, 115, 143, 169, and 339 days, which confirms the causative relation between the number of aviation accidents and heliogeomagnetic factors. It is assumed that some aviation accidents that coincided in time with geomagnetic storms are due to changes in the professional abilities of pilots who were in the zone of the storms. PMID:9783079
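The periodicity search described above is a standard spectral analysis of a daily event-count series. A minimal sketch of such an analysis using a plain FFT periodogram (the data here are synthetic and illustrative, not the authors' actual series):

```python
import numpy as np

def dominant_periods(daily_counts, top_n=3):
    """Return the top_n periods (in days) with the largest spectral power.

    Uses a plain FFT periodogram on the mean-removed series; a real
    study would also test significance against a noise background.
    """
    x = np.asarray(daily_counts, dtype=float)
    x = x - x.mean()                         # remove the DC component
    power = np.abs(np.fft.rfft(x)) ** 2      # periodogram
    freqs = np.fft.rfftfreq(len(x), d=1.0)   # cycles per day
    order = np.argsort(power[1:])[::-1] + 1  # rank bins, skip zero frequency
    return [1.0 / freqs[i] for i in order[:top_n]]

# Synthetic example: a 30-day cycle buried in noise
rng = np.random.default_rng(0)
t = np.arange(900)
counts = 5 + 2 * np.sin(2 * np.pi * t / 30) + rng.normal(0, 1, t.size)
print(dominant_periods(counts))  # the 30-day cycle should rank first
```

With 900 days of data the 30-day cycle falls exactly on a frequency bin, so it dominates the returned list; shorter or noisier records would need significance testing.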

Komarov, F I; Oraevskii, V N; Sizov, Iu P; Tsirul'nik, L B; Kanonidi, Kh D; Ushakov, I B; Shalimov, P M; Kimlyk, M V; Glukhov, D V

1998-01-01

360

An analysis of thermionic space nuclear reactor power system: I. Effect of disassembling radial reflector, following a reactivity initiated accident  

SciTech Connect

An analysis is performed to determine the effect of disassembling the radial reflector of the TOPAZ-II reactor, following a hypothetical severe Reactivity Initiated Accident (RIA). Such an RIA is assumed to occur during the system start-up in orbit due to a malfunction of the drive mechanism of the control drums, causing the drums to rotate the full 180° outward at their maximum speed of 1.4°/s. Results indicate that disassembling only three of the twelve radial reflector panels would successfully shut down the reactor, with little overheating of the fuel and the moderator.

El-Genk, M.S.; Paramonov, D. (Institute for Space Nuclear Power Studies, Department of Chemical and Nuclear Engineering, The University of New Mexico, Albuquerque, New Mexico 87131 (United States))

1993-01-10

361

PWR (Pressurized Water Reactor) interfacing system LOCAs (loss-of-coolant accidents): Analysis of risk reduction alternatives  

SciTech Connect

This analysis suggests that the most cost-effective method to reduce the risk due to interfacing system loss-of-coolant accidents (ISLs) would be to establish a minimum testing frequency for pressure isolation valves. The suggested minimum frequency would be to perform leak testing of the pressure isolation valves at each refueling and after specific individual valve maintenance. In addition, it appears that the tests could be performed during descent from power without significantly increasing the risk of an ISL event, while effecting considerable cost savings for the utilities.

Bozoki, G.; Kohut, P.; Fitzpatrick, R.

1988-01-01

362

Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents.  

SciTech Connect

This report summarizes analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes and erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests on tubes with machined flaws. A three-dimensional computational fluid dynamics code was used to calculate the jet velocity impinging on neighboring tubes as a function of tube spacing and crack-opening area. Erosion tests were conducted in a high-temperature, high-velocity erosion rig at the University of Cincinnati, using micrometer-sized nickel particles mixed in with high-temperature gas from a burner. The erosion results, together with analytical models, were used to estimate the erosive effects of superheated steam with entrained aerosols from the core during severe accidents.

Majumdar, S.; Diercks, D. R.; Shack, W. J.; Energy Technology

2002-05-01

363

Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis  

PubMed Central

Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
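The abstract compares two approaches to error propagation without stating them; a common first-order approach (an assumption here, not necessarily the one the authors used) combines independent uncertainty components in quadrature:

```python
import math

def combined_uncertainty(u_components):
    """First-order propagation for independent error sources:
    u_total = sqrt(u1^2 + u2^2 + ...)."""
    return math.sqrt(sum(u * u for u in u_components))

def relative_uncertainty(value, u_components):
    """Relative (percent) uncertainty of a measured value."""
    return 100.0 * combined_uncertainty(u_components) / value

# Hypothetical example: a heating value of 18.0 MJ/kg with sampling,
# preparation, and analysis uncertainties of 0.40, 0.20, and 0.10 MJ/kg
u = combined_uncertainty([0.40, 0.20, 0.10])
print(round(u, 3), round(relative_uncertainty(18.0, [0.40, 0.20, 0.10]), 2))
```

Note how the largest component (sampling, in this made-up example) dominates the total, which is why the paper's emphasis on the sampling procedure matters.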

Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

2010-01-01

364

Methodology for the characterization of water quality: Analysis of time-dependent variability  

NASA Astrophysics Data System (ADS)

The general methodology for the characterization of water quality presented here was applied, after elimination of spatial effects, to the analysis of time-dependent variability of physico-chemical parameters measured on eighteen dates during the summer months of 1976 at 112 sampling stations on the Saint Lawrence River between Cornwall and Quebec City. Two aspects of water utilization are considered: domestic water supply and the capacity to sustain balanced aquatic life. The methodology, based on the use and adaptation of classical multivariate statistical methods (correspondence analysis, hierarchical classification), leads, for a given type of water utilization, to the determination of the most important parameters and of their essential interrelations, and shows the relative importance of their variations. Rationalization of network operations is thus obtained through the identification of homogeneous behaviour periods as well as of critical dates for the measurement of parameters characterizing a given use.

Lachance, Marius; Bobée, Bernard

1982-11-01

365

An analysis to determine correlations of freeway traffic accidents with specific geometric design features  

E-print Network

Comparison of Relative Gradient; Comparison of Difference in Elevation Between Ramp and Freeway When Freeway is Upgrade or Downgrade; Comparison of Number of Freeway Lanes, ADT Per Lane, Auxiliary Lane (list of figures, continued). ...lanes were downgrade or which had auxiliary lanes tended to have low accident frequency, as shown in Figures 13 and 17, respectively. Comparisons of two other individual features suggested that there may have been some correlation between them...

Smith, Frank Miller

1960-01-01

366

Improved methodology for integral analysis of advanced reactors employing passive safety  

NASA Astrophysics Data System (ADS)

After four decades of experience with pressurized water reactors, a new generation of nuclear plants is emerging. These advanced designs employ passive safety, which relies on natural forces such as gravity and natural circulation. The new concept of passive safety also necessitates improvement in the computational tools available for best-estimate analyses. The system codes originally designed for high-pressure conditions in the presence of strong momentum sources, such as pumps, are challenged in many ways. Increased interaction of the primary system with the containment necessitates a tool for integral analysis. This study addresses some of these concerns. An improved tool for integral analysis coupling the primary system with the containment calculation is also presented. The code package is based on the RELAP5 and CONTAIN programs: a best-estimate thermal-hydraulics code for primary system analysis and a containment code for containment analysis, respectively. The suitability is demonstrated with a postulated small-break loss-of-coolant accident analysis of the Westinghouse AP600 plant. The thesis explains the details of the analysis, including the coupling model.

Muftuoglu, A. Kursad

367

Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II  

NASA Astrophysics Data System (ADS)

The present part of the publication (Part II) deals with long-range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear-Test-Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131 and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq, emitted during the explosions of units 1, 2 and 3. The estimated total source term is a large fraction of the core inventory of about 8 × 10^18 Bq at the time of reactor shutdown. This result suggests that at least 80% of the core inventory was released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. By neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) could be estimated to be 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10% of the Chernobyl accident releases for I-131 and Cs-137.

Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

2014-03-01

368

Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report  

SciTech Connect

This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)]; and others

1997-06-01

369

Methodology for analysis of detention basins for control of urban runoff quality. Final report  

SciTech Connect

The report describes an analysis methodology and presents graphs and example computations to guide planning-level evaluations and design decisions on two techniques for urban runoff quality control. The control techniques addressed, recharge or infiltration devices, and wet pond detention devices, were shown to be the most consistently effective methods of pollutant reduction of any of the Best Management Practices (BMP) approaches evaluated in the recent Nationwide Urban Runoff Program study.

Driscoll, E.D.; DiToro, D.; Gaboury, D.; Shelley, P.

1986-09-01

370

Methodology for the analysis of pollutant emissions from a city bus  

NASA Astrophysics Data System (ADS)

In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As test circuit, a passenger transportation line at a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.

Armas, Octavio; Lapuerta, Magín; Mata, Carmen

2012-04-01

371

[Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems]  

NASA Technical Reports Server (NTRS)

The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; in joint work with the PI and M. Oberguggenberger, it has been extended to apply to ordinary differential systems of the type encountered in control. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u); and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation (PDE) system of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as discretizations of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

Hermann, Robert

1997-01-01

372

Socio-economic Value Analysis in Geospatial and Earth Observation: A methodology review (Invited)  

NASA Astrophysics Data System (ADS)

Many industries have long since realised that applying macro-economic analysis methodologies to assess the socio-economic value of a programme is a critical step in convincing decision makers to authorise investment. The geospatial and earth observation industry has, however, been slow to embrace economic analysis. There is nonetheless a growing number of studies, published in the last few years, that have applied economic principles to this domain. They have adopted a variety of approaches, including:

- Computable General Equilibrium (CGE) modelling
- Revealed preference and stated preference (willingness-to-pay surveys)
- Partial analysis
- Simulations
- Cost-benefit analysis (with and without risk analysis)

This paper will critically review these approaches and assess their applicability to different situations and to meeting multiple objectives.
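Of the approaches listed in this abstract, cost-benefit analysis is the most mechanical; a minimal sketch of the underlying net-present-value and benefit-cost-ratio arithmetic (the cash flows and discount rate below are hypothetical, not drawn from the study):

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows; year 0 is undiscounted."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost_ratio(rate, benefits, costs):
    """Discounted benefits divided by discounted costs."""
    return npv(rate, benefits) / npv(rate, costs)

# Hypothetical Earth-observation programme: 100 upfront plus 5/year to run,
# yielding 30/year in benefits for five years, at a 5% discount rate
costs = [100, 5, 5, 5, 5, 5]
benefits = [0, 30, 30, 30, 30, 30]
rate = 0.05
net = npv(rate, [b - c for b, c in zip(benefits, costs)])
print(round(net, 2), round(benefit_cost_ratio(rate, benefits, costs), 2))
```

A positive NPV and a benefit-cost ratio above 1 are the usual decision thresholds; risk analysis, as the abstract notes, would replace these point values with distributions.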

Coote, A. M.; Bernknopf, R.; Smart, A.

2013-12-01

373

Problem description Methodology Research  

E-print Network

Slide outline: Background; Problem description; Methodology; Research; New Results; Updates. Topics include calibration samples, principal component analysis, model building, and three-source parameter sampling.

Wolfe, Patrick J.

374

Problem description Methodology Research  

E-print Network

Slide outline: Background; Problem description; Methodology; Research; New Results; Two concerns. Topics include calibration samples, principal component analysis, model building, and three-source parameter sampling.

Wolfe, Patrick J.

375

Recursive modeling of loss of control in human and organizational processes: a systemic model for accident analysis.  

PubMed

A recursive model of accident investigation is proposed by exploiting earlier work in systems thinking. Safety analysts can understand better the underlying causes of decision or action flaws by probing into the patterns of breakdown in the organization of safety. For this deeper analysis, a cybernetic model of organizational factors and a control model of human processes have been integrated in this article (i.e., the viable system model and the extended control model). The joint VSM-ECOM framework has been applied to a case study to help safety practitioners with the analysis of patterns of breakdown with regard to how operators and organizations manage goal conflicts, monitor work progress, recognize weak signals, align goals across teams, and adapt plans on the fly. The recursive accident representation brings together several organizational issues (e.g., the dilemma of autonomy versus compliance, or the interaction between structure and strategy) and addresses how operators adapt to challenges in their environment by adjusting their modes of functioning and recovery. Finally, it facilitates the transfer of knowledge from diverse incidents and near misses within similar domains of practice. PMID:22664695

Kontogiannis, Tom; Malakis, Stathis

2012-09-01

376

Transient Analysis for Evaluating the Potential Boiling in the High Elevation Emergency Cooling Units of PWR Following a Hypothetical Loss of Coolant Accident (LOCA) and Subsequent Water Hammer Due to Pump Restart  

SciTech Connect

The Generic Letter GL-96-06 issued by the U.S. Nuclear Regulatory Commission (NRC) required the utilities to evaluate the potential for voiding in their Containment Emergency Cooling Units (ECUs) due to a hypothetical Loss Of Coolant Accident (LOCA) or a Main Steam Line Break (MSLB) accompanied by the Loss Of Offsite Power (LOOP). When the offsite power is restored, the Component Cooling Water (CCW) pumps restart causing water hammer to occur due to cavity closure. Recently EPRI (Electric Power Research Institute) performed a research study that recommended a methodology to mitigate the water hammer due to cavity closure. The EPRI methodology allows for the cushioning effects of hot steam and released air, which is not considered in the conventional water column separation analysis. The EPRI study was limited in scope to the evaluation of water hammer only and did not provide any guidance for evaluating the occurrence of boiling and the extent of voiding in the ECU piping. This paper presents a complete methodology based on first principles to evaluate the onset of boiling. Also, presented is a methodology for evaluating the extent of voiding and the water hammer resulting from cavity closure by using an existing generalized computer program that is based on the Method of Characteristics. The EPRI methodology is then used to mitigate the predicted water hammer. Thus it overcomes the inherent complications and difficulties involved in performing hand calculations for water hammer. The heat transfer analysis provides an alternative to the use of very cumbersome modeling in using CFD (computational fluid dynamics) based computer programs. (authors)
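As a first-order sanity check on water hammer magnitudes of the kind analyzed above, the classical Joukowsky relation Δp = ρ·a·Δv bounds the surge from an instantaneous velocity change. This is a textbook estimate with hypothetical input values, not the paper's Method of Characteristics model:

```python
def joukowsky_surge(rho, wave_speed, delta_v):
    """Pressure rise (Pa) for an instantaneous velocity change delta_v (m/s).

    rho: fluid density (kg/m^3); wave_speed: acoustic speed of the
    pipe-fluid system (m/s), typically ~1000-1400 m/s for water in steel pipe.
    """
    return rho * wave_speed * delta_v

# Hypothetical pump-restart case: water at 1000 kg/m^3, a = 1200 m/s,
# cavity closure arresting a 3 m/s column velocity
dp = joukowsky_surge(1000.0, 1200.0, 3.0)
print(dp / 1e5)  # surge expressed in bar
```

The Method of Characteristics refines this bound by tracking wave reflections along the pipe, and the EPRI cushioning effects (steam and released air) reduce it further; the Joukowsky value is the conservative upper envelope.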

Husaini, S. Mahmood; Qashu, Riyad K. [Southern California Edison, P.O. Box 128, San Clemente, CA 92672 (United States)

2004-07-01

377

Source term and radiological consequences of the Chernobyl accident  

SciTech Connect

The objective of this work is to assess the source term and to evaluate the maximum hypothetical individual doses in European countries (including the Soviet Union) from the Chernobyl accident through the analyses of measurements of meteorological data, radiation fields, and airborne and deposited activity in these countries. Applying this information to deduce the source term involves a reversal of the techniques of nuclear accident analysis, which estimate the off-site consequences of postulated accidents. In this study the authors predict the quantities of radionuclides that, if released at Chernobyl and following the calculated trajectories, would explain and unify the observed radiation levels and radionuclide concentrations as measured by European countries and the Soviet Union. The simulation uses the PEAR microcomputer program following the methodology described in Canadian Standards Association standard N288.2. The study was performed before the Soviets published their estimate of the source term and the two results are compared.

Mourad, R.; Snell, V.

1987-01-01

378

Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies  

NASA Astrophysics Data System (ADS)

This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was that of testing the robustness of the assessments and of pointing out possible existing sources of disagreements among the participating stakeholders, thus providing insights for the successive deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling and a two-dimensional policy region analysis proved sufficient for the task.
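A minimal sketch of the Monte Carlo sampling step described above, with a rank-correlation sensitivity measure (a common choice; the paper also used differential analysis, which is not shown). The model and input distributions are hypothetical:

```python
import numpy as np

def monte_carlo_sensitivity(model, dists, n=10_000, seed=0):
    """Sample inputs, propagate them through the model, and rank inputs
    by the absolute rank correlation with the output."""
    rng = np.random.default_rng(seed)
    samples = {name: draw(rng, n) for name, draw in dists.items()}
    y = model(**samples)
    y_rank = np.argsort(np.argsort(y))  # ranks of the output
    sens = {}
    for name, x in samples.items():
        x_rank = np.argsort(np.argsort(x))
        sens[name] = abs(np.corrcoef(x_rank, y_rank)[0, 1])
    return y, dict(sorted(sens.items(), key=lambda kv: -kv[1]))

# Hypothetical remediation cost model: cost = unit_cost * volume + fixed
dists = {
    "unit_cost": lambda rng, n: rng.lognormal(0.0, 0.5, n),
    "volume":    lambda rng, n: rng.uniform(90, 110, n),
    "fixed":     lambda rng, n: rng.normal(10, 1, n),
}
y, sens = monte_carlo_sensitivity(
    lambda unit_cost, volume, fixed: unit_cost * volume + fixed, dists)
print(sens)  # unit_cost should dominate the output variability
```

Sorting inputs by this measure identifies which uncertainties most drive disagreement among stakeholder rankings, which is the role sensitivity analysis plays in the deliberative process described.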

Zio, Enrico; Apostolakis, George E.

1999-03-01

379

New method for the efficient numerical integration of the system of ordinary differential equations from the analysis of loss-of-coolant accidents  

Microsoft Academic Search

From the national topical meeting on mathematical models and computational techniques for analysis of nuclear systems, Ann Arbor, Michigan, USA (8 Apr 1973). In: Mathematical Models and Computational Techniques for Analysis of Nuclear Systems. The simulation of loss-of-coolant accidents with the aid of multinode point models requires the stepwise numerical solution of a system of nonlinear

E. Hofer; W. Werner

1973-01-01

380

A Gap Analysis Methodology for Collecting Crop Genepools: A Case Study with Phaseolus Beans  

PubMed Central

Background: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. Methodology/Principal Findings: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to lack of, or under-representation, in genebanks, 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap “hotspots”, representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to spatial collecting priorities. Conclusions/Significance: Results of the gap analysis method mostly align very well with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including in the analysis predictive threat factors, such as climate change or habitat destruction, or by adding additional prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources. PMID:20976009
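The prioritization described combines sampling, geographic, and environmental gap scores; a minimal sketch of one way such scores might be averaged into collecting-priority classes (the 0-10 scale and the class thresholds below are illustrative assumptions, not the published values):

```python
def priority_class(srs, grs, ers):
    """Average three gap scores (each on a 0-10 scale, higher = bigger gap)
    into a final priority score (fps) and a collecting-priority class."""
    fps = (srs + grs + ers) / 3.0
    if fps >= 7.5:
        label = "high"
    elif fps >= 5.0:
        label = "medium"
    elif fps >= 2.5:
        label = "low"
    else:
        label = "adequately represented"
    return fps, label

# Hypothetical taxon: few genebank accessions (9.0), poorly covered
# geographic range (7.0), moderate environmental coverage (6.0)
print(priority_class(9.0, 7.0, 6.0))
```

The point of a composite score is that a taxon well sampled in one dimension (say, geography) can still rank as a collecting priority if its genebank representation or environmental coverage is poor.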

Ramírez-Villegas, Julián; Khoury, Colin; Jarvis, Andy; Debouck, Daniel Gabriel; Guarino, Luigi

2010-01-01

381

Resolve! Version 2.5: Flammable Gas Accident Analysis Tool Acceptance Test Plan and Test Results  

SciTech Connect

RESOLVE! Version 2.5 is designed to quantify the risk and uncertainty of combustion accidents in double-shell tanks (DSTs) and single-shell tanks (SSTs). The purpose of the acceptance testing is to ensure that all of the options and features of the computer code run; to verify that the calculated results are consistent with each other; and to evaluate the effects of changes to the parameter values on the frequency and consequence trends associated with flammable gas deflagrations or detonations.

LAVENDER, J.C.

2000-10-17

382

SACO-1: a fast-running LMFBR accident-analysis code  

SciTech Connect

SACO is a fast-running computer code that simulates hypothetical accidents in liquid-metal fast breeder reactors to the point of permanent subcriticality or to the initiation of a prompt-critical excursion. In the tradition of the SAS codes, each subassembly is modeled by a representative fuel pin with three distinct axial regions to simulate the blanket and core regions. However, analytic and integral models are used wherever possible to cut down the computing time and storage requirements. The physical models and basic equations are described in detail. Comparisons of SACO results to analogous SAS3D results comprise the qualifications of SACO and are illustrated and discussed.

Mueller, C.J.; Cahalan, J.E.; Vaurio, J.K.

1980-01-01

383

Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report  

SciTech Connect

This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology, and the rationale on which it is based, is presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein.

Gore, B.F.; Huenefeld, J.C.

1987-07-01

384

Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology  

NASA Technical Reports Server (NTRS)

A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

1974-01-01

385

Methodology for CFD Design Analysis of National Launch System Nozzle Manifold  

NASA Technical Reports Server (NTRS)

The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

Haire, Scot L.

1993-01-01

386

Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2  

SciTech Connect

This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision incorporates changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first addendum provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP, and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, X-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
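The Weibull dose-response form recommended for early effects is commonly written as risk = 1 - exp(-ln(2)·(D/D50)^V), which gives 50% risk at the median lethal/effective dose D50. A sketch with illustrative parameter values, not the report's recommended ones:

```python
import math

def weibull_risk(dose_gy, d50_gy, shape):
    """Weibull dose-response: risk = 1 - exp(-ln(2) * (D/D50)**V).

    At D = D50 the risk is 50% by construction. The D50 and shape
    values used below are hypothetical, for illustration only.
    """
    if dose_gy <= 0:
        return 0.0
    hazard = math.log(2) * (dose_gy / d50_gy) ** shape
    return 1.0 - math.exp(-hazard)


# Illustrative: risk at half, at, and at twice a hypothetical D50 of 3 Gy.
risks = [weibull_risk(d, 3.0, 6.0) for d in (1.5, 3.0, 6.0)]
```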

Evans, J.S. [Harvard School of Public Health, Boston, MA (United States); Abrahmson, S. [Wisconsin Univ., Madison, WI (United States); Bender, M.A. [Brookhaven National Lab., Upton, NY (United States); Boecker, B.B.; Scott, B.R. [Inhalation Toxicology Research Inst., Albuquerque, NM (United States); Gilbert, E.S. [Battelle Pacific Northwest Lab., Richland, WA (United States)

1993-10-01

387

Generating Resources Assessment Methodology  

E-print Network

Slides presented to the Power Committee on October 7, 2014 by Gillian Charles and Steve Simmons. The outline covers the generating resources assessment methodology and the associated analysis in the draft Seventh Plan.

388

Three years of the OCRA methodology in Brazil: critical analysis and results.  

PubMed

The authors make a detailed analysis of the introduction of the OCRA methodology in Brazil, which started in August 2008 with the launch of the "OCRA Book" translated into Portuguese. They evaluate the importance of assessing the exposure of the upper limbs to risk from repetitive movements and efforts, according to national and international legislation, demonstrating the interconnection of the OCRA methodology with the Regulating Norms of the Ministry of Labor and Work (NRs - MTE), especially NR-17 and its Application Manual. They discuss the new paradigms of the OCRA method in relation to the classic paradigms of ergonomic knowledge. They indicate the OCRA method as the tool to be used for confirming, or not, the New Previdentiary Epidemiologic Nexus NTEP/FAP. The authors present their conclusions based on the practical results that participants certified in the OCRA methodology achieved when applying it to different work activities across diverse economic segments, showing reduced risk and improved company productivity. PMID:22316774

Ruddy, Facci; Eduardo, Marcatto; Edoardo, Santino

2012-01-01

389

An Analysis of Sediments Collected from Emerson Point and P.R., Using Three Methodologies  

NASA Astrophysics Data System (ADS)

Since the early 1950's, the development of nuclear weapons has introduced radionuclides into the environment, including radiocesium (Cs-137). Researchers have incorporated several methodologies in the identification of these isotopes and their behavior in sediments. Here we discuss three main methods and provide case studies. Studies at Emerson Point in Tampa Bay and several sites in the Caribbean island of Puerto Rico have utilized these methodologies. The analysis of sediment grain size, the detection of radioactivity level and the identification of sediment mineralogy have been used as tools for obtaining specific sediment data and information. Prior to analysis, sediment core samples taken at the study areas were extruded, sliced at 0.5-cm intervals for the top 10 cm and 1-cm intervals thereafter, and weighed. Sediment grain size analysis was done using the Micromeritics Saturn Digisizer 5200 connected to the Saturn Digisizer 2500 software to determine the percentages of sand, silt and clay particles in the sediment samples. X-ray diffraction (XRD) was used to study crystalline structures and the mineralogy of clays. The radioactivity level was determined using Canberra gamma well detectors connected to APEX software, which provides radionuclide-specific energy peaks for the radionuclide of interest. The EV (El Verde surface) sample shows the highest silt value (81.36%) and a lowest clay concentration of 5.8%. Emerson Point shows the highest silt concentration (95.29%) and a lowest clay concentration of 0.41%.

Pyrtle, A.; Berrios, C.; Torres, S.; Ithier, W.; Mayo, M.; Williams, N.; Hernández, R.

2006-12-01

390

Source terms analysis of a maximum release accident for an AGN-201M reactor  

SciTech Connect

The fundamental liability of any nuclear reactor is the possibility of exposing the public and environment to an excessive level of nuclear radiation. In a previous paper, the authors addressed the risk and potential vulnerability assessment of a maximum hypothetical release accident (MHRA) for the AGN-201M reactor at the University of New Mexico. The MHRA is defined as the total release of all radiological effluents from the reactor facility to the environment. A level I probabilistic risk assessment was performed to assess the risk to the public. The type of effluents, total activity, maximum exposure rate, and related health effects associated with an MHRA were analyzed in an attempt to identify the source term and its consequences. The source term was characterized for the worst-case scenario only because the magnitude of the released effluents is deemed ineffectual for any subcategory release.

Brumburgh, G.P.; Heger, A.S. (Univ of New Mexico, Albuquerque (United States))

1991-01-01

391

Analysis of risk reduction methods for interfacing system LOCAs (loss-of-coolant accidents) at PWRs  

SciTech Connect

The Reactor Safety Study (WASH-1400) predicted that Interfacing System Loss-of-Coolant Accidents (ISL) events were significant contributors to risk even though they were calculated to be relatively low frequency events. However, there are substantial uncertainties involved in determining the probability and consequences of the ISL sequences. For example, the assumed valve failure modes, common cause contributions and the location of the break/leak are all uncertain and can significantly influence the predicted risk from ISL events. In order to provide more realistic estimates for the core damage frequencies (CDFs) and a reduction in the magnitude of the uncertainties, a reexamination of ISL scenarios at PWRs has been performed by Brookhaven National Laboratory. The objective of this study was to investigate the vulnerability of pressurized water reactor designs to ISLs and identify any improvements that could significantly reduce the frequency/risk of these events.
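The common-cause contribution to an ISL frequency flagged in this abstract is often treated with a beta-factor model for redundant isolation valves; a sketch with illustrative failure data, not values from the BNL study:

```python
def isl_frequency(demand_rate, p_valve, beta):
    """Annual ISL frequency through two redundant isolation valves in series.

    Beta-factor common-cause model: both valves fail together with
    probability beta*p (shared cause) plus ((1-beta)*p)**2 (independent
    coincident failures). All numbers below are hypothetical.
    """
    p_both = beta * p_valve + ((1.0 - beta) * p_valve) ** 2
    return demand_rate * p_both


# With a 10% beta factor, the common-cause term dominates the
# independent-failure term by more than two orders of magnitude.
freq = isl_frequency(demand_rate=1.0, p_valve=1e-3, beta=0.1)
```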

Bozoki, G.; Kohut, P.; Fitzpatrick, R.

1988-01-01

392

Analysis of sertraline in postmortem fluids and tissues in 11 aviation accident victims.  

PubMed

Sertraline (Zoloft) is a selective serotonin reuptake inhibitor that is a commonly prescribed drug for the treatment of depression, obsessive-compulsive disorder, panic disorder, social anxiety disorder, premenstrual dysphoric disorder and post-traumatic stress disorder. Although the use of sertraline is relatively safe, certain side effects may negatively affect a pilot's performance and become a factor in an aviation accident. The authors' laboratory investigated the distribution of sertraline and its primary metabolite, desmethylsertraline, in various postmortem tissues and fluids obtained from 11 fatal aviation accident cases between 2001 and 2004. Eleven specimen types were analyzed for each case, including blood, urine, vitreous humor, liver, lung, kidney, spleen, muscle, brain, heart and bile. Human specimens were processed utilizing solid-phase extraction, followed by characterization and quantitation employing gas chromatography-mass spectrometry. Whole blood sertraline concentrations obtained from these 11 cases ranged from 0.005 to 0.392 µg/mL. The distribution coefficients of sertraline, expressed as specimen/blood ratio, were as follows: urine, 0.47 ± 0.39 (n = 6); vitreous humor, 0.02 ± 0.01 (n = 4); liver, 74 ± 59 (n = 11); lung, 67 ± 45 (n = 11); kidney, 7.4 ± 5 (n = 11); spleen, 46 ± 45 (n = 10); muscle, 2.1 ± 1.3 (n = 8); brain, 22 ± 14 (n = 10); heart, 9 ± 7 (n = 11); and bile, 36 ± 26 (n = 8). Postmortem distribution coefficients obtained for sertraline had coefficients of variation ranging from 47-99%. This study suggests that sertraline likely undergoes significant postmortem redistribution. PMID:23511306
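The specimen/blood distribution coefficients and coefficients of variation reported in this abstract are paired-ratio statistics; a sketch of the computation with made-up concentrations (not the study's data):

```python
import statistics

def distribution_coefficient(specimen_concs, blood_concs):
    """Mean, SD, and CV% of specimen/blood concentration ratios.

    Each pair is one case: specimen concentration (e.g., liver, ug/g)
    against the matching whole-blood concentration (ug/mL).
    """
    ratios = [s / b for s, b in zip(specimen_concs, blood_concs)]
    mean = statistics.mean(ratios)
    sd = statistics.stdev(ratios)
    cv_percent = 100.0 * sd / mean
    return mean, sd, cv_percent


# Hypothetical three-case example.
mean, sd, cv = distribution_coefficient([2.0, 4.0, 6.0], [1.0, 1.0, 1.0])
```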

Lewis, Russell J; Angier, Mike K; Williamson, Kelly S; Johnson, Robert D

2013-05-01

393

Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices  

SciTech Connect

This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of the European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each member of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)]; and others

1997-06-01

394

Modeling and analysis of core debris recriticality during hypothetical severe accidents in the Advanced Neutron Source Reactor  

SciTech Connect

This paper discusses salient aspects of severe-accident-related recriticality modeling and analysis in the Advanced Neutron Source (ANS) reactor. The development of an analytical capability using the KENO V.A-SCALE system is described including evaluation of suitable nuclear cross-section sets to account for the effects of system geometry, mixture temperature, material dispersion and other thermal-hydraulic conditions. Benchmarking and validation efforts conducted with KENO V.A-SCALE and other neutronic codes against critical experiment data are described. Potential deviations and biases resulting from use of the 16-group Hansen-Roach library are shown. A comprehensive test matrix of calculations to evaluate the threat of a recriticality event in the ANS is described. Strong dependencies on geometry, material constituents, and thermal-hydraulic conditions are described. The introduction of designed mitigative features is described.

Taleyarkhan, R.P.; Kim, S.H.; Slater, C.O.; Moses, D.L.; Simpson, D.B.; Georgevich, V.

1993-05-01

395

DEFORM-4: fuel pin characterization and transient response in the SAS4A accident analysis code system  

SciTech Connect

The DEFORM-4 module is the segment of the SAS4A Accident Analysis Code System that calculates the fuel pin characterization in response to a steady state irradiation history, thereby providing the initial conditions for the transient calculation. The various phenomena considered include fuel porosity migration, fission gas bubble induced swelling, fuel cracking and healing, fission gas release, cladding swelling, and the thermal-mechanical state of the fuel and cladding. In the transient state, the module continues the thermal-mechanical response calculation, including fuel melting and central cavity pressurization, until cladding failure is predicted and one of the failed fuel modules is initiated. Comparisons with experimental data have demonstrated the validity of the modeling approach.

Miles, K.J.; Hill, D.J.

1986-01-01

396

Methodology issues concerning the accuracy of kinematic data collection and analysis using the ariel performance analysis system  

NASA Technical Reports Server (NTRS)

Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body kinematics as well as mechanical system kinematics using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g. force plate data collection) as well as digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to evaluate the accuracy impact due to a single axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.
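The percent-error figures quoted for angular velocity come from comparing finite-difference estimates against a known constant rate; a sketch of that computation on synthetic digitized angles (not the APAS data):

```python
def mean_angular_velocity(angles_rad, dt):
    """Estimate angular velocity (rad/s) from digitized joint angles
    sampled at interval dt, via first-order finite differences."""
    diffs = [(angles_rad[i + 1] - angles_rad[i]) / dt
             for i in range(len(angles_rad) - 1)]
    return sum(diffs) / len(diffs)


def percent_error(measured, true_value):
    """Accuracy metric used to express measured-vs-known deviation."""
    return 100.0 * abs(measured - true_value) / true_value


# Synthetic constant-rate rotation: 0.1 rad per 0.05 s frame -> 2 rad/s.
omega = mean_angular_velocity([0.1 * i for i in range(11)], dt=0.05)
err = percent_error(1.88, 2.0)  # a 1.88 rad/s estimate is 6% low
```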

Wilmington, R. P.; Klute, Glenn K. (editor); Carroll, Amy E. (editor); Stuart, Mark A. (editor); Poliner, Jeff (editor); Rajulu, Sudhakar (editor); Stanush, Julie (editor)

1992-01-01

397

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G  

SciTech Connect

Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
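Aggregating elicited expert distributions is commonly done with a linear opinion pool, a weighted average of the experts' CDFs; this sketch shows that generic scheme, not the project's exact weighting method:

```python
def linear_pool(expert_cdfs, x, weights=None):
    """Combine expert CDFs at point x as a weighted average
    (linear opinion pool); equal weights if none are given."""
    n = len(expert_cdfs)
    if weights is None:
        weights = [1.0 / n] * n
    return sum(w * cdf(x) for w, cdf in zip(weights, expert_cdfs))


# Two hypothetical experts disagree on P(deposition velocity <= x);
# the equal-weight pool splits the difference.
pooled = linear_pool([lambda x: 0.2, lambda x: 0.6], 5.0)  # -> 0.4
```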

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)]; and others

1995-01-01

398

Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor  

NASA Astrophysics Data System (ADS)

The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (coupled neutronics/thermal-hydraulics) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slow, long transients (on a time scale of hours and days) and fast, short transients (on a time scale of minutes and seconds). Limited operational and experimental data are available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. Feedback due to the influence of leakage was taken into account through the development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations lasting several tens of hours.
Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Different aspects of the coupled methodology were investigated, and an efficient kinetics treatment for the PBMR was developed that accounts for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, resulting in the implementation of an improved mapping methodology based on user-defined criteria. The second aspect studied and optimized was the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time-step selection algorithms. Coupled-code convergence was achieved, supplemented by the application of acceleration methods. Finally, the modeling of all feedback phenomena in PBMRs was investigated, and a novel treatment of cross-section dependencies was introduced to improve the representation of cross-section variations. An added benefit was that, in the process of studying and improving the coupled multi-physics methodology, more insight was gained into the physics and dynamics of the PBMR, which will also help to optimize the PBMR design and improve its safety. One unique contribution of the PhD research is the investigation of the importance of correctly representing three-dimensional (3-D) effects in PBMR analysis. The studies performed demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.
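Coupling of the kind this record describes is typically driven by a Picard (fixed-point) iteration between the neutronics and thermal-hydraulics solvers; in this sketch both "solves" are closed-form toy stand-ins with hypothetical feedback coefficients, not the NEM/THERMIX models:

```python
def coupled_fixed_point(p0=100.0, t_in=300.0, tol=1e-8, max_iter=100):
    """Operator-split Picard iteration: a thermal-hydraulics step maps
    power to temperature, a neutronics step maps temperature back to
    power via negative temperature feedback; iterate to convergence.

    Coefficients (0.5 K per unit power, -0.1 power per K) are illustrative.
    """
    power = p0
    for _ in range(max_iter):
        temp = t_in + 0.5 * power              # "TH solve": temperature from power
        new_power = p0 - 0.1 * (temp - t_in)   # "neutronics solve": feedback on power
        if abs(new_power - power) < tol:
            return new_power, temp
        power = new_power
    raise RuntimeError("coupled iteration did not converge")


# Converges to the fixed point P = p0 / (1 + 0.05).
power, temp = coupled_fixed_point()
```

The contraction factor here is 0.05, so the iteration converges in a handful of passes; real coupled transients need the acceleration and time-step coordination the abstract describes.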

Mkhabela, Peter Tshepo

399

Causality analysis in business performance measurement system using system dynamics methodology  

NASA Astrophysics Data System (ADS)

One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map, with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality assumption have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, the Granger causality test, on 45 data points. However, well-established causality models proved insufficient, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations with insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts through 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Both methods found bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among measures. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
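A System Dynamics model of the kind described reduces to stocks integrated over flows; this minimal single-stock sketch (with hypothetical inflow and outflow parameters, not the study's model) shows the simulation mechanics:

```python
def simulate_stock(inflow, outflow_frac, stock0=0.0, dt=0.25, steps=200):
    """Euler-integrate one stock with a constant inflow and an outflow
    proportional to the stock level (e.g., customers gained vs. churn).

    Equilibrium is inflow / outflow_frac; the trajectory approaches it
    exponentially, which is the basic SD feedback behavior.
    """
    stock = stock0
    history = [stock]
    for _ in range(steps):
        stock += (inflow - outflow_frac * stock) * dt
        history.append(stock)
    return history


# Inflow 10/period, 10% outflow per period -> equilibrium stock of 100.
trajectory = simulate_stock(10.0, 0.1)
```

Sensitivity and extreme-condition tests like those in the abstract amount to re-running such a simulation across parameter ranges and checking that the behavior stays plausible.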

Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

2014-07-01

400

Final Report, NERI Project: ''An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model''  

SciTech Connect

The present generation of reactor analysis methods uses few-group nodal diffusion approximations to calculate full-core eigenvalues and power distributions. The cross sections, diffusion coefficients, and discontinuity factors (collectively called "group constants") in the nodal diffusion equations are parameterized as functions of many variables, ranging from the obvious (temperature, boron concentration, etc.) to the more obscure (spectral index, moderator temperature history, etc.). These group constants, and their variations as functions of the many variables, are calculated by assembly-level transport codes. The current methodology has two main weaknesses that this project addressed. The first weakness is the diffusion approximation in the full-core calculation; this can be significantly inaccurate at interfaces between different assemblies. This project used the nodal diffusion framework to implement nodal quasidiffusion equations, which can capture transport effects to an arbitrary degree of accuracy. The second weakness is in the parameterization of the group constants; current models do not always perform well, especially at interfaces between unlike assemblies. The project developed a theoretical foundation for parameterization and homogenization models and used that theory to devise improved models. The new models were extended to tabulate information that the nodal quasidiffusion equations can use to capture transport effects in full-core calculations.

Dmitriy Y. Anistratov; Marvin L. Adams; Todd S. Palmer; Kord S. Smith; Kevin Clarno; Hikaru Hiruta; Razvan Nes

2003-08-04

401

Cardiovascular risk analysis by means of pulse morphology and clustering methodologies.  

PubMed

The purpose of this study was the development of a clustering methodology to deal with arterial pressure waveform (APW) parameters for use in cardiovascular risk assessment. One hundred sixteen subjects were monitored and divided into two groups. The first (23 hypertensive subjects) was analyzed using APW and biochemical parameters, while the remaining 93 healthy subjects were evaluated through APW parameters only. The expectation maximization (EM) and k-means algorithms were used in the cluster analysis, and risk scores commonly used in clinical practice (the Framingham Risk Score (FRS), the Systematic COronary Risk Evaluation (SCORE) project, the Assessing cardiovascular risk using Scottish Intercollegiate Guidelines Network (ASSIGN) score and the PROspective Cardiovascular Münster (PROCAM) score) were selected for cluster risk validation. The clustering risk analysis showed a very significant correlation with ASSIGN (r=0.582, p<0.01) and a significant correlation with FRS (r=0.458, p<0.05). The comparison of the two groups also allowed identification of the cluster with higher cardiovascular risk within the healthy group. These results give new insights for exploring this methodology in future scoring trials. PMID:25023535
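The k-means step of such a clustering methodology can be sketched on a single scalar feature; the data here are synthetic stand-ins for an APW parameter, and a production analysis would use a library EM/k-means implementation on the full parameter set:

```python
import random

def kmeans_1d(points, k, iters=50, seed=1):
    """Plain Lloyd's k-means on scalar features: assign each point to
    its nearest center, then recompute centers as cluster means."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: (p - centers[j]) ** 2)
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)


# Two well-separated synthetic groups recover centers near 1.0 and 10.0.
centers = kmeans_1d([1.0, 1.2, 0.8, 9.8, 10.0, 10.2], k=2)
```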

Almeida, Vânia G; Borba, J; Pereira, H Catarina; Pereira, Tânia; Correia, Carlos; Pêgo, Mariano; Cardoso, João

2014-11-01

402

Review of the role and methodology of high resolution approaches in aroma analysis.  

PubMed

Analysis of the odour complexity in food and beverage products demands high resolution approaches for distinguishing individual aroma-impact compound(s), and for assessing their contribution to the global aroma of a sample. This paper aims to review current applications incorporating different advanced separation methodologies, and their roles in achieving high resolution aroma analysis. This includes prior low resolution gas chromatography-olfactometry (GC-O) with fractionation procedures using chemical manipulation, adsorption chromatography and ion exchange separation. Innovative multidimensional gas chromatography (MDGC) arrangements that are appropriately designed with olfactometry are of specific focus here. The revelation of resolved components using these integrated approaches provides significantly improved knowledge of aroma composition in samples. PMID:25479862

Chin, Sung-Tong; Marriott, Philip J

2015-01-01

403

Methodology for the relative risk assessment in the LDF safety analysis report  

SciTech Connect

This document provides the methodology used for the relative risk assessment performed in the LDF Safety Analysis Report. The safety analysis for a facility of the hazard level of the LDF Complex (Buildings 490L, 492 are low hazard) should be mostly qualitative. This was the approach taken for the LDF risk assessment, where qualitative descriptors were assigned to event consequences and frequencies. The event consequences and frequencies were then combined using a risk matrix to obtain an assessment of the relative risk presented by each event to LDF workers and to the public. The development of the risk matrices is the main subject of this report. The matrices have been applied in the LDF SAR (LLNL, 1997).
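A qualitative risk matrix of the kind described combines consequence and frequency descriptors by table lookup; the bins and rankings below are illustrative examples, not those of the LDF SAR:

```python
# Illustrative 3x3 qualitative risk matrix; descriptors and rankings
# are hypothetical, not the LDF SAR's.
FREQUENCY = ["extremely unlikely", "unlikely", "anticipated"]
CONSEQUENCE = ["low", "moderate", "high"]
RISK = [
    ["low",    "low",    "medium"],  # low consequence
    ["low",    "medium", "high"],    # moderate consequence
    ["medium", "high",   "high"],    # high consequence
]

def relative_risk(consequence, frequency):
    """Look up the qualitative relative-risk ranking for an event
    given its consequence and frequency descriptors."""
    return RISK[CONSEQUENCE.index(consequence)][FREQUENCY.index(frequency)]


# A high-consequence, anticipated event ranks as high relative risk.
ranking = relative_risk("high", "anticipated")
```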

Brereton, S.J.

1997-09-03

404

Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans  

NASA Astrophysics Data System (ADS)

Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.
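Concluding that bubbles "consisted predominantly of nitrogen" comes down to normalizing gas-chromatogram peak areas into a percent composition; this sketch assumes unit response factors (real GC work applies per-gas calibration factors), and the areas are made up:

```python
def composition_percent(peak_areas):
    """Normalize chromatogram peak areas to percent composition.

    peak_areas: {gas_name: peak area}. Assumes unit response factors;
    a calibrated analysis would scale each area by its gas-specific
    response factor before normalizing.
    """
    total = sum(peak_areas.values())
    return {gas: 100.0 * area / total for gas, area in peak_areas.items()}


# Hypothetical bubble sample dominated by nitrogen.
pct = composition_percent({"N2": 80.0, "CO2": 15.0, "H2S": 5.0})
```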

de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; di Guardo, Giovanni; Fernández, Antonio

2011-12-01

405

[Evidence-based practices published in Brazil: identification and analysis of their types and methodological approaches].  

PubMed

This is an integrative review of Brazilian studies on evidence-based practices (EBP) in health, published in ISI/JCR journals in the last 10 years. The aim was to identify the specialty areas that most often conducted these studies, their foci and their methodological approaches. Based on inclusion criteria, 144 studies were selected. The results indicate that most EBP studies addressed childhood and adolescence, infectious diseases, psychiatry/mental health and surgery. The predominant foci were prevention, treatment/rehabilitation, diagnosis and assessment. The most used methods were systematic review with or without meta-analysis, protocol review or synthesis of available evidence, and integrative review. A strong multiprofessional expansion of EBP is found in Brazil, contributing to the search for more selective practices by collecting, recognizing and critically analyzing the produced knowledge. The study also contributes to the analysis of research methods themselves and to identifying new research possibilities. PMID:21710089

Lacerda, Rúbia Aparecida; Nunes, Bruna Kosar; Batista, Arlete de Oliveira; Egry, Emiko Yoshikawa; Graziano, Kazuko Uchikawa; Angelo, Margareth; Merighi, Miriam Aparecida Barbosa; Lopes, Nadir Aparecida; Fonseca, Rosa Maria Godoy Serpa da; Castilho, Valéria

2011-06-01

406

Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans  

PubMed Central

Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

2011-01-01

407

Systems Approaches to Animal Disease Surveillance and Resource Allocation: Methodological Frameworks for Behavioral Analysis  

PubMed Central

While demands for animal disease surveillance systems are growing, there has been little applied research that has examined the interactions between resource allocation, cost-effectiveness, and behavioral considerations of actors throughout the livestock supply chain in a surveillance system context. These interactions are important as feedbacks between surveillance decisions and disease evolution may be modulated by their contextual drivers, influencing the cost-effectiveness of a given surveillance system. This paper identifies a number of key behavioral aspects involved in animal health surveillance systems and reviews some novel methodologies for their analysis. A generic framework for analysis is discussed, with exemplar results provided to demonstrate the utility of such an approach in guiding better disease control and surveillance decisions. PMID:24348922

Rich, Karl M.; Denwood, Matthew J.; Stott, Alistair W.; Mellor, Dominic J.; Reid, Stuart W. J.; Gunn, George J.

2013-01-01

408

Electrical penetration assemblies severe accident testing  

Microsoft Academic Search

The leakage behavior of electrical penetration assemblies (EPAs) is being evaluated by varying the penetration type, manufacturer, and hypothetical temperature and pressure accident profile. Nuclear qualified EPAs were procured from D.G. O'Brien, Westinghouse, and Conax. Severe accident sequence analysis was used to generate the severe accident conditions (SAC) for a large dry PWR, a BWR Mark I drywell, and a

J. D. Keck; F. V. Thome

1986-01-01

409

Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks  

NASA Technical Reports Server (NTRS)

Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks; it represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem?
Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.

Brown, Richard Lee

2008-01-01

410

Shipping container response to severe highway and railway accident conditions: Appendices  

SciTech Connect

Volume 2 contains the following appendices: Severe accident data; truck accident data; railroad accident data; highway survey data and bridge column properties; structural analysis; thermal analysis; probability estimation techniques; and benchmarking for computer codes used in impact analysis. (LN)

Fischer, L.E.; Chou, C.K.; Gerhard, M.A.; Kimura, C.Y.; Martin, R.W.; Mensing, R.W.; Mount, M.E.; Witte, M.C.

1987-02-01

411

Health effects models for nuclear power plant accident consequence analysis: Low LET radiation  

SciTech Connect

This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other." The category of "other" cancers is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
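The Weibull dose-response form recommended above for early effects can be sketched numerically. A common parameterization (a sketch, not necessarily the report's exact form) takes the hazard as ln 2 * (D/D50)^V, so that the risk is exactly 50% at the median effective dose D50; the D50 and shape values below are illustrative placeholders, not parameters from the report.

```python
import math

def weibull_risk(dose, d50, shape):
    """Weibull dose-response: probability of an early effect at a given
    dose. Hazard = ln(2) * (dose/d50)**shape, so risk(d50) = 0.5."""
    if dose <= 0:
        return 0.0
    hazard = math.log(2) * (dose / d50) ** shape
    return 1.0 - math.exp(-hazard)

# Hypothetical parameters for a hematopoietic-type endpoint
print(round(weibull_risk(3.0, d50=3.0, shape=5.0), 3))  # 0.5, by construction
```

The steep shape parameter is what makes these curves threshold-like: well below D50 the risk is negligible, and it rises rapidly near D50.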

Evans, J.S. (Harvard Univ., Boston, MA (USA). School of Public Health)

1990-01-01

412

Tourist accidents  

Microsoft Academic Search

Health issues associated with international tourism are now attracting interest from diverse researchers as they examine the interconnections between health and tourism. Despite this new popularity, no mainstream tourism journal has published any substantial research study on this topic to date. This article examines one area of tourist health hitherto neglected in studies of travel medicine: tourist accidents. The paper commences

Stephen J. Page; Denny Meyer

1996-01-01

413

Nuclear accident  

Microsoft Academic Search

A malfunctioning valve at the Three Mile Island power plant in Pennsylvania was the prelude to the worst nuclear accident in U.S. history. Despite assurances that radiation leaked from the plant posed no immediate threat, the population around the plant dwindled as unforced weekend evacuations grew common. Radiation at the power plant site reached 30 mrem/hr on March 30. While

T. Mathews; S. Agrest; G. Borger; M. Lord; W. D. Marbach; W. J. Cook; M. Sheils

1979-01-01

414

Supplemental analysis of accident sequences and source terms for waste treatment and storage operations and related facilities for the US Department of Energy waste management programmatic environmental impact statement  

SciTech Connect

This report presents supplemental information for the document Analysis of Accident Sequences and Source Terms at Waste Treatment, Storage, and Disposal Facilities for Waste Generated by US Department of Energy Waste Management Operations. Additional technical support information is supplied concerning treatment of transuranic waste by incineration and considering the Alternative Organic Treatment option for low-level mixed waste. The latest respirable airborne release fraction values published by the US Department of Energy for use in accident analysis have been used and are included as Appendix D, where respirable airborne release fraction is defined as the fraction of material exposed to accident stresses that could become airborne as a result of the accident. A set of dominant waste treatment processes and accident scenarios was selected for a screening-process analysis. A subset of results (release source terms) from this analysis is presented.
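The release-fraction definition above matches the standard DOE five-factor source-term formulation. The sketch below assumes that formulation (material at risk times damage ratio times airborne release fraction times respirable fraction times leak path factor); the function and all numeric inputs are hypothetical illustrations, not values from this report.

```python
def source_term(mar, dr, arf, rf, lpf):
    """Five-factor respirable source term:
    mar: material at risk (g), dr: damage ratio,
    arf: airborne release fraction, rf: respirable fraction,
    lpf: leak path factor. Returns respirable release (g)."""
    return mar * dr * arf * rf * lpf

# Hypothetical fire-scenario inputs, for illustration only
st = source_term(mar=1000.0, dr=0.5, arf=1e-2, rf=0.1, lpf=0.1)
print(round(st, 4))  # 0.05 g respirable release
```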

Folga, S.; Mueller, C.; Nabelssi, B.; Kohout, E.; Mishima, J.

1996-12-01

415

A Methodology for the Analysis and Selection of Alternatives for the Disposition of Surplus Plutonium  

SciTech Connect

The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports [DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively]. At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling results of detailed technical, economic, schedule, environment, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences.1 It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. 
The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity analysis. These steps are described below.
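Steps (2)-(4) above reduce, at their simplest, to scoring each alternative as a weighted sum of single-attribute values. The sketch below assumes linear value functions and made-up objectives, weights, and performance data; it illustrates the arithmetic only, not the ANRCP team's actual model.

```python
def mau_score(performance, value_fns, weights):
    """Multiattribute utility score: weighted sum of single-attribute
    values. performance maps objectives to raw measures; value_fns map
    raw measures onto a 0-1 value scale; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * value_fns[k](performance[k]) for k in weights)

# Hypothetical objectives with linear value functions
value_fns = {
    "cost": lambda x: 1.0 - x / 10.0,      # $B, lower is better
    "schedule": lambda x: 1.0 - x / 20.0,  # years, lower is better
    "nonproliferation": lambda x: x,       # 0-1 resistance index
}
weights = {"cost": 0.3, "schedule": 0.2, "nonproliferation": 0.5}
alt_a = {"cost": 4.0, "schedule": 10.0, "nonproliferation": 0.8}
alt_b = {"cost": 2.0, "schedule": 16.0, "nonproliferation": 0.6}
print(round(mau_score(alt_a, value_fns, weights), 2),
      round(mau_score(alt_b, value_fns, weights), 2))  # 0.68 0.58
```

Sensitivity analysis (step 4) then amounts to perturbing the weights and value functions and checking whether the ranking of alternatives changes.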

NONE

1999-08-31

416

A probabilistic security risk assessment methodology for quantification of risk to the public  

SciTech Connect

We describe a methodology for obtaining probabilistic risk estimates of deliberate unauthorized acts, integrating estimates of the frequencies of serious plots, the probabilities of avoiding detection and interdiction, the probabilities of successful action, and the consequences of the act. This methodology allows us to compare the risks of deliberate acts with those of accidents and to identify the most cost-effective risk reduction measures through cost-benefit analysis.
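The integration described above can be read, in its simplest form, as a product of the listed factors. The sketch below assumes that multiplicative form; every input value is hypothetical, chosen only to show how a deliberate-act risk and an accident risk end up on the same scale.

```python
def annual_risk(plot_frequency, p_evade, p_success, consequence):
    """Expected annual consequence of a deliberate act:
    frequency of serious plots (per year) * probability of evading
    detection and interdiction * probability of successful action
    * consequence of the act (in consequence units)."""
    return plot_frequency * p_evade * p_success * consequence

# Hypothetical inputs, for illustration only
deliberate = annual_risk(0.01, 0.2, 0.5, 1e4)  # expected consequence/yr
accident = 0.001 * 1e4                          # accident frequency * consequence
print(deliberate > accident)  # True: comparable on a common risk scale
```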

Stephens, D.; Futterman, J.A.; Parziale, A.A.; Randazzo, A.; Warshawsky, A.S.

1996-01-19

417

Genetic screening for reproductive planning: methodological and conceptual issues in policy analysis.  

PubMed Central

OBJECTIVES: This paper explores several critical assumptions and methodological issues arising in cost-effectiveness analyses of genetic screening strategies in the reproductive setting. METHODS: Seven issues that arose in the development of a decision analysis of alternative strategies for cystic fibrosis carrier screening are discussed. Each of these issues required a choice in technique. RESULTS: The presentations of these analyses frequently mask underlying assumptions and methodological choices. Often there is no best choice. In the case of genetic screening in the reproductive setting, these underlying issues often touch on deeply felt human values. CONCLUSIONS: Space limitations for published papers often preclude explaining such choices in detail; yet these decisions determine the way the results should be interpreted. Those who develop these analyses need to make sure that the implications of important assumptions are understood by the clinicians who will use them. At the same time, clinicians need to enhance their understanding of what these models truly mean and how they address underlying clinical, ethical, and economic issues. PMID:8629720

Asch, D A; Hershey, J C; Pauly, M V; Patton, J P; Jedrziewski, M K; Mennuti, M T

1996-01-01

418

A standard methodology for the analysis, recording, and control of verbal behavior  

PubMed Central

Lack of a standard methodology has been one of the major obstacles preventing advancement of behavior analytic research in verbal behavior. This article presents a standard method for the analysis, recording, and control of verbal behavior that overcomes several major methodological problems that have hindered operant research in verbal behavior. The system divides all verbal behavior into four functional response classes (correct, error, no response, and inappropriate behavior), from which all vocal responses of a subject may be classified and consequated. The effects of contingencies of reinforcement on verbal operants within each category are made immediately visible to the researcher as changes in frequency of response. Incorporating frequency of response within each category as the unit of response allows both rate and probability of verbal response to be utilized as basic dependent variables. This method makes it possible to record and consequate verbal behavior in essentially the same way as any other operant response. It may also facilitate an experimental investigation of Skinner's verbal response categories. PMID:22477629
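The recording scheme described above, four response classes with rate of response as the dependent variable, can be sketched as a simple tally. The class names follow the abstract; the session data and function are invented for illustration.

```python
from collections import Counter

# The four functional response classes from the methodology
CLASSES = ("correct", "error", "no_response", "inappropriate")

def session_rates(events, minutes):
    """Tally coded verbal responses by class and convert each count to
    a rate (responses per minute), the basic dependent variable."""
    counts = Counter(e for e in events if e in CLASSES)
    return {c: counts[c] / minutes for c in CLASSES}

# Hypothetical codes from a 10-minute session
events = ["correct"] * 12 + ["error"] * 3 + ["no_response"] * 5
rates = session_rates(events, minutes=10)
print(rates["correct"])  # 1.2 responses per minute
```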

Drash, Philip W.; Tudor, Roger M.

1991-01-01

419

Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures  

NASA Astrophysics Data System (ADS)

A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.
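The maximum tangential stress criterion named above has a standard closed-form kink angle for mixed-mode loading, obtained from the condition K_I sin(theta) + K_II (3 cos(theta) - 1) = 0. The sketch below implements that textbook planar result, not the paper's shell-specific four-parameter extension (which adds the bending stress intensity factors).

```python
import math

def kink_angle(k1, k2):
    """Crack kink angle (radians) from the maximum tangential stress
    criterion: solve k1*sin(t) + k2*(3*cos(t) - 1) = 0 for the angle
    that maximizes the tangential stress. Pure mode I grows straight."""
    if k2 == 0.0:
        return 0.0
    # Closed form via the half-angle substitution t = tan(theta/2)
    return 2.0 * math.atan((k1 - math.sqrt(k1**2 + 8.0 * k2**2)) / (4.0 * k2))

# Pure mode II recovers the classical result of about -70.5 degrees
print(round(math.degrees(kink_angle(0.0, 1.0)), 1))  # -70.5
```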

Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

1994-09-01

420

Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures  

NASA Astrophysics Data System (ADS)

A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modelling strategy. The structural response for each cracked configuration is obtained using a geometrically non-linear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology, and its applicability to performing practical analyses of realistic structures, is demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.

Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

1995-05-01

421

Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets  

PubMed Central

Purpose: With the emergence of clinical outcomes databases as tools routinely utilized within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange among independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
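One common way to identify a threshold value from an ROC sweep, as described above, is to take the cut that maximizes Youden's J statistic (sensitivity + specificity - 1). The sketch below assumes that operating-point rule (the paper may use a different one) and invented dose/outcome data; it omits the contingency-table and distribution tests for brevity.

```python
def youden_threshold(values, labels):
    """ROC-style threshold pick: among candidate cuts, return the one
    maximizing Youden's J = sensitivity + specificity - 1, where labels
    mark cases that showed the outcome (e.g., toxicity)."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y)
        fp = sum(1 for v, y in zip(values, labels) if v >= t and not y)
        j = tp / pos + (neg - fp) / neg - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t

# Hypothetical dose metric with outcomes clustering at higher doses
doses = [5, 8, 12, 18, 21, 25, 30, 34]
toxic = [0, 0, 0, 0, 1, 1, 0, 1]
print(youden_threshold(doses, toxic))  # 21
```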

Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

2013-01-01

422

Vehicle technologies heavy vehicle program: FY 2008 benefits analysis, methodology and results - final report.  

SciTech Connect

This report describes the approach to estimating the benefits, and the analysis results, for the Heavy Vehicle Technologies activities of the Vehicle Technologies (VT) Program of EERE. The scope of the effort includes: (1) characterizing baseline and advanced technology vehicles for Class 3-6 and Class 7 and 8 trucks, (2) identifying technology goals associated with the DOE EERE programs, (3) estimating the market potential of technologies that improve fuel efficiency and/or use alternative fuels, and (4) determining the petroleum and greenhouse gas emissions reductions associated with the advanced technologies. In FY 08 the Heavy Vehicles program continued to address various sources of energy loss, rather than focusing more narrowly on engine efficiency and alternative fuels. These changes are the result of a planning effort that first occurred during FY 04 and was updated in the past year. (Ref. 1) This narrative describes characteristics of the heavy truck market as they relate to the analysis, the analysis methodology (including a discussion of the models used to estimate market potential and benefits), and the benefits estimated as a result of the adoption of the advanced technologies. The market penetrations are used as part of the EERE-wide integrated analysis to provide the final benefit estimates reported in the FY08 Budget Request. The energy savings models are utilized by the VT program for internal project management purposes.

Singh, M.; Energy Systems; TA Engineering

2008-02-29

423

Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology  

SciTech Connect

Since the Code Scaling, Applicability, and Uncertainty (CSAU) methodology was proposed about two decades ago, it has been widely used for new reactor designs and for power uprates of existing LWRs. Despite these successes, CSAU has been criticized as needing further improvement on two main issues: lack of objectiveness and high cost. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the partial differential equations for parameter sensitivities. Moreover, our work shows that time and space steps can be treated as special sensitivity parameters, so that numerical errors can be compared directly with physical uncertainties. When FSA is implemented in a new advanced system analysis code, CSAU could be significantly improved by quantifying numerical errors and by allowing a quantitative PIRT (Q-PIRT) to reduce subjective judgment and improve efficiency. This paper will review the issues related to current CSAU implementations, introduce FSA, show a simple example of performing FSA, and discuss potential improvements to CSAU with FSA. Finally, the general research direction and the requirements to use FSA in an advanced system analysis code will be discussed.

Haihua Zhao; Vincent A. Mousseau

2012-08-01

424

Safety analysis of the IAEA reference research reactor during loss of flow accident using the code MERSAT  

Microsoft Academic Search

Using the thermal hydraulic code MERSAT, a detailed model including the primary and secondary loops was developed for the IAEA's reference research reactor MTR 10MW. The developed model enables the simulation of expected neutronic and thermal hydraulic phenomena during normal operation, reactivity and