Note: This page contains sample records for the topic accident analysis methodology from Science.gov.
While these samples are representative of the content of Science.gov,
they are not comprehensive nor are they the most current set.
We encourage you to perform a real-time search of Science.gov
to obtain the most current and comprehensive results.
Last update: August 15, 2014.
1

Maritime accident investigation methodologies.  

PubMed

Whenever a naval disaster occurs, a public outcry is heard for a full investigation into the causes of the event. Although the maritime industry has an outstanding reputation in accident investigation, such investigations are rarely conducted in inland shipping or leisure craft sailing. Due to a number of serious accidents in the maritime sector and increasing interest from the public and media, the philosophy of independent investigations has gained interest at the policy-making level in the European Union and with international bodies such as the International Maritime Organization (IMO). The purpose of this paper is to discuss the application of this methodology in all segments of shipping. The paper elaborates a conceptual model, principal processes and available techniques as a common orientation to safety-focused investigations. Accident investigation reports of Dutch investigative agencies are benchmarked against this model, assessing the potential of the approach for all segments of shipping. It shows the applicability to minor as well as major accidents and the importance of independence. Systemic deficiencies at all levels of shipping safety are identified and a generic applicability is demonstrated. It is concluded that independent accident investigation provides a powerful diagnostic tool for reducing the peril of drowning. PMID:14664367

Stoop, J A

2003-12-01

2

Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade  

SciTech Connect

The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, "Safety Basis Requirements," requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility either to submit, by April 9, 2001, an existing safety basis that already meets the requirements of Subpart B, or to submit, by April 10, 2003, an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, "Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants," as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

Sharp, G.L.; McCracken, R.T.

2003-05-13

3

Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade  

SciTech Connect

The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility either to submit, by April 9, 2001, an existing safety basis that already meets the requirements of Subpart B, or to submit, by April 10, 2003, an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants,” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

Gregg L. Sharp; R. T. McCracken

2003-06-01

4

A DOE-STD-3009 hazard and accident analysis methodology for non-reactor nuclear facilities  

SciTech Connect

This paper demonstrates the use of appropriate consequence evaluation criteria in conjunction with generic likelihood of occurrence data to produce consistent hazard analysis results for nonreactor nuclear facility Safety Analysis Reports (SAR). An additional objective is to demonstrate the use of generic likelihood of occurrence data as a means for deriving defendable accident sequence frequencies, thereby enabling the screening of potentially incredible events (<10^-6 per year) from the design basis accident envelope. Generic likelihood of occurrence data have been used successfully in performing SAR hazard and accident analyses for two nonreactor nuclear facilities at Sandia National Laboratories. DOE-STD-3009-94 addresses and even encourages use of a qualitative binning technique for deriving and ranking nonreactor nuclear facility risks. However, qualitative techniques invariably lead to reviewer requests for more details associated with consequence or likelihood of occurrence bin assignments in the text of the SAR. Hazard analysis data displayed in simple worksheet format generally elicit questions about not only the assumptions behind the data, but also the quantitative bases for the assumptions themselves (engineering judgment may not be considered sufficient by some reviewers). This is especially true where the criteria for qualitative binning of likelihood of occurrence involve numerical ranges. Oftentimes reviewers want to see calculations or at least a discussion of event frequencies or failure probabilities to support likelihood of occurrence bin assignments. This may become a significant point of contention for events that have been binned as incredible. This paper will show how the use of readily available generic data can avoid many of the reviewer questions that will inevitably arise from strictly qualitative analyses, while not significantly increasing the overall burden on the analyst.

Mahn, Jeffrey A.; Walker, Sharon Ann

2000-03-23
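
The screening step described in the record above, combining generic likelihood-of-occurrence data with a credibility cutoff to remove events below 10^-6 per year from the design basis accident envelope, can be illustrated with a minimal sketch. The event names and all numerical values below are hypothetical placeholders, not data from the paper.

```python
# Minimal sketch: screening accident sequences against a 1e-6/yr credibility
# threshold using generic initiator frequencies and barrier failure probabilities.
# All event names and numbers below are hypothetical placeholders.

CREDIBILITY_CUTOFF = 1.0e-6  # events per year

# Generic likelihood-of-occurrence data: initiator frequency (per year) and
# conditional failure probabilities of credited barriers (per demand).
sequences = {
    "drum breach during handling": {"initiator": 1.0e-2, "barriers": [1.0e-3]},
    "fire with filtration failure": {"initiator": 1.0e-3, "barriers": [1.0e-2, 1.0e-3]},
}

def sequence_frequency(initiator, barriers):
    """Point-estimate sequence frequency: initiator times barrier failure probabilities."""
    freq = initiator
    for p in barriers:
        freq *= p
    return freq

for name, data in sequences.items():
    freq = sequence_frequency(data["initiator"], data["barriers"])
    status = "incredible (screened out)" if freq < CREDIBILITY_CUTOFF else "retained"
    print(f"{name}: {freq:.2e} /yr -> {status}")
```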

5

A Methodology for Probabilistic Accident Management  

SciTech Connect

While techniques have been developed to tackle different tasks in accident management, there have been very few attempts to develop an on-line operator assistance tool for accident management and none that can be found in the literature that uses probabilistic arguments, which are important in today's licensing climate. The state/parameter estimation capability of the dynamic system doctor (DSD) approach is combined with the dynamic event-tree generation capability of the integrated safety assessment (ISA) methodology to address this issue. The DSD uses the cell-to-cell mapping technique for system representation that models the system evolution in terms of probability of transitions in time between sets of user-defined parameter/state variable magnitude intervals (cells) within a user-specified time interval (e.g., data sampling interval). The cell-to-cell transition probabilities are obtained from the given system model. The ISA follows the system dynamics in tree form and branches every time a setpoint for system/operator intervention is exceeded. The combined approach (a) can automatically account for uncertainties in the monitored system state, inputs, and modeling uncertainties through the appropriate choice of the cells, as well as providing a probabilistic measure to rank the likelihood of possible system states in view of these uncertainties; (b) allows flexibility in system representation; (c) yields the lower and upper bounds on the estimated values of state variables/parameters as well as their expected values; and (d) leads to fewer branchings in the dynamic event-tree generation. Using a simple but realistic pressurizer model, the potential use of the DSD-ISA methodology for on-line probabilistic accident management is illustrated.

Munteanu, Ion; Aldemir, Tunc [Ohio State University (United States)]

2003-10-15
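
The cell-to-cell mapping idea in the record above (system evolution expressed as transition probabilities between user-defined magnitude intervals) can be sketched for a single hypothetical state variable. The cell edges and transition matrix below are invented for illustration rather than derived from a real system model as the DSD approach requires.

```python
import numpy as np

# Minimal cell-to-cell mapping sketch: a single state variable (say, a
# normalized pressurizer level) is discretized into cells; a row-stochastic
# matrix gives the probability of moving between cells over one data-sampling
# interval.  Cell edges and transition probabilities are hypothetical.
cell_edges = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
T = np.array([
    [0.8, 0.2, 0.0, 0.0],
    [0.1, 0.7, 0.2, 0.0],
    [0.0, 0.1, 0.7, 0.2],
    [0.0, 0.0, 0.1, 0.9],
])

p = np.array([0.0, 1.0, 0.0, 0.0])      # initial knowledge: state lies in cell 1

for _ in range(3):                      # propagate over three sampling intervals
    p = p @ T

centers = 0.5 * (cell_edges[:-1] + cell_edges[1:])
likely = p >= 0.10                      # cells with non-negligible probability
print("cell probabilities:", p.round(3))
print(f"expected value: {float(p @ centers):.3f}")
print(f"bounds from likely cells: [{cell_edges[:-1][likely].min():.2f}, "
      f"{cell_edges[1:][likely].max():.2f}]")
```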

6

A comprehensive methodology for the fitting of predictive accident models.  

PubMed

Recent years have seen considerable progress in techniques for establishing relationships between accidents, flows and road or junction geometry. It is becoming increasingly recognized that the technique of generalized linear models (GLMs) offers the most appropriate and soundly-based approach for the analysis of these data. These models have been successfully used in the series of major junction accident studies carried out over the last decade by the U.K. Transport Research Laboratory (TRL). This paper describes the form of the TRL studies and the model-fitting procedures used, and gives examples of the models which have been developed. The paper also describes various technical problems which needed to be addressed in order to ensure that the application of GLMs would produce robust and reliable results. These issues included: the low mean value problem, overdispersion, the disaggregation of data over time, allowing for the presence of a trend over time in accident risk, random errors in the flow estimates, the estimation of prediction uncertainty, correlations between predictions for different accident types, and the combination of model predictions with site observations. Each of these problems has been tackled by extending or modifying the basic GLM methodology. The material described in the paper, then, constitutes a comprehensive methodology for the development of predictive accident models. PMID:8799432

Maher, M J; Summersgill, I

1996-05-01
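
The model form described in the record above, accident frequency related to flows through a generalized linear model, is commonly written as mu = k * Q1^a * Q2^b, which becomes a Poisson GLM with a log link after taking logarithms. The sketch below fits such a model to synthetic junction data; the data, the coefficient values and the software choice (statsmodels) are illustrative assumptions, and the overdispersion and low-mean refinements the paper discusses are not shown.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical junction data: major/minor road flows and observed accident counts.
rng = np.random.default_rng(0)
major = rng.uniform(2000, 20000, size=50)
minor = rng.uniform(200, 5000, size=50)
true_mu = 0.8e-4 * major**0.7 * minor**0.4      # "true" accident frequency, for the demo
counts = rng.poisson(true_mu)

# log(mu) = log k + a*log Q1 + b*log Q2  ->  Poisson GLM with log link
X = sm.add_constant(np.column_stack([np.log(major), np.log(minor)]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.summary())

# Predict accident frequency at a new junction with Q1 = 10000, Q2 = 1000 veh/day.
x_new = np.array([[1.0, np.log(10000.0), np.log(1000.0)]])
print("predicted accidents:", float(fit.predict(x_new)[0]))
```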

7

QUASAR: A methodology for quantification of uncertainties in severe accident source terms  

SciTech Connect

The purpose of the present paper is to describe a methodology which was developed as part of the QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors) program at BNL. QUASAR is a large program which will apply the methodology described in this paper to severe accident sequences in LWRs using the STCP.

Khatib-Rahbar, M.; Park, C.; Pratt, W.T.; Bari, R.A.; Ryder, C.; Marino, G.

1986-01-01

8

Severe accident analysis using dynamic accident progression event trees  

NASA Astrophysics Data System (ADS)

At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce and also can be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In the conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APET developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. Also, the work is focused on treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a pressurized water reactor. The specific plant analyzed is the Zion Nuclear Power Plant, which is a Westinghouse-designed system that has been decommissioned.

Hakobyan, Aram P.
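
The bookkeeping behind the dynamic-event-tree generation described above (branch whenever a condition is met, carry conditional probabilities along each branch, and truncate branches below a probability threshold) can be sketched without any severe accident code. The "simulator", branch labels and probabilities below are hypothetical stand-ins, not the ADAPT/MELCOR coupling itself.

```python
# Minimal dynamic event tree sketch with probability tracking and truncation.
TRUNCATION = 1e-4

def advance_to_next_branch_point(state):
    """Stand-in for running the accident code until a branching setpoint is hit.
    Returns branch options as (label, conditional probability), or None when
    the transient is over.  Purely illustrative."""
    if state["depth"] >= 3:
        return None
    return [("SG tube creep rupture", 0.02), ("no rupture", 0.98)]

def expand(state, history, prob, scenarios):
    options = advance_to_next_branch_point(state)
    if options is None:                       # end of transient: record the scenario
        scenarios.append((history, prob))
        return
    for label, p in options:
        branch_prob = prob * p
        if branch_prob < TRUNCATION:          # prune branches too unlikely to track
            continue
        expand({"depth": state["depth"] + 1}, history + [label], branch_prob, scenarios)

scenarios = []
expand({"depth": 0}, [], 1.0, scenarios)
for history, prob in scenarios:
    print(f"{prob:.3e}  " + " -> ".join(history))
print("total retained probability:", round(sum(p for _, p in scenarios), 6))
```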

9

Applying STAMP in Accident Analysis  

NASA Technical Reports Server (NTRS)

Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

2003-01-01

10

Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors (QUASAR). Part 1. Methodology and Program Plan.  

National Technical Information Service (NTIS)

The methodological framework and program plan for systematic quantification and propagation of uncertainties in radiological source terms for light water reactors are presented. The QUASAR methodology is based on detailed sensitivity analysis of the Sourc...

C. Park; M. Khatib-Rahbar

1986-01-01

11

A comprehensive methodology for the fitting of predictive accident models  

Microsoft Academic Search

Recent years have seen considerable progress in techniques for establishing relationships between accidents, flows and road or junction geometry. It is becoming increasingly recognized that the technique of generalized linear models (GLMs) offers the most appropriate and soundly-based approach for the analysis of these data. These models have been successfully used in the series of major junction accident studies carried

Michael J. Maher; Ian Summersgill

1996-01-01

12

Deterministic accident analysis for RBMK  

Microsoft Academic Search

Within the framework of a European Commission sponsored activity, an assessment of the deterministic safety technology of the ‘post-Chernobyl modernized’ Reactor Bolshoy Moshchnosty Kipyashiy (RBMK) has been completed. The accident analysis, limited to the area of Design Basis Accident, constituted the key subject for the study; events not involving the primary circuit were not considered, as well as events originated

F. D’Auria; B. Gabaraev; S. Soloviev; O. Novoselsky; A. Moskalev; E. Uspuras; G. M. Galassi; C. Parisi; A. Petrov; V. Radkevich; L. Parafilo; D. Kryuchkov

2008-01-01

13

Wet Weather Highway Accident Analysis and Skid Resistance Data Management System. Volume 2. User's Manual.  

National Technical Information Service (NTIS)

The objectives and scope of the research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid resis...

R. C. McIlhenny; K. S. Lee; Y. S. Chen

1992-01-01

14

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

1929-01-01

15

Decision-problem state analysis methodology  

NASA Technical Reports Server (NTRS)

A methodology for analyzing a decision-problem state is presented. The methodology is based on the analysis of an incident in terms of the set of decision-problem conditions encountered. By decomposing the events that preceded an unwanted outcome, such as an accident, into the set of decision-problem conditions that were resolved, a more comprehensive understanding is possible. Not all human-error accidents are caused by faulty decision-problem resolutions, but this appears to be one of the major areas of accidents cited in the literature. A three-phase methodology is presented which accommodates a wide spectrum of events. It allows for a systems content analysis of the available data to establish: (1) the resolutions made, (2) alternatives not considered, (3) resolutions missed, and (4) possible conditions not considered. The product is a map of the decision-problem conditions that were encountered as well as a projected, assumed set of conditions that should have been considered. The application of this methodology introduces a systematic approach to decomposing the events that transpired prior to the accident. The initial emphasis is on decision and problem resolution. The technique allows for a standardized method of decomposing an accident into a scenario which may be used for review or the development of a training simulation.

Dieterly, D. L.

1980-01-01

16

Criticality accident detector coverage analysis using the Monte Carlo Method.  

National Technical Information Service (NTIS)

As a result of the need for a more accurate computational methodology, the Los Alamos-developed Monte Carlo code MCNP is used to show the implementation of a more advanced and accurate methodology in criticality accident detector analysis. This paper will...

J. F. Zino; K. C. Okafor

1993-01-01

17

Single pilot IFR accident data analysis  

NASA Technical Reports Server (NTRS)

The aircraft accident data recorded by the National Transportation Safety Board (NTSB) for 1964-1979 were analyzed to determine what problems exist in the general aviation (GA) single pilot instrument flight rules (SPIFR) environment. A previous study conducted in 1978 for the years 1964-1975 provided a basis for comparison. This effort was generally limited to SPIFR pilot error landing phase accidents but includes some SPIFR takeoff and enroute accident analysis as well as some dual pilot IFR accident analysis for comparison. Analysis was performed for 554 accidents of which 39% (216) occurred during the years 1976-1979.

Harris, D. F.

1983-01-01

18

ARAMIS project: a comprehensive methodology for the identification of reference accident scenarios in process industries.  

PubMed

In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper aims at presenting the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to the identification of more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called the "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility. PMID:16126337

Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno

2006-03-31

19

Industrial accidents triggered by flood events: analysis of past accidents.  

PubMed

Industrial accidents triggered by natural events (NaTech accidents) are a significant category of industrial accidents. Several specific elements that characterize NaTech events still need to be investigated. In particular, the damage mode of equipment and the specific final scenarios that may take place in NaTech accidents are key elements for the assessment of hazard and risk due to these events. In the present study, data on 272 NaTech events triggered by floods were retrieved from some of the major industrial accident databases. Data on final scenarios highlighted the presence of specific events, such as those due to substances reacting with water, and the importance of scenarios involving consequences for the environment. This is mainly due to the contamination of floodwater with the hazardous substances released. The analysis of process equipment damage modes allowed the identification of the expected release extents due to different water impact types during floods. The results obtained were used to generate substance-specific event trees for the quantitative assessment of the consequences of accidents triggered by floods. PMID:19913354

Cozzani, Valerio; Campedel, Michela; Renni, Elisabetta; Krausmann, Elisabeth

2010-03-15

20

MELCOR analysis of the TMI-2 accident  

SciTech Connect

The MELCOR computer code has been used to analyze the first 174 minutes of the TMI-2 accident. MELCOR is being developed at Sandia National Laboratories for the US Nuclear Regulatory Commission for the purpose of analyzing severe accidents in nuclear power plants. Comparison of the code predictions to the available data shows that MELCOR is capable of modeling the key events of the TMI-2 accident and reasonable agreement with the available data is obtained. In particular, the core degradation and hydrogen generation models agree with best-estimate information available for this phase of the accident. While the code uses simplified modeling, all important characteristics of the reactor system and the accident phenomena could be modeled. This exercise demonstrates that MELCOR is applicable to severe accident analysis. 6 refs., 7 figs., 2 tabs.

Boucheron, E.A.; Kelly, J.E.

1988-01-01

21

MELCOR analysis of the TMI2 accident  

Microsoft Academic Search

This paper describes the analysis of the Three Mile Island-2 (TMI-2) standard problem that was performed with MELCOR. The MELCOR computer code is being developed by Sandia National Laboratories for the Nuclear Regulatory Commission for the purpose of analyzing severe accidents in nuclear power plants. The primary role of MELCOR is to provide realistic predictions of severe accident phenomena and

Boucheron

1990-01-01

22

A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences  

SciTech Connect

This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs.

Budnitz, R.J.; Lambert, H.E.; Apostolakis, G.; and others

1998-04-01

23

A CANDU Severe Accident Analysis  

SciTech Connect

As interest in severe accident studies has increased in recent years, we have developed a set of simple models to analyze severe accidents for CANDU reactors that should be integrated in the EU codes. The CANDU 600 reactor uses natural uranium fuel and heavy water (D2O) as both moderator and coolant, with the moderator and coolant in separate systems. We chose to analyze accident development for a LOCA with simultaneous loss of moderator cooling and loss of the emergency core cooling system (ECCS). This type of accident is likely to modify the reactor geometry and will lead to a severe accident development. When the coolant temperature inside a pressure tube reaches 1000 deg C, contact between the pressure tube and calandria tube occurs and the residual heat is transferred to the moderator. Due to the lack of cooling, the moderator eventually begins to boil and is expelled, through the calandria vessel relief ducts, into the containment. The calandria tubes (fuel channels) will therefore be uncovered, then disintegrate and fall to the calandria vessel bottom. After the entire moderator inventory is vaporized and expelled, the debris will heat up and eventually boil. The heat accumulated in the molten debris will be transferred through the calandria vessel wall to the shield tank water, which normally surrounds the calandria vessel. The phenomena described above are modelled, analyzed and compared with the existing data. The results are encouraging. (authors)

Negut, Gheorghe; Catana, Alexandru [Institute for Nuclear Research, 1, Compului Str., Mioveni, PO Box 78, 0300 Pitesti (Romania)]; Prisecaru, Ilie [University Politehnica Bucharest (Romania)]

2006-07-01

24

MELCOR analysis of the TMI-2 accident  

SciTech Connect

This paper describes the analysis of the Three Mile Island-2 (TMI-2) standard problem that was performed with MELCOR. The MELCOR computer code is being developed by Sandia National Laboratories for the Nuclear Regulatory Commission for the purpose of analyzing severe accidents in nuclear power plants. The primary role of MELCOR is to provide realistic predictions of severe accident phenomena and the radiological source term. The analysis of the TMI-2 standard problem allowed for comparison of the model predictions in MELCOR to plant data and to the results of more mechanistic analyses. This exercise was, therefore, valuable for verifying and assessing the models in the code. The major trends in the TMI-2 accident are reasonably well predicted with MELCOR, even with its simplified modeling. Comparison of the calculated and measured results is presented and, based on this comparison, conclusions can be drawn concerning the applicability of MELCOR to severe accident analysis. 5 refs., 10 figs., 3 tabs.

Boucheron, E.A.

1990-01-01

25

Risk Analysis Methodology Survey.  

National Technical Information Service (NTIS)

NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, ...

R. G. Batson

1987-01-01

26

An analysis of aircraft accidents involving fires  

NASA Technical Reports Server (NTRS)

All U. S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

1975-01-01

27

Learning from Accident Analysis: The Dynamics Leading Up to a Rafting Accident.  

ERIC Educational Resources Information Center

Analysis of a case study of a whitewater rafting accident reveals that such accidents tend to result from multiple actions. Many events leading up to such accidents include procedural and process factors, suggesting that hard-skills technical training is an insufficient approach to accident prevention. Contains 26 references. (SAS)

Hovelynck, Johan

1998-01-01

28

Development of a new chemical process-industry accident database to assist in past accident analysis  

Microsoft Academic Search

Past accident analysis (PAA) is one of the most potent and oft-used exercises for gaining insights into the reasons why accidents occur in the chemical process industry (CPI) and the damage they cause. PAA provides invaluable ‘wisdom of hindsight’ with which strategies to prevent accidents or cushion the impact of inevitable accidents can be developed. A number of databases maintain record of

S. M. Tauseef; Tasneem Abbasi; S. A. Abbasi

2011-01-01

29

Analyzing the uncertainty of simulation results in accident reconstruction with Response Surface Methodology.  

PubMed

This paper is focused on the uncertainty of simulation results in accident reconstruction. The Upper and Lower Bound Method (ULM) and the Finite Difference Method (FDM), which can be easily applied in this field, are introduced first; the Response Surface Methodology (RSM) is then introduced into this field as an alternative. In RSM, a sample set is first generated via uniform design; second, experiments are conducted according to the sample set with the help of simulation methods; third, a response surface model is determined through regression analysis; finally, the uncertainty of simulation results can be analyzed using a combination of the response surface model and existing uncertainty analysis methods. How to generate a sample set, how to calculate the range of simulation results and how to analyze parameter sensitivity in RSM are then discussed in detail. Finally, the feasibility of RSM is validated by five cases. Moreover, the applicability of RSM, ULM and FDM in analyzing the uncertainty of simulation results is studied; in the latter two cases ULM and FDM were found to break down while RSM still worked. After an analysis of these five cases and the number of simulation runs required by each method, both advantages and disadvantages of these uncertainty analysis methods are indicated. PMID:21908115

Zou, Tiefang; Cai, Ming; Du, Ronghua; Liu, Jike

2012-03-10
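
The RSM workflow summarized above (sample the uncertain inputs, run the simulation at each sample, fit a response surface by regression, then propagate the input ranges through the cheap surrogate) can be sketched as follows. The toy skid-to-stop "simulation", the quadratic surrogate and the plain grid standing in for a uniform design are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

# Response Surface Methodology sketch for the uncertainty of a reconstruction
# result (impact speed) with respect to two uncertain inputs.
def simulation(drag_factor, skid_length):
    # Toy stand-in for an accident reconstruction solver: speed in m/s.
    return np.sqrt(2.0 * 9.81 * drag_factor * skid_length)

# 1) sample the input ranges (a simple grid stands in for a uniform design)
f_vals = np.linspace(0.6, 0.8, 5)          # tire-road drag factor range
s_vals = np.linspace(18.0, 22.0, 5)        # skid length range, m
F, S = np.meshgrid(f_vals, s_vals)
X = np.column_stack([F.ravel(), S.ravel()])
y = simulation(X[:, 0], X[:, 1])

# 2) fit a quadratic response surface by least squares
def design(x):
    f, s = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(f), f, s, f * s, f**2, s**2])

coef, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# 3) use the cheap surrogate to bound the result over the full input ranges
grid = np.column_stack([g.ravel() for g in
                        np.meshgrid(np.linspace(0.6, 0.8, 101),
                                    np.linspace(18.0, 22.0, 101))])
pred = design(grid) @ coef
print(f"speed range over the input ranges: {pred.min():.2f} .. {pred.max():.2f} m/s")
```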

30

Accident analysis of nuclear power plants  

Microsoft Academic Search

Advanced methods of nuclear power plant accident analysis are aimed at assessing the risk arising from plant operation by use of probability considerations. The recent state of plant safety analysis is dealt with. This analysis includes recording of data gained from nuclear power plant operation and the assessment of malfunctions of components and systems. 27 references.

B. Kunze; H. Eichhorn

1973-01-01

31

Risk analysis methodology survey  

NASA Technical Reports Server (NTRS)

NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

Batson, Robert G.

1987-01-01

32

Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods  

SciTech Connect

The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

2000-08-01

33

Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods  

SciTech Connect

The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

2000-07-31

34

Anthropotechnological analysis of industrial accidents in Brazil.  

PubMed Central

The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory the use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worthwhile if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise).

Binder, M. C.; de Almeida, I. M.; Monteau, M.

1999-01-01

35

Hanford Waste Tank Bump Accident Consequence Analysis  

SciTech Connect

This report provides a new evaluation of the Hanford tank bump accident for incorporation into the Safety Basis. The analysis scope is the safe storage of waste in its current configuration in single-shell and double-shell tanks.

Tomaszewski, T.A.

2003-07-30

36

Analysis shows process industry accident losses rising  

Microsoft Academic Search

An analysis of the 150 largest losses caused by accidents and natural phenomena in the hydrocarbon processing and chemical industries during a period of 30 years ending Jan. 1, 1989, shows that the cost and number of losses is increasing. The catastrophic losses analyzed were used to develop statistical trends from the losses in a data base. The trended data

J. A. Krembs; J. M. Connolly

1990-01-01

37

Accident analysis for US fast burst reactors  

Microsoft Academic Search

In the US fast burst reactor (FBR) community there has been increasing emphasis and scrutiny on safety analysis and understanding of possible accident scenarios. This paper summarizes recent work in these areas that is going on at the different US FBR sites. At this time, all of the FBR facilities have updated or are in the process of updating and refining their

R. Paternoster; M. Flanders; H. Kazi

1994-01-01

38

Accident analysis for US fast burst reactors  

SciTech Connect

In the US fast burst reactor (FBR) community there has been increasing emphasis and scrutiny on safety analysis and understanding of possible accident scenarios. This paper summarizes recent work in these areas that is going on at the different US FBR sites. At this time, all of the FBR facilities have updated, or are in the process of updating and refining, their accident analyses. This effort is driven by two objectives: to obtain a more realistic scenario for emergency response procedures and contingency plans, and to determine compliance with changing regulatory standards.

Paternoster, R. [Los Alamos National Lab., NM (United States)]; Flanders, M. [Army Nuclear Effects Directorate, White Sands, NM (United States)]; Kazi, H. [Army Combat Systems Test Directorate, Aberdeen Proving Ground, MD (United States)]

1994-09-01

39

Uncertainty Analysis of the Accident Effect Area of a BLEVE.  

National Technical Information Service (NTIS)

The report examines the uncertainty analysis of models for environmental impact analysis of large accidents in chemical plants. Because few and unreliable data are available for large accidents, various parameters in the models are uncertain. The aut...

R. Cooke; H. Meeuwissen

1989-01-01

40

Accident patterns for construction-related workers: a cluster analysis  

NASA Astrophysics Data System (ADS)

The construction industry has been identified as one of the most hazardous industries. The risk to construction-related workers is far greater than that in a manufacturing-based industry. However, some steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, k-means clustering methodology is employed in specifying the factors related to different worker types and in identifying the patterns of industrial occupational accidents. Accident reports during the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that the cluster analysis can indicate some patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related workers. The findings provide a direction for more effective inspection strategies and injury prevention programs.

Liao, Chia-Wen; Tyan, Yaw-Yauan

2011-12-01
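
The clustering step described in the record above, grouping encoded accident records to reveal patterns across worker types, might look like the sketch below. The feature encoding, the synthetic records and the choice of four clusters are hypothetical, not the study's actual variables or parameters.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical encoded accident records: [worker age, experience (yr),
# accident type code, injured body part code] -- placeholders for the
# categorical/numeric fields an inspection report would provide.
rng = np.random.default_rng(1)
records = np.column_stack([
    rng.integers(18, 65, 300),        # age
    rng.integers(0, 30, 300),         # years of experience
    rng.integers(0, 6, 300),          # accident type (fall, struck-by, ...)
    rng.integers(0, 8, 300),          # injured body part
]).astype(float)

X = StandardScaler().fit_transform(records)       # put features on a common scale
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Summarize each cluster to look for characteristic accident patterns.
for k in range(4):
    members = records[km.labels_ == k]
    print(f"cluster {k}: n={len(members)}, mean age={members[:, 0].mean():.1f}, "
          f"mean experience={members[:, 1].mean():.1f} yr")
```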

41

Human Factors Review for Severe Accident Sequence Analysis.  

National Technical Information Service (NTIS)

The report describes a human factors research project performed to: (1) support the Severe Accident Sequence Analysis (SASA) program and (2) develop a descriptive model of operator response in accident management. The first goal was accomplished by workin...

P. A. Krois; P. M. Haas; J. J. Manning; R. Bovell

1985-01-01

42

Coupled thermal analysis applied to the study of the rod ejection accident  

SciTech Connect

An advanced methodology for the assessment of fuel-rod thermal margins under RIA conditions has been developed by AREVA NP SAS. With the emergence of RIA analytical criteria, the study of the Rod Ejection Accident (REA) would normally require the analysis of each fuel rod, slice by slice, over the whole core. Up to now the strategy used to overcome this difficulty has been to perform separate analyses of sampled fuel pins with conservative hypotheses for thermal properties and boundary conditions. In the advanced methodology, the evaluation model for the Rod Ejection Accident (REA) integrates the node-average fuel and coolant properties calculation for neutron feedback purposes as well as the peak fuel and coolant time-dependent properties for criteria checking. The calculation grid for peak fuel and coolant properties can be specified from the assembly pitch down to the cell pitch. The comparative analysis of methodologies shows that the coupled methodology reduces the excessive conservatism of the uncoupled approach. (authors)

Gonnet, M. [AREVA NP, TOUR AREVA - 1 Place Jean MILLIER, 92084 Paris La Defense Cedex (France)

2012-07-01

43

A POTENTIAL APPLICATION OF UNCERTAINTY ANALYSIS TO DOE-STD-3009-94 ACCIDENT ANALYSIS  

SciTech Connect

The objective of this paper is to assess proposed transuranic waste accident analysis guidance and recent software improvements in a Windows-OS version of MACCS2 that allows the inputting of parameter uncertainty. With this guidance and code capability, there is the potential to perform a quantitative uncertainty assessment of unmitigated accident releases with respect to the 25 rem Evaluation Guideline (EG) of DOE-STD-3009-94 CN3 (STD-3009). Historically, the classification of safety systems in a U.S. Department of Energy (DOE) nuclear facility's safety basis has involved how subject matter experts qualitatively view uncertainty in the STD-3009 Appendix A accident analysis methodology, specifically, whether consequence uncertainty could be larger than previously evaluated, such that the site-specific accident consequences might challenge the EG. This paper assesses whether a potential uncertainty capability for MACCS2 could provide a stronger technical basis as to when the consequences from a design basis accident (DBA) truly challenge the 25 rem EG.

Palmrose, D E; Yang, J M

2007-05-10
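
The question framed in the record above, whether consequence uncertainty could push an unmitigated release past the 25 rem Evaluation Guideline, is essentially a Monte Carlo propagation of input parameter distributions through a dose calculation. The sketch below uses a crude inhalation-dose scaling and invented distributions in place of MACCS2; every number is a hypothetical placeholder, and the exceedance fraction it prints has no safety-basis meaning.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical input distributions (placeholders, not MACCS2 or STD-3009 values).
source_term = rng.lognormal(mean=np.log(500.0), sigma=1.0, size=N)   # Ci released
chi_over_q = rng.lognormal(mean=np.log(1.0e-3), sigma=1.0, size=N)   # s/m^3 at the receptor
breathing_rate = 3.3e-4                                              # m^3/s
dose_conversion = 5.0e4                                              # rem per Ci inhaled (toy value)

# Toy receptor dose: release * atmospheric dilution * intake * dose conversion.
dose = source_term * chi_over_q * breathing_rate * dose_conversion   # rem

print(f"median dose: {np.median(dose):.1f} rem")
print(f"95th percentile dose: {np.percentile(dose, 95):.1f} rem")
print(f"fraction of samples exceeding the 25 rem EG: {np.mean(dose > 25.0):.3f}")
```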

44

A methodology for generating dynamic accident progression event trees for level-2 PRA  

SciTech Connect

Currently, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time consuming, difficult to reproduce and also can be phenomenologically inconsistent. A software tool (ADAPT) is described for automated APET generation using the concept of dynamic event trees. The tool determines the branching times from a severe accident analysis code based on user specified criteria for branching. It assigns user specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. While the software tool could be applied to any systems analysis code, the MELCOR code is used for this illustration. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a pressurized water reactor. (authors)

Hakobyan, A.; Denning, R.; Aldemir, T. [Ohio State Univ., Nuclear Engineering Program, 650 Ackerman Road, Columbus, OH 43202 (United States)]; Dunagan, S.; Kunsman, D. [Sandia National Laboratory, Albuquerque, NM 87185 (United States)]

2006-07-01

45

[An analysis of industrial accidents in the working field with a particular emphasis on repeated accidents].  

PubMed

The present study is based on an analysis of routinely submitted reports of occupational accidents experienced by the workers of industrial enterprises under the jurisdiction of the Kagoshima Labor Standard Office during the 5-year period 1983 to 1987. Officially notified injuries serious enough to keep employees away from work for at least 4 days were utilized in this study. Data were classified so as to give an observed frequency distribution for workers having any specified number of accidents. Also, the accident rate, an indicator of the risk of accident, was compared among different occupations, between age groups and between the sexes. Results obtained are as follows: 1) For the combined total of 6,324 accident cases for 8 types of occupation (Construction, Transportation, Mining & Quarrying, Forestry, Food manufacture, Lumber & Woodcraft, Manufacturing industry and Other business), the number of those who had at least one accident was 6,098, of which 5,837 were injured only once, 208 twice, 21 three times and 2 four times. When occupation type was fixed, however, the numbers of workers having one, two, three and four accidents were 5,895, 182, 19 and 2, respectively. This suggests that some workers are likely to have experienced repeated accidents in more than one type of occupation. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:2131982

Wakisaka, I; Yanagihashi, T; Tomari, T; Sato, M

1990-03-01
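
The kind of comparison implied in the record above, checking the observed distribution of workers by number of accidents against what chance alone would produce, can be sketched with a zero-truncated Poisson fit (workers with no accidents do not appear in the reports). The counts below are the fixed-occupation figures quoted in the abstract; the choice of a truncated Poisson as the chance model is an illustrative assumption, not the study's stated method.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import poisson

# Observed distribution (occupation type held fixed): workers with exactly k accidents.
observed = {1: 5895, 2: 182, 3: 19, 4: 2}
workers = sum(observed.values())                       # 6,098 injured workers
accidents = sum(k * n for k, n in observed.items())    # 6,324 accidents

# Fit a zero-truncated Poisson: E[X | X > 0] = lambda / (1 - exp(-lambda)).
mean_per_injured = accidents / workers
lam = brentq(lambda L: L / (1 - np.exp(-L)) - mean_per_injured, 1e-6, 5.0)

print(f"fitted lambda = {lam:.3f} accidents per worker over the period")
for k in range(1, 5):
    expected = workers * poisson.pmf(k, lam) / (1 - np.exp(-lam))
    print(f"k={k}: observed {observed[k]:5d}, expected under the chance model {expected:7.1f}")
```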

46

Study of possibility using LANL PSA-methodology for accident probability RBMK researches  

SciTech Connect

The reactor facility probabilistic safety analysis methodologies used at LANL (U.S.) and NIKIET (Russian Federation) are considered. The methodologies are compared in order to reveal their similarities and differences and to determine the possibilities of using the LANL technique for RBMK-type reactor safety analysis. It is found that at the PSA-1 level the methodologies practically do not differ. At LANL, the PHA and HAZOP hazard analysis methods are used for more complete specification of the initiating event list, which can also be useful in performing PSA for RBMK. Exchange of information regarding the methodology for detection of dependent faults and consideration of human factor impacts on reactor safety is reasonable. It is also considered useful to compare analysis results for test problems or PSA fragments using the various computer programs employed at NIKIET and LANL.

Petrin, S.V.; Yuferev, V.Y.; Zlobin, A.M.

1995-12-31

47

TMI-2 accident: core heat-up analysis  

SciTech Connect

This report summarizes the NSAC study of reactor core thermal conditions during the accident at Three Mile Island, Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events, plant data, post-accident measurements, and interpretation or indirect use of instrument responses to accident conditions.

Ardron, K.H.; Cain, D.G.

1981-01-01

48

Analysis of accidents during instrument approaches.  

PubMed

General aviation and air taxi approach phase accidents, which occurred during Visual and Instrument Flight Rules (VFR and IFR, respectively) operations over the last 25 years, were analyzed. The data suggest that there is a 204% higher risk during the approach and landing phase of VFR flights than during similar IFR operations (14.82 vs. 7.27 accidents/100,000 approaches). Alarmingly, the night single pilot IFR (SPIFR) accident rate is almost 8 times the rate of day IFR, 35.43 vs. 4.47 accidents/100,000 approaches, and two and a half times that of day VFR approaches, 35.43 vs. 14.82 accidents/100,000 approaches. Surprisingly, the overall SPIFR accident rates are not much higher than dual-pilot IFR (DPIFR), 7.27 vs. 6.48 accidents/100,000 approaches. The generally static ratio of the statistics for SPIFR/DPIFR accident rates may be accounted for by little or no change in general aviation cockpit technology during the last 25 years, and because IFR operational flight task management training has not kept pace. PMID:1610333

Bennett, C T; Schwirzke, M

1992-04-01

49

NASA's Accident Precursor Analysis Process and the International Space Station  

NASA Technical Reports Server (NTRS)

This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

Groen, Frank; Lutomski, Michael

2010-01-01

50

Analysis of reactivity induced accidents at Pakistan Research Reactor-1  

Microsoft Academic Search

Analysis of reactivity induced accidents in Pakistan Research Reactor-1 (PARR-1), utilizing low enriched uranium (LEU) fuel, has been carried out using the standard computer code PARET. The present core comprises 29 standard and five control fuel elements. Various modes of reactivity insertion have been considered. The events studied include: start-up accident; accidental drop of a fuel element on the core;

I. H Bokhari; M Israr; S Pervez

2002-01-01

51

Comparative Analysis of Biosurveillance Methodologies.  

National Technical Information Service (NTIS)

The purpose of this research is to compare two different biosurveillance methodologies: BioWatch and 'A Hot Idea'. BioWatch is fielded and operating in major US cities today. Air samples are collected on filter paper and analyzed for the presence of harmf...

D. M. Kempisty

2006-01-01

52

Analysis and Research Status of Severe Core Damage Accidents. Report by Severe Core Damage Accident Research and Analysis Task Force.  

National Technical Information Service (NTIS)

The Severe Core Damage Research and Analysis Task Force was established in the Nuclear Safety Research Center, Tokai Research Establishment, JAERI, in May 1982 to make a quantitative analysis of the issues related to the severe core damage accident and als...

1984-01-01

53

Fire-accident analysis code (FIRAC) verification  

SciTech Connect

The FIRAC computer code predicts fire-induced transients in nuclear fuel cycle facility ventilation systems. FIRAC calculates simultaneously the gas-dynamic, material transport, and heat transport transients that occur in any arbitrarily connected network system subjected to a fire. The network system may include ventilation components such as filters, dampers, ducts, and blowers. These components are connected to rooms and corridors to complete the network for moving air through the facility. An experimental ventilation system has been constructed to verify FIRAC and other accident analysis codes. The design emphasizes network system characteristics and includes multiple chambers, ducts, blowers, dampers, and filters. A larger industrial heater and a commercial dust feeder are used to inject thermal energy and aerosol mass. The facility is instrumented to measure volumetric flow rate, temperature, pressure, and aerosol concentration throughout the system. Aerosol release rates and mass accumulation on filters also are measured. We have performed a series of experiments in which a known rate of thermal energy is injected into the system. We then simulated this experiment with the FIRAC code. This paper compares and discusses the gas-dynamic and heat transport data obtained from the ventilation system experiments with those predicted by the FIRAC code. The numerically predicted data generally are within 10% of the experimental data.

Nichols, B.D.; Gregory, W.S.; Fenton, D.L.; Smith, P.R.

1986-01-01

54

A methodology for the quantitative risk assessment of major accidents triggered by seismic events.  

PubMed

A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility of, and finally assess the expected consequences of all the possible scenarios that may follow seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields, with a limited effort, a preliminary quantitative assessment of the contribution of the scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies showed that the scenarios initiated by seismic events may have a relevant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios simultaneously involving several plant units. PMID:17276591

Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

2007-08-17
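
The central quantitative step of the procedure above, combining the frequency of seismic events with equipment fragility curves to obtain damage (and then release) frequencies, can be sketched as follows. The hazard curve, the lognormal fragility parameters and the conditional release probability are hypothetical placeholders, not values from the paper.

```python
import numpy as np
from scipy.stats import norm

# Hazard-times-fragility sketch for a NaTech QRA: annual frequency of exceeding
# each peak ground acceleration (PGA) level combined with a lognormal fragility
# curve for an atmospheric storage tank.  All numbers are hypothetical.
pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])                 # PGA levels, g
exceed_freq = np.array([1e-2, 3e-3, 1e-3, 4e-4, 2e-4, 1e-4])   # events/yr exceeding each PGA

# Frequency of events falling in each PGA interval (last bin keeps the tail).
interval_freq = np.append(-np.diff(exceed_freq), exceed_freq[-1])
pga_mid = np.append(0.5 * (pga[:-1] + pga[1:]), pga[-1])

# Lognormal fragility: P(damage | PGA) with median capacity Am and log-std beta.
Am, beta = 0.45, 0.4
p_damage = norm.cdf(np.log(pga_mid / Am) / beta)

damage_freq = float(np.sum(interval_freq * p_damage))          # expected damage events/yr
print(f"annual frequency of seismic damage to the tank: {damage_freq:.2e} /yr")

# A release scenario frequency then follows by multiplying by the conditional
# probability of loss of containment given damage (hypothetical value of 0.3).
print(f"release scenario frequency: {damage_freq * 0.3:.2e} /yr")
```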

55

Accident analysis of a surveillance camera system through a frequency-based processing  

Microsoft Academic Search

The surveillance camera systems are commonly used in detecting the traffic violation on the roads. A key requirement in most detection systems is the ability for rapid automated analysis to identify the accidents. This paper describes an accident analysis based on frequency information for detecting the accident, identifying the cause of the accident and storing the accident scene. When an

JAEJOON KIM; DAE GYU LEE

2008-01-01

56

An analysis of pilot error-related aircraft accidents  

NASA Technical Reports Server (NTRS)

A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: Critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable. Cluster analysis, which was an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses. Pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

1974-01-01

57

Rat sperm motility analysis: methodologic considerations  

EPA Science Inventory

The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

58

Development of System Response Analysis Method for Rod Ejection Accident of OPR1000 and APR1400 Using RETRAN Code  

Microsoft Academic Search

The Korea Electric Power Research Institute has developed the non-loss-of-coolant accident analysis methodology, KNAP, for the typical Optimized Power Reactor 1000 and Advanced Power Reactor 1400 in Korea. The RETRAN hot spot model (HSM) has been developed to replace the functions of STRIKIN-II code of ABB-CE. To estimate the feasibility of HSM, the typical cases of rod ejection accidents (REA)

Yo-Han KIM; Chang-Kyung SUNG; Chang-Keun YANG

2008-01-01

59

The Methodology of Data Envelopment Analysis.  

ERIC Educational Resources Information Center

The methodology of data envelopment analysis (DEA), a linear programming-based method, is described. Other procedures often used for measuring relative productive efficiency are discussed in relation to DEA, including ratio analysis and multiple regression analysis. The DEA technique is graphically illustrated for only two inputs and one output.…

Sexton, Thomas R.

1986-01-01
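
As a rough illustration of the linear program at the core of DEA, the sketch below solves the input-oriented CCR envelopment problem for a few made-up decision-making units with SciPy; this is a generic textbook formulation, not code from the record.

import numpy as np
from scipy.optimize import linprog

# Hypothetical decision-making units: two inputs and one output each
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])  # inputs
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                      # outputs

def dea_ccr_input(k):
    # Input-oriented CCR efficiency of unit k: minimize theta subject to
    # sum_j lambda_j * x_j <= theta * x_k  and  sum_j lambda_j * y_j >= y_k.
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                    # variables: [theta, lambda_1..lambda_n]
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])  # X^T lambda - theta * x_k <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])   # -Y^T lambda <= -y_k
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.x[0]

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {dea_ccr_input(k):.3f}")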

60

Analysis of Credible Accidents for Argonaut Reactors  

SciTech Connect

Five areas of potential accidents have been evaluated for the Argonaut-UTR reactors. They are: insertion of excess reactivity, catastrophic rearrangement of the core, explosive chemical reaction, graphite fire, and fuel-handling accident. A nuclear excursion resulting from the rapid insertion of the maximum available excess reactivity would produce only 12 MWs, which is insufficient to cause fuel melting even with conservative assumptions. Although precise structural rearrangement of the core would create a potential hazard, it is simply not credible to assume that such an arrangement would result from the forces of an earthquake or other catastrophic event. Even damage to the fuel from falling debris or other objects is unlikely given the normal reactor structure. An explosion from a metal-water reaction could not occur because there is no credible source of sufficient energy to initiate the reaction. A graphite fire could conceivably create some damage to the reactor but not enough to melt any fuel or initiate a metal-water reaction. The only credible accident involving offsite doses was determined to be a fuel-handling accident which, given highly conservative assumptions, would produce a whole-body dose equivalent of 2 rem from noble gas immersion and a lifetime dose equivalent commitment to the thyroid of 43 rem from radioiodines.

Hawley, S. C.; Kathern, R. L.; Robkin, M. A.

1981-04-01

61

Case for integral core-disruptive accident analysis  

SciTech Connect

Integral analysis is an approach used at the Los Alamos National Laboratory to cope with the broad multiplicity of accident paths and complex phenomena that characterize the transition phase of core-disruptive accident progression in a liquid-metal-cooled fast breeder reactor. The approach is based on the combination of a reference calculation, which is intended to represent a band of similar accident paths, and associated system- and separate-effect studies, which are designed to determine the effect of uncertainties. Results are interpreted in the context of a probabilistic framework. The approach was applied successfully in two studies; illustrations from the Clinch River Breeder Reactor licensing assessment are included.

Luck, L.B.; Bell, C.R.

1985-01-01

62

Methodology for Funds Flow Analysis.  

National Technical Information Service (NTIS)

The State Health Planning and Development Agency has recently undertaken a funds flow study. A funds flow analysis provides a state description of past health care expenditures according to the sources of fund (Federal, state, local, private payments, or ...

D. Palm

1979-01-01

63

Systemic accident analysis: examining the gap between research and practice.  

PubMed

The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. PMID:23542136

Underwood, Peter; Waterson, Patrick

2013-06-01

64

Intelligent speed adaptation: accident savings and cost-benefit analysis.  

PubMed

The UK External Vehicle Speed Control (EVSC) project has made a prediction of the accident savings with intelligent speed adaptation (ISA), and estimated the costs and benefits of national implementation. The best prediction of accident reduction was that the fitting on all vehicles of a simple mandatory system, with which it would be impossible for vehicles to exceed the speed limit, would save 20% of injury accidents and 37% of fatal accidents. A more complex version of the mandatory system, including a capability to respond to current network and weather conditions, would result in a reduction of 36% in injury accidents and 59% in fatal accidents. The implementation path recommended by the project would lead to compulsory usage in 2019. The cost-benefit analysis carried out showed that the benefit-cost ratios for this implementation strategy were in a range from 7.9 to 15.4, i.e. the payback for the system could be up to 15 times the cost of implementing and running it. PMID:15784194

Carsten, O M J; Tate, F N

2005-05-01
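
A toy benefit-cost calculation in the spirit of the record above; the 20% and 37% reductions are the headline figures quoted in the abstract, while every accident count and monetary value below is an invented placeholder rather than an EVSC project figure.

# Hypothetical annual national figures (illustrative assumptions only)
injury_accidents_per_year = 200_000
fatal_accidents_per_year = 3_000
cost_per_injury_accident = 25_000.0      # assumed societal cost per injury accident
cost_per_fatal_accident = 1_500_000.0    # assumed societal cost per fatal accident

injury_reduction = 0.20   # simple mandatory ISA, from the abstract
fatal_reduction = 0.37

annual_benefit = (injury_accidents_per_year * injury_reduction * cost_per_injury_accident
                  + fatal_accidents_per_year * fatal_reduction * cost_per_fatal_accident)
annual_cost = 250_000_000.0  # assumed fitting and running cost of the system

print(f"benefit-cost ratio ~ {annual_benefit / annual_cost:.1f}")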

65

Use of Stepwise Methodology in Discriminant Analysis.  

ERIC Educational Resources Information Center

The use of stepwise methodologies has been sharply criticized by several researchers, yet their popularity, especially in educational and psychological research, continues unabated. Stepwise methods have been considered particularly well suited for use in regression and discriminant analyses, but their use in discriminant analysis (predictive…

Whitaker, Jean S.

66

A proposal for a new accident analysis method and its application to a catastrophic railway accident in Japan  

Microsoft Academic Search

In a society of highly advanced technology, a new type of compound accident takes place. In an accident analysis, recent research concerns three aspects: human (M), technology (T) and organisation (O). However, technologies are developing day by day and it is becoming difficult to find causes of the recent compound accidents even with M, T and O analyses. This paper

Yuji Niwa

2009-01-01

67

Soft-Error-Rate-Analysis (SERA) Methodology  

Microsoft Academic Search

We present a soft-error-rate analysis (SERA) methodology for combinational and memory circuits. SERA is based on a modeling and analysis approach that employs a judicious mix of probability theory, circuit simulation, graph theory, and fault simulation. SERA achieves five orders of magnitude speedup over Monte Carlo-based simulation approaches with less than 5% error. Dependence of the soft-error rate (SER) of

Ming Zhang; Naresh R. Shanbhag

2006-01-01

68

Analysis of Selective Parameters Contributing to Road Accidents on Highways for Establishing Suggestive Precautionary Strategies  

Microsoft Academic Search

The intent of this article is to analyze road traffic accident data recorded over a five-year period in Raipur, Chhattisgarh, ascertain the causes of such accidents and suggest precautionary strategies for preventing or controlling them for the benefit of road users. The analysis covers accident data recorded between 2004 and 2008. The causes of these accidents stem from different elements, namely: the

Prema Daigavane; Preeti Bajaj

2009-01-01

69

Core Disruptive Accident Analysis using ASTERIA-FBR  

NASA Astrophysics Data System (ADS)

JNES is developing a core disruptive accident analysis code, ASTERIA-FBR, which tightly couples the thermal-hydraulics and the neutronics to simulate the core behavior during core disruptive accidents of fast breeder reactors (FBRs). ASTERIA-FBR consists of the three-dimensional thermal-hydraulics calculation module: CONCORD, the fuel pin behavior calculation module: FEMAXI-FBR, and the space-time neutronics module: Dynamic-GMVP or PARTISN/RKIN. This paper describes a comparison between characteristics of GMVP and PARTISN and summarizes the challenging issues in applying Dynamic-GMVP to the calculation of an unprotected loss-of-flow (ULOF) event, which is a typical initiator of a core disruptive accident in an FBR. The statistical error included in the calculation results may affect the super-prompt criticality during the ULOF event and thus the amount of released energy.

Ishizu, Tomoko; Endo, Hiroshi; Yamamoto, Toshihisa; Tatewaki, Isao

2014-06-01

70

Cold Vacuum Drying facility design basis accident analysis documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

CROWE, R.D.

2000-08-08
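
Design-basis accident consequence calculations of this kind typically build an airborne source term as a product of release fractions; the sketch below uses the generic five-factor form (MAR x DR x ARF x RF x LPF) with placeholder values that are not taken from the CVDF analysis.

def airborne_source_term(mar_g, dr, arf, rf, lpf):
    # Respirable quantity released to the environment (grams):
    # MAR = material at risk, DR = damage ratio, ARF = airborne release fraction,
    # RF = respirable fraction, LPF = leak path factor.
    return mar_g * dr * arf * rf * lpf

# Illustrative values only
release = airborne_source_term(mar_g=1_000.0, dr=0.1, arf=1e-3, rf=0.5, lpf=0.1)
print(f"respirable release: {release:.3g} g")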

71

INDUSTRIAL/MILITARY ACTIVITY-INITIATED ACCIDENT SCREENING ANALYSIS  

SciTech Connect

Impacts due to nearby installations and operations were determined in the Preliminary MGDS Hazards Analysis (CRWMS M&O 1996) to be potentially applicable to the proposed repository at Yucca Mountain. This determination was conservatively based on limited knowledge of the potential activities ongoing on or off the Nevada Test Site (NTS). It is intended that the Industrial/Military Activity-Initiated Accident Screening Analysis provided herein will meet the requirements of the ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987) in establishing whether this external event can be screened from further consideration or must be included as a design basis event (DBE) in the development of accident scenarios for the Monitored Geologic Repository (MGR). This analysis only considers issues related to preclosure radiological safety. Issues important to waste isolation as related to impact from nearby installations will be covered in the MGR performance assessment.

D.A. Kalinich

1999-09-27

72

Mass Spectrometry Methodology in Lipid Analysis  

PubMed Central

Lipidomics is an emerging field, where the structures, functions and dynamic changes of lipids in cells, tissues or body fluids are investigated. Due to the vital roles of lipids in human physiological and pathological processes, lipidomics is attracting more and more attention. However, because of the diversity and complexity of lipids, lipid analysis is still full of challenges. The recent development of methods for lipid extraction and analysis and the combination with bioinformatics technology greatly push forward the study of lipidomics. Among them, mass spectrometry (MS) is the most important technology for lipid analysis. In this review, the methodology based on MS for lipid analysis is introduced. It is believed that along with the rapid development of MS and its further applications to lipid analysis, more functional lipids will be identified as biomarkers and therapeutic targets and for the study of the mechanisms of disease.

Li, Lin; Han, Juanjuan; Wang, Zhenpeng; Liu, Jian'an; Wei, Jinchao; Xiong, Shaoxiang; Zhao, Zhenwen

2014-01-01

73

Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident  

NASA Technical Reports Server (NTRS)

Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined the following: the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep, and that the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as in the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; also, the captain testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data supports the hypothesis that fatigue was a factor that affected crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that the fatigue had an impact on specific actions involved in the occurrence of the accident.

Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

1994-01-01

74

Methodological approaches to comparing information about bicycle accidents internationally: a case study involving Canada and Germany.  

PubMed

The use of bicycles as a means of healthy and eco-friendly transportation is currently actively promoted in many industrialized countries. However, the number of severe bicycle accidents rose significantly in Germany and Canada in 2011. In order to identify risk factors for bicycle accidents and possible means of prevention, a study was initiated that analyses bicycle accidents from selected regions in both countries. Due to different healthcare systems and regulations, the data must be selected in different ways in each country before it can be analyzed. Data is collected by means of questionnaires in Germany and using hybrid electronic-paper records in Canada. Using this method, all relevant data can be collected in both countries. PMID:23388262

Juhra, Christian; Wieskötter, Britta; Bellwood, Paule; von Below, Ariane; Fyfe, Murray; Salkeld, Sonia; Borycki, Elizabeth; Kushniruk, Andre

2013-01-01

75

Individual plant evaluation uses of MAAP (modular accident analysis program) 3. 0B  

Microsoft Academic Search

The Modular Accident Analysis Program (MAAP) was developed by Fauske and Associates, Inc., for the Industry Degraded Core Rulemaking Program (IDCOR) as the industry tool for analysis of severe reactor accidents. The MAAP 3.0B is the most recent version of the program available for analysis of severe accidents in boiling water reactors (BWRs) and pressurized water reactors (PWRs). The MAAP

W. A. Thomas; G. T. Elicson

1989-01-01

76

MELCOR analysis of the TMI-2 accident.  

National Technical Information Service (NTIS)

This paper describes the analysis of the Three Mile Island-2 (TMI-2) standard problem that was performed with MELCOR. The MELCOR computer code is being developed by Sandia National Laboratories for the Nuclear Regulatory Commission for the purpose of anal...

E. A. Boucheron

1990-01-01

77

RAMS (Risk Analysis - Modular System) methodology  

SciTech Connect

The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

1996-10-01

78

Propulsion system failure analysis - A methodology  

NASA Astrophysics Data System (ADS)

A methodology is advanced for determining failure modes in propulsion systems while minimizing down time and providing reliable analyses. Immediate management actions are listed followed by guidelines detailing the methods for organizing a failure-analysis team, conducting a thorough investigation, and developing short- and long-term corrective actions. Control and visibility are considered two key attributes of the failure investigation, and the initial hardware inspection requires documentation and the prevention of evidence loss. An event time line should be delineated for the analysis, and the materials and data analyses are considered extensions of the time line. Failure scenarios and fabrication histories are also needed for the failure analysis of the propulsion system followed by recurrence control so that the program can be restored.

Biggs, R. E.

1992-07-01

79

Analysis of Construction Quality Accident Causes of Public Buildings Based on Failure Study Theories  

Microsoft Academic Search

In order to study the superficial and deep causes of public work quality accidents, we have collected 27 cases of construction quality accidents of public buildings publicized by the Japanese media. Statistical analysis is conducted by using the cause factor analysis method of failure study. We have discussed the superficial causes of construction quality accidents of public building, and deep

Jinrong Cui

2011-01-01

80

Calculation Notes for Subsurface Leak Resulting in Pool, TWRS FSAR Accident Analysis  

SciTech Connect

This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Subsurface Leaks Resulting in Pool.

Hall, B.W.

1996-09-25

81

Calculation notes for surface leak resulting in pool, TWRS FSAR accident analysis  

SciTech Connect

This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Surface Leaks Resulting in Pool.

Hall, B.W.

1996-09-25

82

Human Error and Commercial Aviation Accidents: An Analysis Using the Human Factors Analysis and Classification System  

Microsoft Academic Search

Objective: The aim of this study was to extend previous examinations of aviation accidents to include specific aircrew, environmental, supervisory, and organizational factors associated with two types of commercial aviation (air carrier and commuter/on-demand) accidents using the Human Factors Analysis and Classification System (HFACS). Background: HFACS is a theoretically based tool for investigating and analyzing human error associated with

Scott Shappell; Cristy Detwiler; Kali Holcomb; Carla Hackworth; Albert Boquet; Douglas A. Wiegmann

2007-01-01

83

Systems biology data analysis methodology in pharmacogenomics  

PubMed Central

Pharmacogenetics aims to elucidate the genetic factors underlying the individual’s response to pharmacotherapy. Coupled with the recent (and ongoing) progress in high-throughput genotyping, sequencing and other genomic technologies, pharmacogenetics is rapidly transforming into pharmacogenomics, while pursuing the primary goals of identifying and studying the genetic contribution to drug therapy response and adverse effects, and existing drug characterization and new drug discovery. Accomplishment of both of these goals hinges on gaining a better understanding of the underlying biological systems; however, reverse-engineering biological system models from the massive datasets generated by the large-scale genetic epidemiology studies presents a formidable data analysis challenge. In this article, we review the recent progress made in developing such data analysis methodology within the paradigm of systems biology research that broadly aims to gain a ‘holistic’, or ‘mechanistic’ understanding of biological systems by attempting to capture the entirety of interactions between the components (genetic and otherwise) of the system.

Rodin, Andrei S; Gogoshin, Grigoriy; Boerwinkle, Eric

2012-01-01

84

Comprehensive Analysis of Two Downburst-Related Aircraft Accidents  

NASA Technical Reports Server (NTRS)

Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components F(sub 1) and F(sub 2), representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F(sub 1) was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy from both an analytical and pilot's standpoint was to hold constant nose-up pitch attitude while operating at maximum engine thrust.

Shen, J.; Parks, E. K.; Bach, R. E.

1996-01-01
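
The wind-shear-induced performance degradation discussed above is commonly expressed through the F-factor; the sketch below shows its two-component decomposition under one common sign convention, with made-up wind values rather than data from either accident.

G = 9.81  # gravitational acceleration, m/s^2

def f_factor(dWx_dt, w_vertical, airspeed):
    # F = F1 + F2, where F1 = (dWx/dt)/g is the horizontal wind gradient term
    # and F2 = -w/V is the vertical wind term (w positive upward, V = airspeed).
    f1 = dWx_dt / G
    f2 = -w_vertical / airspeed
    return f1, f2, f1 + f2

# Illustrative downburst encounter: increasing tailwind rate and a downdraft
f1, f2, f = f_factor(dWx_dt=2.0, w_vertical=-6.0, airspeed=75.0)
print(f"F1 = {f1:.3f}, F2 = {f2:.3f}, F = {f:.3f}")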

85

A grounded theory model for analysis of marine accidents.  

PubMed

The purpose of this paper was to design a conceptual model for analysis of marine accidents. The model is grounded on large amounts of empirical data, i.e. the Swedish Maritime Administration database, which was thoroughly studied. This database contains marine accidents organized by ship and variable. The majority of variables are non-metric and some have never been analyzed because of the large number of values. Summary statistics were employed in the data analysis. In order to develop a conceptual model, the database variables were clustered into eleven main categories or constructs, which were organized according to their properties and connected with the path diagram of relationships. For demonstration purposes, one non-metric and five metric variables were selected, namely fatality, ship's properties (i.e. age, gross register tonnage, and length), number of people on board, and marine accidents. These were analyzed using the structural equation modeling (SEM) approach. The combined prediction power of the 'ship's properties' and 'number of people on board' independent variables accounted for 65% of the variance of the fatality. The model development was largely based on the data contained in the Swedish database. However, as this database shares a number of variables in common with other databases in the region and the world, the model presented in this paper could be applied to other datasets. The model has both theoretical and practical values. Recommendations for improvements in the database are also suggested. PMID:21545894

Mullai, Arben; Paulsson, Ulf

2011-07-01
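
The record above uses structural equation modeling; as a rough stand-in for the prediction step, the sketch below regresses fatalities on ship properties and the number of people on board using synthetic data, purely to illustrate the kind of relationship being estimated (it is not the Swedish database or the SEM model itself).

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(1, 40, n)            # ship age [years]
grt = rng.uniform(100, 50_000, n)      # gross register tonnage
people = rng.integers(2, 300, n)       # number of people on board

# Synthetic outcome loosely tied to the predictors
fatalities = 0.01 * people + 0.02 * age + rng.normal(0, 1, n)

X = np.column_stack([age, grt, people])
model = LinearRegression().fit(X, fatalities)
print("variance explained on synthetic data:", round(model.score(X, fatalities), 3))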

86

Three Dimensional Analysis of 3-Loop PWR RCCA Ejection Accident for High Burnup  

SciTech Connect

The Rod Control Cluster Assembly (RCCA) ejection accident is a Condition IV design basis reactivity insertion event for Pressurized Water Reactors (PWR). The event is historically analyzed using a one-dimensional (1D) neutron kinetic code to meet the current licensing criteria for fuel rod burnup to 62,000 MWD/MTU. The Westinghouse USNRC-approved three-dimensional (3D) analysis methodology is based on the neutron kinetics version of the ANC code (SPNOVA) coupled with Westinghouse's version of the EPRI core thermal-hydraulic code VIPRE-01. The 3D methodology provides a more realistic yet conservative analysis approach to meet anticipated reduction in the licensing fuel enthalpy rise limit for high burnup fuel. A rod ejection analysis using the 3D methodology was recently performed for a Westinghouse 3-loop PWR at an up-rated core power of 3151 MWt with reload cores that allow large flexibility in assembly shuffling and a fuel hot rod burnup to 75,000 MWD/MTU. The analysis considered high enrichment fuel assemblies at the control rod locations as well as bounding rodded depletions in the end of life, zero power and full power conditions. The analysis results demonstrated that the peak fuel enthalpy rise is less than 100 cal/g for the transient initiated at the hot zero power condition. The maximum fuel enthalpy is less than 200 cal/g for the transient initiated from the full power condition. (authors)

Marciulescu, Cristian; Sung, Yixing; Beard, Charles L. [Westinghouse Electric Company, LLC (United States)]

2006-07-01

87

A DISCIPLINED APPROACH TO ACCIDENT ANALYSIS DEVELOPMENT AND CONTROL SELECTION  

SciTech Connect

The development and use of a Safety Input Review Committee (SIRC) process promotes consistent and disciplined Accident Analysis (AA) development to ensure that it accurately reflects facility design and operation; and that the credited controls are effective and implementable. Lessons learned from past efforts were reviewed and factored into the development of this new process. The implementation of the SIRC process has eliminated many of the problems previously encountered during Safety Basis (SB) document development. This process has been subsequently adopted for use by several Savannah River Site (SRS) facilities with similar results and expanded to support other analysis activities.

Ortner, T; Mukesh Gupta, M

2007-04-13

88

Cost analysis methodology: Photovoltaic Manufacturing Technology Project  

SciTech Connect

This report describes work done under Phase 1 of the Photovoltaic Manufacturing Technology (PVMaT) Project. PVMaT is a five-year project to support the translation of research and development in PV technology into the marketplace. PVMaT, conceived as a DOE/industry partnership, seeks to advance PV manufacturing technologies, reduce PV module production costs, increase module performance, and expand US commercial production capacities. Under PVMaT, manufacturers will propose specific manufacturing process improvements that may contribute to the goal of the project, which is to lessen cost and thus hasten entry into the larger scale, grid-connected applications. Phase 1 of the PVMaT project is to identify obstacles and problems associated with manufacturing processes. This report describes the cost analysis methodology required under Phase 1 that will allow subcontractors to be ranked and evaluated during Phase 2.

Whisnant, R.A. (Research Triangle Inst., Research Triangle Park, NC (United States))

1992-09-01

89

An Integrated Accident & Consequence Analysis Approach for Accidental Releases through Multiple Leak Paths  

SciTech Connect

This paper presents a consequence analysis for a postulated fire accident on a building containing plutonium when the resulting outside release is partly through the ventilation/filtration system and partly through other pathways such as building access doorways. When analyzing an accident scenario involving the release of radioactive powders inside a building, various pathways for the release to the outside environment can exist. This study is presented to guide the analyst on how the multiple building leak path factors (combination of filtered and unfiltered releases) can be evaluated in an integrated manner starting with the source term calculation and proceeding through the receptor consequence determination. The analysis is performed in a two-step process. The first step of the analysis is to calculate the leak path factor, which represents the fraction of respirable radioactive powder that is made airborne that leaves the building through the various pathways. The computer code of choice for this determination is MELCOR. The second step is to model the transport and dispersion of powder material released to the atmosphere and to estimate the resulting dose that is received by the downwind receptors of interest. The MACCS computer code is chosen for this part of the analysis. This work can be used as a model for performing analyses for systems similar in nature where releases can propagate to the outside environment via filtered and unfiltered pathways. The methodology provides guidance to analysts outlining the essential steps needed to perform a sound and defensible consequence analysis.

POLIZZI, LM

2004-04-28
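
When a release leaves a building partly through a filtered ventilation path and partly through unfiltered openings, the effective leak path factor can be approximated as a flow-weighted combination; the sketch below uses an assumed flow split and filter efficiency, not values from the MELCOR analysis described in the record.

def combined_lpf(frac_filtered, filter_efficiency, lpf_unfiltered=1.0):
    # Effective leak path factor for a release split between a filtered
    # ventilation path and unfiltered pathways such as doorways.
    lpf_filtered = 1.0 - filter_efficiency
    return frac_filtered * lpf_filtered + (1.0 - frac_filtered) * lpf_unfiltered

# Illustration: 80% of the airborne material exits via HEPA-filtered ventilation
# (assumed 99.9% efficient), 20% via open doorways (unfiltered)
print(f"effective LPF = {combined_lpf(0.8, 0.999):.4f}")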

90

Human Error Analysis of Commercial Aviation Accidents Using the Human Factors Analysis and Classification System (HFACS).  

National Technical Information Service (NTIS)

The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon Reason's (...

D. A. Wiegmann S. A. Shappell

2001-01-01

91

Human Error Analysis of Commercial Aviation Accidents Using the Human Factors Analysis and Classification System (HFACS).  

National Technical Information Service (NTIS)

The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon Reason's ...

D. A. Wiegmann S. A. Shappell

2001-01-01

92

Nuclear criticality safety tools in the Chernobyl-4 accident analysis  

SciTech Connect

The collaboration with the Italian Safety Authority (DISP), started in July 1986, has the aim of studying, from a neutronic point of view, the possible initiator event and the accident dynamics in unit four of the Chernobyl nuclear power plant. This report was produced within the framework of that collaboration. A main condition of the present work was making use of standard calculational methods employed in nuclear criticality safety analysis. This means that the neutron multiplication factor calculation should be made with the modules and the cross-section libraries of the SCALE system or in any case with some KENO IV version and the burnup calculation with the ORIGEN code.

Landeyro, P.A.

1988-01-01

93

Predicting System Accidents with Model Analysis During Hybrid Simulation  

NASA Technical Reports Server (NTRS)

Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

Malin, Jane T.; Fleming, Land D.; Throop, David R.

2002-01-01

94

Statistical analysis of accident severity on rural freeways.  

PubMed

The growing concern about the possible safety-related impacts of Intelligent Transportation Systems (ITS) has focused attention on the need to develop new statistical approaches to predict accident severity. This paper presents a nested logit formulation as a means for determining accident severity given that an accident has occurred. Four levels of severity are considered: (1) property damage only, (2) possible injury, (3) evident injury, and (4) disabling injury or fatality. Using 5-year accident data from a 61 km section of rural interstate in Washington State (which has been selected as an ITS demonstration site), we estimate a nested logit model of accident severity. The estimation results provide valuable evidence on the effect that environmental conditions, highway design, accident type, driver characteristics and vehicle attributes have on accident severity. Our findings show that the nested logit formulation is a promising approach to evaluating the impact that ITS or other safety-related countermeasures may have on accident severities. PMID:8799444

Shankar, V; Mannering, F; Barfield, W

1996-05-01
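
A minimal sketch of how two-level nested logit choice probabilities over severity levels can be computed; the utilities, nesting structure, and scale parameters below are invented for illustration and are not the model estimated in the record.

import math

def nested_logit_probs(utilities, nests, mu):
    # Two-level nested logit: P(i) = P(nest containing i) * P(i | nest), where the
    # nest probability uses the inclusive value I_m = ln(sum_j exp(V_j / mu_m)).
    inclusive = {m: math.log(sum(math.exp(utilities[a] / mu[m]) for a in alts))
                 for m, alts in nests.items()}
    denom = sum(math.exp(mu[m] * inclusive[m]) for m in nests)
    probs = {}
    for m, alts in nests.items():
        p_nest = math.exp(mu[m] * inclusive[m]) / denom
        within = sum(math.exp(utilities[a] / mu[m]) for a in alts)
        for a in alts:
            probs[a] = p_nest * math.exp(utilities[a] / mu[m]) / within
    return probs

# Illustrative utilities for the four severity levels used in the paper
u = {"property_damage_only": 1.0, "possible_injury": 0.3,
     "evident_injury": -0.2, "disabling_or_fatal": -1.0}
nests = {"no_injury": ["property_damage_only"],
         "injury": ["possible_injury", "evident_injury", "disabling_or_fatal"]}
mu = {"no_injury": 1.0, "injury": 0.6}
print(nested_logit_probs(u, nests, mu))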

95

NASA Accident Precursor Analysis Handbook, Version 1.0.  

National Technical Information Service (NTIS)

Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events por...

A. Hall C. Everett F. Groen S. Insley

2011-01-01

96

Cross-database analysis to identify relationships between aircraft accidents and incidents  

NASA Astrophysics Data System (ADS)

Air transportation systems are designed to ensure that aircraft accidents are rare events. To minimize these accidents, factors causing or contributing to accidents must be understood and prevented. Despite many efforts by the aviation safety community to reduce the accidents, accident rates have been stable for decades. One explanation could be that direct and obvious causes that previously occurred with a relatively high frequency have already been addressed over the past few decades. What remains is a much more difficult challenge: identifying less obvious causes, combinations of factors that, when occurring together, may potentially lead to an accident. Contributions of this research to the aviation safety community are two-fold: (1) The analyses conducted in this research identified significant accident factors. Detection and prevention of these factors could prevent potential future accidents. The holistic study made it possible to compare the factors in a common framework. The identified factors were compared and ranked in terms of their likelihood of being involved in accidents. Corrective actions by the FAA Aviation Safety Oversight (ASO), air carrier safety offices, and the aviation safety community in general, could target the high-ranked factors first. The aviation safety community can also use the identified factors as a benchmark to measure and compare safety levels in different periods and/or at different regions. (2) The methodology established in this study can be used by researchers in future studies. By applying this methodology to the safety data, areas prone to future accidents can be detected and addressed. Air carriers can apply this methodology to analyze their proprietary data and find detailed safety factors specific to their operations. The Factor Support Ratio metric introduced in this research can be used to measure and compare different safety factors. (Abstract shortened by UMI.)

Nazeri, Zohreh

97

Offsite radiological consequence analysis for the bounding aircraft crash accident  

SciTech Connect

The purpose of this calculation note is to quantitatively analyze a bounding aircraft crash accident for comparison to the DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', Appendix A, Evaluation Guideline of 25 rem. The potential of aircraft impacting a facility was evaluated using the approach given in DOE-STD-3014-96, ''Accident Analysis for Aircraft Crash into Hazardous Facilities''. The following aircraft crash frequencies were determined for the Tank Farms in RPP-11736, ''Assessment Of Aircraft Crash Frequency For The Hanford Site 200 Area Tank Farms'': (1) The total aircraft crash frequency is ''extremely unlikely.'' (2) The general aviation crash frequency is ''extremely unlikely.'' (3) The helicopter crash frequency is ''beyond extremely unlikely.'' (4) For the Hanford Site 200 Areas, other aircraft type, commercial or military, each above ground facility, and any other type of underground facility is ''beyond extremely unlikely.'' As the potential of aircraft crash into the 200 Area tank farms is more frequent than ''beyond extremely unlikely,'' consequence analysis of the aircraft crash is required.

OBERG, B.D.

2003-03-22

98

Thermohydraulic and Safety Analysis for CARR Under Station Blackout Accident  

SciTech Connect

A thermohydraulic and safety analysis code (TSACC) has been developed using Fortran 90 language to evaluate the transient thermohydraulic behaviors and safety characteristics of the China Advanced Research Reactor (CARR) under Station Blackout Accident (SBA). For the development of TSACC, a series of corresponding mathematical and physical models were considered. A point reactor neutron kinetics model was adopted for solving reactor power. All possible flow and heat transfer conditions under station blackout accident were considered and the optional models were supplied. The usual Finite Difference Method (FDM) was abandoned and a new model was adopted to evaluate the temperature field of core plate type fuel element. A new simple and convenient equation was proposed for the resolution of the transient behaviors of the main pump instead of the complicated four-quadrant model. Gear method and Adams method were adopted alternately for a better solution to the stiff differential equations describing the dynamic behaviors of the CARR. The computational result of TSACC showed an adequate safety margin for CARR under SBA. For the purpose of Verification and Validation (V and V), the simulated results of TSACC were compared with those of RELAP5/MOD3. The V and V result indicated a good agreement between the results by the two codes. Because of the adoption of modular programming techniques, this analysis code is expected to be applied to other reactors by easily modifying the corresponding function modules. (authors)

Wenxi Tian; Suizheng Qiu; Guanghui Su; Dounan Jia [Xi'an Jiaotong University, 28 Xianning Road, Xi'an 710049 (China)]; Xingmin Liu [China Institute of Atomic Energy]

2006-07-01
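
The point reactor kinetics model referenced above leads to stiff ordinary differential equations; the sketch below solves the one-delayed-group equations with SciPy's BDF integrator (an analogue of a Gear-type method), using illustrative constants rather than CARR data.

import numpy as np
from scipy.integrate import solve_ivp

BETA, GEN_TIME, DECAY = 0.0065, 1e-4, 0.08   # delayed fraction, generation time [s], precursor decay [1/s]
RHO = 0.001                                  # assumed step reactivity insertion (below prompt critical)

def kinetics(t, y):
    n, c = y                                 # neutron density and precursor concentration
    dn = (RHO - BETA) / GEN_TIME * n + DECAY * c
    dc = BETA / GEN_TIME * n - DECAY * c
    return [dn, dc]

y0 = [1.0, BETA / (GEN_TIME * DECAY)]        # start from steady-state precursor level
sol = solve_ivp(kinetics, (0.0, 10.0), y0, method="BDF", rtol=1e-8, atol=1e-10)
print("relative power at t = 10 s:", round(sol.y[0, -1], 3))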

99

Statistical analysis of sudden chemical leakage accidents reported in China between 2006 and 2011.  

PubMed

According to data from authoritative sources, 1,400 sudden leakage accidents that occurred in China from 2006 to 2011 were investigated, of which 666 accidents with no or little damage were used to abstract statistical characteristics. The research results were as follows: (1) Time fluctuation: the yearly number of sudden leakage accidents decreased from 2006 to 2010, with a slight increase in 2011. Sudden leakage accidents occur mainly in summer, and more than half of the accidents occur from May to September. (2) Regional distribution: the accidents are highly concentrated in the coastal area, where accidents arise more readily from small and medium-sized enterprises than from larger ones. (3) Pollutants: hazardous chemicals account for up to 95 % of sudden leakage accidents. (4) Steps: transportation represents almost half of the accidents, followed by production, usage, storage, and discard. (5) Pollution and casualties: such accidents readily cause environmental pollution and casualties. (6) Causes: more than half of the cases were caused by human factors, followed by management deficiencies and equipment failure. However, sudden chemical leakage may also be caused by high temperature, rain, wet roads, and terrain. (7) The results of principal component analysis: five factors are extracted by the principal component analysis, including pollution, casualties, regional distribution, steps, and month. Based on the analysis of the accidents, the characteristics, causes, and damages of sudden leakage accidents are investigated, from which advice for prevention and rescue can be derived. PMID:24407779

Li, Yang; Ping, Hua; Ma, Zhi-Hong; Pan, Li-Gang

2014-04-01
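
A minimal sketch of the principal-component step applied to accident attributes, run on synthetic standardized data; the variable names, values, and resulting loadings are illustrative only and differ from those in the record.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Synthetic accident records: month, region code, step code, casualties, pollution score
data = rng.normal(size=(666, 5))

pca = PCA(n_components=5).fit(StandardScaler().fit_transform(data))
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))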

100

Accident Risk Analysis and Model Applications of Railway Level Crossings  

Microsoft Academic Search

In order to reduce property loss and casualties from level crossing accidents, it is crucial to develop effective accident prediction models that are capable of providing information on accident frequency and severity given a vector of covariates. In the present research, a set of statistical count and categorical data models are developed; they are not only able to evaluate

Shou-Ren Hu; Kai-Han Wu

2008-01-01
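
Count models of the kind mentioned above are commonly fitted as Poisson or negative binomial regressions; the sketch below fits a Poisson GLM with statsmodels on synthetic level-crossing data, with all covariates and coefficients invented for illustration.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300
traffic = rng.uniform(0.5, 10.0, n)        # daily road traffic volume (thousands)
train_freq = rng.uniform(5, 60, n)         # trains per day
warning_device = rng.integers(0, 2, n)     # 1 = active warning device present

# Synthetic accident counts drawn from an assumed log-linear rate
rate = np.exp(-2.0 + 0.15 * traffic + 0.02 * train_freq - 0.5 * warning_device)
accidents = rng.poisson(rate)

X = sm.add_constant(np.column_stack([traffic, train_freq, warning_device]))
model = sm.GLM(accidents, X, family=sm.families.Poisson()).fit()
print(model.params)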

101

Analysis of Accidents Involving Breakaway-Cable-Terminal End Treatments.  

National Technical Information Service (NTIS)

Fifty accidents were analyzed. The primary data base consisted of Kentucky accident records for the years 1980-82. Results showed that the breakaway-cable-terminal end treatment performed properly in most accidents (72 percent). If trucks are excluded fro...

J. G. Pigman K. R. Agent T. Creasey

1984-01-01

102

Integral Test and Engineering Analysis of Coolant Depletion During a Large-Break Loss-of-Coolant Accident  

SciTech Connect

This study concerns the development of an integrated calculation methodology with which to continually and consistently analyze the progression of an accident from the design-basis accident phase via core uncovery to the severe accident phase. The depletion rate of reactor coolant inventory was experimentally investigated after the safety injection failure during a large-break loss-of-coolant accident utilizing the Seoul National University Integral Test Facility (SNUF), which is scaled down to 1/6.4 in length and 1/178 in area from the APR1400 [Advanced Power Reactor 1400 MW(electric)]. The experimental results showed that the core coolant inventory decreased five times faster before than after the extinction of sweepout in the reactor downcomer, which is induced by the incoming steam from the intact cold legs. The sweepout occurred on top of the spillover from the downcomer region and expedited depletion of the core coolant inventory. The test result was simulated with the MAAP4 severe accident analysis code. The calculation results of the original MAAP4 deviated from the test data in terms of coolant inventory distribution in the test vessel. After the calculation algorithm of coolant level distribution was improved by including the subroutine of pseudo pressure buildup, which accounts for the differential pressure between the core and downcomer in MAAP4, the core melt progression was delayed by hundreds of seconds, and the code prediction was in reasonable agreement with the overall behavior of the SNUF experiment.

Kim, Yong Soo; Park, Chang Hwan; Bae, Byoung Uhn; Park, Goon Cherl; Suh, Kune Yull; Lee, Un Chul [Seoul National University (Korea, Republic of)

2005-02-15

103

Development of severe accident analysis code - A study on the molten core-concrete interaction under severe accidents.  

National Technical Information Service (NTIS)

The purpose of this study is to understand the phenomena of the molten core/concrete interaction during the hypothetical severe accident, and to develop the model for heat transfer and physical phenomena in MCCIs. The contents of this study are analysis o...

C. H. Jung B. C. Lee C. W. Huh D. Y. Kim J. Y. Kim

1996-01-01

104

An analysis of evacuation options for nuclear accidents  

SciTech Connect

In this report we consider the threat posed by the accidental release of radionuclides from a nuclear power plant. The objective is to establish relationships between radiation dose and the cost of evacuation under a wide variety of conditions. The dose can almost always be reduced by evacuating the population from a larger area. However, extending the evacuation zone outward will cause evacuation costs to increase. The purpose of this analysis was to provide the Environmental Protection Agency (EPA) a data base for evaluating whether implementation costs and risks averted could be used to justify evacuation at lower doses. The procedures used and results of these analyses are being made available as background information for use by others. We develop cost/dose relationships for 54 scenarios that are based upon the severity of the reactor accident, meteorological conditions during the release of radionuclides into the environment, and the angular width of the evacuation zone. The 54 scenarios are derived from combinations of three accident severity levels, six meteorological conditions and evacuation zone widths of 70°, 90°, and 180°.

Tawil, J.J.; Strenge, D.L.; Schultz, R.W. [Battelle Memorial Inst., Richland, WA (United States)

1987-11-01

105

An Accident Precursor Analysis Process Tailored for NASA Space Systems  

NASA Technical Reports Server (NTRS)

Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

2010-01-01

106

Geographical information systems aided traffic accident analysis system case study: city of Afyonkarahisar.  

PubMed

Geographical Information System (GIS) technology has been a popular tool for visualization of accident data and analysis of hot spots on highways. Many traffic agencies have been using GIS for accident analysis. Accident analysis studies aim at the identification of high-rate accident locations and safety-deficient areas on the highways, so that traffic officials can implement precautionary measures and provisions for traffic safety. Since accident reports in Turkey are prepared in textual format, it is difficult to analyze accident results. In our study, we developed a system transforming these textual data into tabular form; the tabular data were then georeferenced onto the highways. The hot spots on the highways within the Afyonkarahisar administrative border were then explored and determined with two different methods, kernel density analysis and repeatability analysis, and accident conditions at these hot spots were examined. We found that the hot spots determined with the two methods reflect genuinely problematic places such as crossroads, junction points, etc. Many previous studies introduced GIS only as a visualization tool for accident locations. The importance of this study was to use GIS as a management system for accident analysis and determination of hot spots in Turkey with statistical analysis methods. PMID:18215546

Erdogan, Saffet; Yilmaz, Ibrahim; Baybura, Tamer; Gullu, Mevlut

2008-01-01
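
A sketch of kernel-density hot-spot detection on accident coordinates, using scikit-learn on synthetic points; the study itself applied GIS software to real georeferenced accident records.

import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(3)
# Synthetic projected accident coordinates (metres): a cluster near a junction plus scattered noise
cluster = rng.normal(loc=[5000.0, 5000.0], scale=150.0, size=(80, 2))
scattered = rng.uniform(0.0, 10_000.0, size=(120, 2))
points = np.vstack([cluster, scattered])

kde = KernelDensity(kernel="gaussian", bandwidth=300.0).fit(points)
density = np.exp(kde.score_samples(points))
hot = points[density > np.percentile(density, 90)]
print(f"{len(hot)} accident points fall in the top-decile density (hot-spot) areas")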

107

Summary of the SRS Severe Accident Analysis Program, 1987--1992  

SciTech Connect

The Severe Accident Analysis Program (SAAP) is a program of experimental and analytical studies aimed at characterizing severe accidents that might occur in the Savannah River Site Production Reactors. The goals of the Severe Accident Analysis Program are: To develop an understanding of severe accidents in SRS reactors that is adequate to support safety documentation for these reactors, including the Safety Analysis Report (SAR), the Probabilistic Risk Assessment (PRA), and other studies evaluating the safety of reactor operation; To provide tools and bases for the evaluation of existing or proposed safety related equipment in the SRS reactors; To provide bases for the development of accident management procedures for the SRS reactors; To develop and maintain on the site a sufficient body of knowledge, including documents, computer codes, and cognizant engineers and scientists, that can be used to authoritatively resolve questions or issues related to reactor accidents. The Severe Accident Analysis Program was instituted in 1987 and has already produced a substantial amount of information, and specialized calculational tools. Products of the Severe Accident Analysis Program (listed in Section 9 of this report) have been used in the development of the Safety Analysis Report (SAR) and the Probabilistic Risk Assessment (PRA), and in the development of technical specifications for the SRS reactors. A staff of about seven people is currently involved directly in the program and in providing input on severe accidents to other SRS activities.

Long, T.A.; Hyder, M.L.; Britt, T.E.; Allison, D.K.; Chow, S.; Graves, R.D.; DeWald, A.B. Jr.; Monson, P.R. Jr.; Wooten, L.A.

1992-11-01

108

Risk Analysis Methodologies for the Transportation of Radioactive Materials.  

National Technical Information Service (NTIS)

Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of ...

C. A. Geffen

1983-01-01

109

An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)  

NASA Technical Reports Server (NTRS)

A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots versus professional pilots. Private pilots, flying low cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that are a result of exacting missions or use of specialized equipment. For both groups judgement error is more likely to lead to a fatal accident than are other types of causes. Several approaches to improving the rotorcraft accident rate are recommended. These mostly address improvement in the training of new pilots and improving the safety awareness of private pilots.

Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

2002-01-01

110

Aircraft Accident Prevention: Loss-of-Control Analysis  

NASA Technical Reports Server (NTRS)

The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.

Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

2009-01-01

111

Radionuclide Analysis on Bamboos following the Fukushima Nuclear Accident  

PubMed Central

In response to contamination from the recent Fukushima nuclear accident, we conducted radionuclide analysis on bamboos sampled from six sites within a 25 to 980 km radius of the Fukushima Daiichi nuclear power plant. Maximum activity concentrations of radiocesium 134Cs and 137Cs in samples from Fukushima city, 65 km away from the Fukushima Daiichi plant, were in excess of 71 and 79 kBq/kg, dry weight (DW), respectively. In Kashiwa city, 195 km away from the Fukushima Daiichi, the sample concentrations were in excess of 3.4 and 4.3 kBq/kg DW, respectively. In Toyohashi city, 440 km away from the Fukushima Daiichi, the concentrations were below the measurable limits of up to 4.5 Bq/kg DW. In the radiocesium contaminated samples, the radiocesium activity was higher in mature and fallen leaves than in young leaves, branches and culms.

Higaki, Takumi; Higaki, Shogo; Hirota, Masahiro; Akita, Kae; Hasezawa, Seiichiro

2012-01-01

112

Improved Methodology Application for 12-Rad Analysis in a Shielded Facility at SRS  

SciTech Connect

The DOE Order 420.1 requires establishing 12-rad evacuation zone boundaries and installing Criticality Accident Alarm System (CAAS) per ANS-8.3 standard for facilities having a probability of criticality greater than 10^-6 per year. The H-Canyon at the Savannah River Site (SRS) is one of the reprocessing facilities where SRS reactor fuels, research reactor fuels, and other fissile materials are processed and purified using a modified Purex process called H-Modified or HM Process. This paper discusses an improved methodology for 12-rad zone analysis and its implementation within this large shielded facility that has a large variety of criticality sources and scenarios.

Paul, P.

2003-01-31

113

The accident analysis of mobile mine machinery in Indian opencast coal mines.  

PubMed

This paper presents an analysis of large mining machinery related accidents in Indian opencast coal mines. The trends of coal production, share of mining methods in production, machinery deployment in opencast mines, size and population of machinery, accidents due to machinery, and types and causes of accidents have been analysed for the years 1995 to 2008. The scrutiny of accidents during this period reveals that the factors most often responsible are machine reversal, haul road design, human fault, operator's fault, machine fault, visibility and dump design. Considering the types of machines, namely dumpers, excavators, dozers and loaders together, the maximum number of fatal accidents was caused by operator's faults and human faults jointly during the period from 1995 to 2008. The novel finding of this analysis is that large machines with state-of-the-art safety systems did not reduce the fatal accidents in Indian opencast coal mines. PMID:23324038

Kumar, R; Ghosh, A K

2014-03-01

114

Hazard categorization and accident analysis techniques for compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports  

SciTech Connect

The purpose of this DOE Standard is to establish guidance for facility managers and Program Secretarial Officers (PSOs) and thereby help them to comply consistently and more efficiently with the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports. To this end, this guidance provides the following practical information: (1) The threshold quantities of radiological material inventory below which compliance with DOE Order 5480.23 is not required. (2) The level of effort to develop the program plan and schedule required in Section 9.b. (2) of the Order, and information for making a preliminary assessment of facility hazards. (3) A uniform methodology for hazard categorization under the Order. (4) Insight into the ''graded approach'' for SAR development, especially in hazard assessment and accident analysis techniques. Individual PSOs may develop additional guidance addressing safety requirements for facilities which fall below the threshold quantities specified in this document.

None

1992-12-31

115

Otorhinolaryngologic disorders and diving accidents: an analysis of 306 divers.  

PubMed

Diving is a very popular leisure activity with an increasing number of participants. As more than 80% of diving-related problems involve the head and neck region, every otorhinolaryngologist should be familiar with diving medical standards. Here we present an analysis of more than 300 patients we have treated in the past four years. Between January 2002 and October 2005, 306 patients presented to our department with otorhinological disorders after diving, or after diving accidents. We collected the following data: name, sex, age, date of treatment, date of accident, diagnosis, special aspects of the diagnosis, number of dives, diving certification, whether and which surgery had been performed, history of acute diving accidents or follow-up treatment, assessment of fitness to dive and special remarks. The study setting was a retrospective cohort study. The distribution of the disorders was as follows: 24 divers (8%) with external ear disorders, 140 divers (46%) with middle ear disorders, 56 divers (18%) with inner ear disorders, 53 divers (17%) with disorders of the nose and sinuses, 24 divers (8%) with decompression illness (DCI) and 9 divers (3%) who complained of various symptoms. Only 18% of the divers presented with acute disorders. The most common disorder (24%) was Eustachian tube dysfunction. Female divers were significantly more often affected. Chronic sinusitis was found to be associated with a significantly higher number of performed dives. Conservative treatment failed in 30% of the patients but sinus surgery relieved symptoms in all patients of this group. The middle ear is the main problem area for divers. Middle ear ventilation problems due to Eustachian tube dysfunction can be treated conservatively with excellent results whereas pathology of the tympanic membrane and ossicular chain often require surgery. More than four out of five patients visited our department to re-establish their fitness to dive. Although the treatment of acute diving-related disorders is an important field for the treatment of divers, the main need of divers seems to be assessment and recovery of their fitness to dive. PMID:17639445

Klingmann, Christoph; Praetorius, Mark; Baumann, Ingo; Plinkert, Peter K

2007-10-01

116

Head Injuries in Child Pedestrian Accidents—In-Depth Case Analysis and Reconstructions  

Microsoft Academic Search

Objective: The aim of this study was to investigate head injuries, injury risks, and corresponding tolerance levels of children in car-to-child pedestrian collisions. Methods: An in-depth accident analysis was carried out based on 23 accident cases involving child pedestrians. These cases were collected with detailed information about pedestrians, cars, and road environments. All 23 accidents were reconstructed using the MADYMO program

Jianfeng Yao; Jikuang Yang; Dietmar Otte

2007-01-01

117

MACCS usage at Rocky Flats Plant for consequence analysis of postulated accidents  

SciTech Connect

The MELCOR Accident Consequence Code System (MACCS) has been applied to the radiological consequence assessment of potential accidents from a non-reactor nuclear facility. MACCS has been used in a variety of applications to evaluate radiological dose and health effects to the public from postulated plutonium releases and from postulated criticalities. These applications were conducted to support deterministic and probabilistic accident analyses for safety analysis reports, radiological sabotage studies, and other regulatory requests.

Foppe, T.L.; Peterson, V.L.

1993-10-01

118

PWR integrated safety analysis methodology using multi-level coupling algorithm  

NASA Astrophysics Data System (ADS)

Coupled three-dimensional (3D) neutronics/thermal-hydraulic (T-H) system codes give a unique opportunity for a realistic modeling of the plant transients and design basis accidents (DBA) occurring in light water reactors (LWR). Examples of such DBAs are the rod ejection accidents (REA) and the main steam line break (MSLB) that constitute the bounding safety problems for pressurized water reactors (PWR). These accidents involve asymmetric 3D spatial neutronic and T-H effects during the course of the transients. The thermal margins (the peak fuel temperature and the departure from nucleate boiling ratio (DNBR)) are the measures of safety for a particular transient and need to be evaluated as accurately as possible. Modern 3D neutronics/T-H coupled codes estimate the safety margins coarsely on an assembly level, i.e. for an average fuel pin. More accurate prediction of the safety margins requires the evaluation of the transient fuel rod response involving locally coupled neutronics/T-H calculations. The proposed approach is to perform an on-line hot-channel safety analysis not for the whole core but for a selected local region, for example for the highest-power-loaded fuel assembly. This approach becomes feasible if an on-line algorithm capable of extracting the necessary input data for a sub-channel module is available. The necessary input data include the detailed pin-power distributions and the T-H boundary conditions for each sub-channel in the considered problem. Therefore, two potential challenges are faced in the development of a refined methodology for evaluation of local safety parameters. One is the development of an efficient transient pin-power reconstruction algorithm with a consistent cross-section modeling. The second is the development of a multi-level coupling algorithm for the T-H boundary and feedback data exchange between the sub-channel module and the main 3D neutron kinetics/T-H system code, which already uses one level of coupling scheme between 3D neutronics and core thermal-hydraulics models. The major accomplishment of the thesis is the development of an integrated PWR safety analysis methodology with locally refined safety evaluations. This involved the introduction of an improved method capable of efficiently restoring the fine pin-power distribution with a high degree of accuracy. In order to apply the methodology to evaluate the safety margins on a pin level, a refined on-line hot-channel model was developed accounting for the cross-flow effects. Finally, this methodology was applied to best-estimate safety analysis to more accurately calculate the thermal safety margins occurring during a design basis accident in a PWR.
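
A schematic sketch of the multi-level coupling loop described above is given below, under assumed interfaces: `system_code`, `pin_reconstructor`, and `subchannel` are hypothetical objects standing in for the 3D system code, the pin-power reconstruction step, and the sub-channel hot-channel module. It illustrates the data exchange only and is not the thesis implementation.

```python
# Schematic sketch of the multi-level coupling loop (hypothetical interfaces,
# not the thesis code): the system code advances the assembly-level 3D
# neutronics/T-H solution, pin powers are reconstructed for the hottest
# assembly, and a sub-channel module evaluates the local thermal margins.

def transient_coupling_loop(system_code, pin_reconstructor, subchannel, t_end, dt):
    t = 0.0
    margins = []
    while t < t_end:
        core_state = system_code.advance(dt)                 # assembly-level step
        hot_asm = core_state.hottest_assembly()              # select local region
        pin_powers = pin_reconstructor.reconstruct(hot_asm)  # fine pin-power map
        bc = core_state.boundary_conditions(hot_asm)         # T-H boundary data
        local = subchannel.solve(pin_powers, bc, dt)         # hot-channel solve
        margins.append((t, local.min_dnbr, local.peak_fuel_temp))
        t += dt
    return margins
```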

Ziabletsev, Dmitri Nickolaevich

119

Social Network Analysis in Human Resource Development: A New Methodology  

Microsoft Academic Search

Through an exhaustive review of the literature, this article looks at the applicability of social network analysis (SNA) in the field of humanresource development. The literature review revealed that a number of disciplines have adopted this unique methodology, which has assisted in the development of theory. SNA is a methodology for examining the structure among actors, groups, and organizations and

John-Paul Hatala

2006-01-01

120

GPHS-RTG launch accident analysis for Galileo and Ulysses  

NASA Astrophysics Data System (ADS)

This paper presents the safety program conducted to determine the response of the General Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) to potential launch accidents of the Space Shuttle for the Galileo and Ulysses missions. NASA provided definition of the Shuttle potential accidents and characterized the environments. The Launch Accident Scenario Evaluation Program (LASEP) was developed by GE to analyze the RTG response to these accidents. RTG detailed response to Solid Rocket Booster (SRB) fragment impacts, as well as to other types of impact, was obtained from an extensive series of hydrocode analyses. A comprehensive test program was conducted also to determine RTG response to the accident environments. The hydrocode response analyses coupled with the test data base provided the broad range response capability which was implemented in LASEP.

Bradshaw, C. T.

121

Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures  

NASA Technical Reports Server (NTRS)

This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire failure RTO's, the Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTO's in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics with emphasis on failed-tire RTO's. This background information could enhance the split-second decision-making process that is required prior to initiating an RTO.

Batthauer, Byron E.

1987-01-01

122

Injury patterns of seniors in traffic accidents: A technical and medical analysis  

PubMed Central

AIM: To investigate the actual injury situation of seniors in traffic accidents and to evaluate the different injury patterns. METHODS: Injury data, environmental circumstances and crash circumstances of accidents were collected shortly after the accident event at the scene. With these data, a technical and medical analysis was performed, including Injury Severity Score, Abbreviated Injury Scale and Maximum Abbreviated Injury Scale. The method of data collection is named the German In-Depth Accident Study and can be seen as representative. RESULTS: A total of 4430 injured seniors in traffic accidents were evaluated. The incidence of sustaining severe injuries to extremities, head and maxillofacial region was significantly higher in the group of elderly people compared to a younger age (P < 0.05). The number of accident-related injuries was higher in the group of seniors compared to other groups. CONCLUSION: Seniors are more likely to be involved in traffic injuries and to sustain serious to severe injuries compared to other groups.

Brand, Stephan; Otte, Dietmar; Mueller, Christian Walter; Petri, Maximilian; Haas, Philipp; Stuebig, Timo; Krettek, Christian; Haasper, Carl

2012-01-01

123

Methodological Issues in the Content Analysis of Computer Conference Transcripts  

Microsoft Academic Search

This paper discusses the potential and the methodological challenges of analyzing computer conference transcripts using quantitative content analysis. The paper is divided into six sections, which discuss: criteria for content analysis, research designs, types of content, units of analysis, ethical issues, and software to aid analysis. The discussion is supported with a survey of 19 commonly referenced studies published during

Liam Rourke; Terry Anderson; D. R. Garrison; Walter Archer

2001-01-01

124

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; J. D. Harrison; F. T. Harper; S. C. Hora

1998-01-01

125

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; J. Boardman; J. A. Jones; F. T. Harper; M. L. Young; S. C. Hora

1997-01-01

126

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The

F. T. Harper; M. L. Young; L. A. Miller; S. C. Hora; C. H. Lui; L. H. J. Goossens; R. M. Cooke; J. Paesler-Sauer; J. C. Helton

1995-01-01

127

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; J. D. Harrison; F. T. Harper; S. C. Hora

1998-01-01

128

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

M. P. Little; C. R. Muirhead; L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; F. T. Harper; S. C. Hora

1997-01-01

129

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertain assessment. Volume 2: Appendices  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

M. P. Little; C. R. Muirhead; L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; F. T. Harper; S. C. Hora

1997-01-01

130

DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)  

SciTech Connect

This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work, primarily for enhanced geothermal systems (EGS), sponsored by the U.S. Department of Energy Geothermal Technologies Program.

Young, K. R.; Augustine, C.; Anderson, A.

2010-02-01

131

Site-specific meteorology identification for DOE facility accident analysis  

SciTech Connect

Currently, chemical dispersion calculations performed for safety analysis of DOE facilities assume a Pasquill D-Stability Class with a 4.5 m/s windspeed. These meteorological conditions are assumed to conservatively address the source term generation mechanism as well as the dispersion mechanism, thereby resulting in a net conservative downwind consequence. While choosing this Stability Class / Windspeed combination may result in an overall conservative consequence, the level of conservatism cannot be quantified. The intent of this paper is to document a methodology which incorporates site-specific meteorology to determine a quantifiable consequence of a chemical release. A five-year meteorological database, appropriate for the facility location, is utilized for these chemical consequence calculations, consistent with the approach used for radiological releases. The hourly averages of meteorological conditions have been binned into 21 groups for the chemical consequence calculations. These 21 cases each have a probability of occurrence based on the number of times each case has occurred over the five-year sampling period. A code has been developed which automates the running of all the cases with a commercially available air modeling code. The 21 cases are sorted by concentration. A concentration may be selected by the user for a quantified level of conservatism. The methodology presented is intended to improve the technical accuracy and defensibility of Chemical Source Term / Dispersion Safety Analysis work. The result improves the quality of safety analysis products without significantly increasing the cost.
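
A minimal sketch of the binning-and-selection idea described above is shown below, assuming a simple tabular layout of hourly stability-class/wind-speed records and pre-computed per-case concentrations; it is not the site-specific code referenced in the abstract.

```python
# Minimal sketch (assumed data layout, not the code described above): bin hourly
# stability-class/wind-speed records into cases, weight each case by its
# frequency of occurrence, and pick the modeled concentration at a chosen
# conservatism level (e.g. the 95th percentile).

from collections import Counter

def binned_cases(hourly_records):
    """hourly_records: iterable of (stability_class, windspeed_bin) tuples."""
    counts = Counter(hourly_records)
    total = sum(counts.values())
    return {case: n / total for case, n in counts.items()}

def concentration_at_level(case_probs, case_concentrations, level=0.95):
    """case_concentrations maps each case to its modeled downwind concentration."""
    ranked = sorted(case_probs, key=lambda c: case_concentrations[c])
    cumulative = 0.0
    for case in ranked:
        cumulative += case_probs[case]
        if cumulative >= level:
            return case_concentrations[case]
    return case_concentrations[ranked[-1]]

hours = [("D", "3-5 m/s")] * 60 + [("F", "1-2 m/s")] * 30 + [("A", ">5 m/s")] * 10
concs = {("D", "3-5 m/s"): 1.0, ("F", "1-2 m/s"): 4.0, ("A", ">5 m/s"): 0.3}
print(concentration_at_level(binned_cases(hours), concs, level=0.95))
```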

Rabin, S.B.

1995-09-01

132

Risks due to beyond design base accidents of nuclear power plants in Europe—the methodology of riskmap  

Microsoft Academic Search

International treaties on liability in the case of nuclear accidents set a limit on the repair payments to be made by the operators of nuclear power plants to countries adversely affected by nuclear fall-out which is independent of the actual risk incurred by the individual countries. A map of the risk due to beyond design base accidents of nuclear power

Iouli Andreev; Markus Hittenberger; Peter Hofer; Helga Kromp-Kolb; Wolfgang Kromp; Petra Seibert; Gerhard Wotawa

1998-01-01

133

Rail transportation risk and accident severity: A statistical analysis of variables in FRA's accident/incident data base  

SciTech Connect

The Federal Railroad Administration (US DOT) maintains a file of carrier-reported railroad accidents and incidents that meet stipulated threshold criteria for damage cost and/or casualties. A thoroughly-cleaned five-year time series of this data base was subjected to unbiased statistical procedures to discover (a) important causative variables in severe (high damage cost) accidents and (b) other key relationships between objective accident conditions and frequencies. Just under 6000 records, each representing a single event involving rail freight shipments moving on mainline track, were subjected to statistical frequency analysis, then included in the construction of classification and regression trees as described by Breiman et al. (1984). Variables related to damage cost defined the initial "splits," or branchings of the tree. An interesting implication of the results of this analysis with respect to transportation of hazardous wastes by rail is that movements should be avoided when ambient temperatures are extreme (significantly below 20°F or above 80°F), but that there should be no a priori bias against shipping wastes in longer train consists. 2 refs., 2 figs., 12 tabs.
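
The sketch below is an illustrative stand-in for the tree-building step, using scikit-learn's DecisionTreeRegressor in place of the original CART implementation of Breiman et al. (1984); the feature names and damage-cost values are hypothetical, not FRA data.

```python
# Illustrative stand-in (not the original analysis): a regression tree relating
# accident conditions to damage cost, in the spirit of the CART procedure of
# Breiman et al. (1984). Feature names and values are hypothetical, not FRA
# data. Requires scikit-learn.

from sklearn.tree import DecisionTreeRegressor, export_text

# Features: [ambient_temp_F, train_length_cars, speed_mph]; target: damage cost ($)
X = [[10, 80, 40], [95, 120, 35], [70, 60, 25], [15, 100, 50], [75, 90, 30]]
y = [250_000, 180_000, 20_000, 400_000, 35_000]

tree = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["ambient_temp_F", "train_length_cars", "speed_mph"]))
```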

Saricks, C.L. (Argonne National Lab., IL (USA). Energy Systems Div.); Janssen, I. (Argonne National Lab., IL (USA). Biological and Medical Research Div.)

1991-01-01

134

Status of Accident Analysis for Fast Breeder Reactors.  

National Technical Information Service (NTIS)

There are still considerable deficiencies in computational tools available even for following accidents to initial disassembly. Present indications are that such a disassembly will be mild, without much sensitivity of this result to modeling assumptions, ...

H. H. Hummel

1975-01-01

135

Collection and Analysis of Work Surface Accident Profile Data.  

National Technical Information Service (NTIS)

Accident Circumstance Profiles were developed to describe the tasks, work surfaces, footwear, and industries that characterize frequent and serious work surface related injuries. Injury statistics from worker's compensation records were analyzed, and fiel...

1977-01-01

136

Accident investigation: Analysis of aircraft motions from ATC radar recordings  

NASA Technical Reports Server (NTRS)

A technique was developed for deriving time histories of an aircraft's motion from air traffic control (ATC) radar records. This technique uses the radar range and azimuth data, along with the downlinked altitude data (from an onboard Mode-C transponder), to derive an expanded set of data which includes airspeed, lift, thrust-drag, attitude angles (pitch, roll, and heading), etc. This method of analyzing aircraft motions was evaluated through flight experiments which used the CV-990 research aircraft and recordings from both the enroute and terminal ATC radar systems. The results indicate that the values derived from the ATC radar records are for the most part in good agreement with the corresponding values obtained from airborne measurements. In an actual accident, this analysis of ATC radar records can complement the flight-data recorders, now onboard airliners, and provide a source of recorded information for other types of aircraft that are equipped with Mode-C transponders but not with onboard recorders.
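
A minimal sketch of the basic geometry implied above is given below, under a flat-earth assumption and with hypothetical radar returns: radar range/azimuth plus Mode-C altitude are converted to positions, and ground speed is estimated by finite differences. The NASA technique additionally derives lift, thrust-drag, and attitude angles, which are not reproduced here.

```python
# Minimal sketch under a flat-earth assumption with hypothetical radar returns:
# convert radar range/azimuth plus Mode-C altitude to Cartesian positions and
# estimate ground speed by finite differences between successive returns.

import math

def to_xyz(range_m, azimuth_deg, altitude_m):
    az = math.radians(azimuth_deg)
    return (range_m * math.sin(az), range_m * math.cos(az), altitude_m)

def ground_speeds(returns):
    """returns: list of (time_s, range_m, azimuth_deg, altitude_m) radar hits."""
    points = [(t, to_xyz(r, az, h)) for t, r, az, h in returns]
    speeds = []
    for (t0, p0), (t1, p1) in zip(points, points[1:]):
        dx, dy = p1[0] - p0[0], p1[1] - p0[1]
        speeds.append((t1, math.hypot(dx, dy) / (t1 - t0)))
    return speeds

hits = [(0.0, 20000.0, 45.0, 3000.0), (4.7, 20350.0, 45.6, 3010.0)]
print(ground_speeds(hits))
```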

Wingrove, R. C.

1976-01-01

137

BNL severe accident sequence experiments and analysis program  

SciTech Connect

A major source of containment pressurization during severe accidents is the transfer of stored energy from the hot core material to available cooling water. One mode of thermal interaction involves the quench of superheated beds of debris which could be present in the reactor cavity following melt-through or failure of the reactor vessel. This work supports development of models of superheated bed quench phenomena which are to be incorporated into containment analysis computer codes such as MARCH, CONTAIN, and MEDICI. A program directed towards characterization of the behavior of superheated debris beds has been completed. This work addressed the quench of superheated debris which is postulated to exist in the reactor cavity of a PWR following melt ejection from the primary system. The debris is assumed to be cooled by a pool of water overlying the bed of hot debris. This work has led to the development of models to predict the rate of steam generation during the quench process and, in addition, the ability to assess the coolability of the debris during the transient quench process. A final report on this work has been completed. This report presents a brief description of some relevant results and conclusions. 15 refs.

Greene, G.A.; Ginsberg, T.; Tutu, N.K.

1985-01-01

138

Analysis Methodology for Semiconductor Yield by Data Mining  

Microsoft Academic Search

The conventional semiconductor yield analysis is a hypothesis verification process, which heavily depends on engineers' knowledge. Data mining methodology, on the other hand, is a hypothesis discovery process that is free from this constraint. This paper proposes a data mining method for semiconductor yield analysis, which consists of the following two phases: discovering hypothetical failure causes by regression tree analysis

Hidetaka Tsuda; Hidehiro Shirai; Masahiro Terabe; Kazuo Hashimoto; Ayumi Shinohara

2009-01-01

139

Analysis and methodology for aeronautical systems technology program planning  

NASA Technical Reports Server (NTRS)

A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
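
A toy sketch of the rank-ordering step described above follows, with hypothetical concept names, benefits, and costs: concepts are sorted by benefit-to-cost ratio and the cumulative ratio is reported in the resulting order of implementation.

```python
# Toy sketch of the rank-ordering step (hypothetical concepts and values):
# concepts are sorted by benefit-to-cost ratio and the cumulative ratio is
# reported in the resulting order of implementation.

concepts = {                      # name: (benefit, cost) in consistent units
    "concept_A": (120.0, 30.0),
    "concept_B": (80.0, 40.0),
    "concept_C": (60.0, 10.0),
}

ranked = sorted(concepts.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
cum_benefit = cum_cost = 0.0
for name, (benefit, cost) in ranked:
    cum_benefit += benefit
    cum_cost += cost
    print(f"{name}: B/C = {benefit / cost:.2f}, "
          f"cumulative B/C = {cum_benefit / cum_cost:.2f}")
```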

White, M. J.; Gershkoff, I.; Lamkin, S.

1983-01-01

140

Analysis of fission product revaporization in a BWR reactor cooling system during a station blackout accident  

SciTech Connect

A preliminary analysis of the revaporization of volatile fission products from a boiling water reactor (BWR) cooling system following a core meltdown accident in which the core debris penetrates the reactor vessel has been performed. The BWR analyzed has a Mark I containment and the accident sequence was a station blackout transient. This work was performed as part of the phenomenological uncertainty study of the Quantification and Uncertainty Analysis of Source Terms for Severe Accidents program at Brookhaven National Laboratory. Fission product revaporization was identified as one of the important issues in the Reactor Risk Reference Document.

Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

1988-01-01

141

Human Factors Root Cause Analysis of Accidents/Incidents Involving Remote Control Locomotive Operations.  

National Technical Information Service (NTIS)

This report presents findings from a human factors root cause analysis (RCA) of six train accidents/incidents (collisions, derailments, and employee injuries) that involved remote control locomotive (RCL) operations in U.S. railroad switching yards. Descripti...

S. Reinach; A. Viale

2006-01-01

142

Analysis of U. S. Navy Major Aircraft Accident Rates by Aircraft Type.  

National Technical Information Service (NTIS)

An analysis of U. S. Navy major aircraft accidents during the period Fiscal Year 1972-1974 was conducted. Forward (stepwise) Multiple Regression techniques were employed on a group of ten basic variables considered time dependent. The multiple regression ...

G. F. Johnson

1976-01-01

143

Statistical Analysis of Accident Data as a Basis for Planning Selective Enforcement - Phase II.  

National Technical Information Service (NTIS)

The report covers the second year's activities of the project, 'Analysis of Accident Data as a Basis for Planning Selective Enforcement.' Three independent topics are covered in this report. The first topic is concerned with the development of a manpower ...

G. R. Fisher; W. W. Mosher

1967-01-01

144

Safety analysis results for cryostat ingress accidents in ITER  

SciTech Connect

Accidents involving the ingress of air or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

Merrill, B.J.; Cadwallader, L.C.; Petti, D.A.

1996-12-31

145

Safety analysis results for cryostat ingress accidents in ITER  

SciTech Connect

Accidents involving the ingress of air, helium, or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits. 6 refs., 2 figs., 1 tab.

Merrill, B.J.; Cadwallader, L.C.; Petti, D.A. [Idaho National Engineering Lab., ID (United States)]

1997-06-01

146

Safety Analysis Results for Cryostat Ingress Accidents in ITER  

NASA Astrophysics Data System (ADS)

Accidents involving the ingress of air, helium, or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

Merrill, B. J.; Cadwallader, L. C.; Petti, D. A.

1997-06-01

147

Fast Transient And Spatially Non-Homogenous Accident Analysis Of Two-Dimensional Cylindrical Nuclear Reactor  

NASA Astrophysics Data System (ADS)

A study of fast transient and spatially non-homogeneous accident behavior in a two-dimensional cylindrical nuclear reactor has been performed. The aim of the research is to predict reactor behavior during an accident. In the present study, the space-time diffusion equation is solved using direct methods that treat the spatial factor in detail during the nuclear reactor accident simulation. The set of equations obtained from a fully implicit finite-difference discretization is solved with the ADI (Alternating Direction Implicit) iterative method. The accident is represented by a decrease in the macroscopic absorption cross-section, which produces a large external reactivity insertion. The reactor power reaches a peak value before the reactor settles into a new equilibrium condition. The change in reactor temperature produces a negative Doppler feedback reactivity, which reduces the excess positive reactivity. The reactor temperature during the accident remains below the fuel melting point, so the reactor stays in a safe condition.
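
For reference, a common one-group form of the space-time diffusion equation with delayed-neutron precursors in two-dimensional (r, z) cylindrical geometry is shown below; the exact group structure and notation of the paper may differ and are assumed here.

```latex
\frac{1}{v}\frac{\partial \phi}{\partial t}
  = \frac{1}{r}\frac{\partial}{\partial r}\!\left(r D \frac{\partial \phi}{\partial r}\right)
  + \frac{\partial}{\partial z}\!\left(D \frac{\partial \phi}{\partial z}\right)
  + \left[(1-\beta)\,\nu\Sigma_f - \Sigma_a\right]\phi
  + \sum_i \lambda_i C_i ,
\qquad
\frac{\partial C_i}{\partial t} = \beta_i\,\nu\Sigma_f\,\phi - \lambda_i C_i .
```

A fully implicit time discretization of these equations yields banded linear systems that an ADI iteration can sweep alternately in the r and z directions.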

Yulianti, Yanti; Su'Ud, Zaki; Waris, Abdul; Khotimah, S. N.; Shafii, M. Ali

2010-12-01

148

Mathematical analysis of super-resolution methodology  

Microsoft Academic Search

The attainment of super resolution (SR) from a sequence of degraded undersampled images could be viewed as reconstruction of the high-resolution (HR) image from a finite set of its projections on a sampling lattice. This can then be formulated as an optimization problem whose solution is obtained by minimizing a cost function. The approaches adopted and their analysis to solve

M. K. Ng; N. K. Bose

2003-01-01

149

A methodology for probabilistic fault displacement hazard analysis (PFDHA)  

USGS Publications Warehouse

We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
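
In the first (earthquake-based) approach described above, the displacement hazard takes the same general form as the PSHA hazard integral, with the ground-motion attenuation function replaced by a fault displacement attenuation term. A schematic single-source form is shown below; the notation is assumed here rather than copied from the paper.

```latex
\nu(d) \;=\; \lambda \int_{m_0}^{m_{\max}} f_M(m)
  \int_0^{r_{\max}} f_{R \mid M}(r \mid m)\;
  P\!\left[D > d \mid m, r\right] \, dr \, dm
```

Here ν(d) is the annual rate of exceeding displacement d, λ is the rate of earthquakes above the minimum magnitude m_0, and P[D > d | m, r] folds together the conditional probability that surface displacement occurs at the site and the distribution of its amplitude.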

Youngs, R. R.; Arabasz, W. J.; Anderson, R. E.; Ramelli, A. R.; Ake, J. P.; Slemmons, D. B.; McCalpin, J. P.; Doser, D. I.; Fridrich, C. J.; Swan, III, F. H.; Rogers, A. M.; Yount, J. C.; Anderson, L. W.; Smith, K. D.; Bruhn, R. L.; Knuepfer, P. L. K.; Smith, R. B.; DePolo, C. M.; O'Leary, D. W.; Coppersmith, K. J.; Pezzopane, S. K.; Schwartz, D. P.; Whitney, J. W.; Olig, S. S.; Toro, G. R.

2003-01-01

150

Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence  

NASA Technical Reports Server (NTRS)

Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

2004-01-01

151

Cost benefit analysis of information systems: a survey of methodologies  

Microsoft Academic Search

Cost justification has become one of the most important factors influencing the pace of business automation, particularly end user computing. The primary difficulty in cost justification is the evaluation of benefits. This paper identifies and discusses eight methodologies which have evolved to quantify the benefits of information systems. These are: decision analysis, cost displacement/avoidance, structural models, cost-effectiveness analysis,

Peter G. Sassone

1988-01-01

152

ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS  

SciTech Connect

This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, ''K Basins Safety Analysis Report'' and Revision 4 of HNF-SD-SNF-TSR-001, ''Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins''. These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

WILLIAMS, J.C.

2003-11-15

153

Sensitivity analysis of radionuclides atmospheric dispersion following the Fukushima accident  

NASA Astrophysics Data System (ADS)

Atmospheric dispersion models are used in response to accidental releases with two purposes: (1) minimising the population exposure during the accident, and (2) complementing field measurements for the assessment of short- and long-term environmental and sanitary impacts. The predictions of these models are subject to considerable uncertainties of various origins. Notably, input data, such as meteorological fields or estimations of emitted quantities as a function of time, are highly uncertain. The case studied here is the atmospheric release of radionuclides following the Fukushima Daiichi disaster. The model used in this study is Polyphemus/Polair3D, from which IRSN's operational long-distance atmospheric dispersion model ldX is derived. A sensitivity analysis was conducted in order to estimate the relative importance of a set of identified uncertainty sources. The complexity of this task was increased by four characteristics shared by most environmental models: high-dimensional inputs; correlated inputs or inputs with complex structures; high-dimensional output; and a multiplicity of purposes that require sophisticated and non-systematic post-processing of the output. The sensitivities of a set of outputs were estimated with the Morris screening method. The input ranking was highly dependent on the considered output. Yet, a few variables, such as the horizontal diffusion coefficient or cloud thickness, were found to have a weak influence on most of them and could be discarded from further studies. The sensitivity analysis procedure was also applied to indicators of the model performance computed on a set of gamma dose rate observations. This original approach is of particular interest since observations could be used later to calibrate the probability distributions of the input variables. Indeed, only the variables that are influential on performance scores are likely to allow for calibration. An indicator based on the time matching of emission peaks was developed in order to complement classical statistical scores, which were dominated by deposit dose rates and almost insensitive to lower-atmosphere dose rates. The substantial sensitivity of these performance indicators is auspicious for future calibration attempts and indicates that the simple perturbations used here may be sufficient to represent an essential part of the overall uncertainty.
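
A minimal sketch of one-at-a-time screening in the spirit of the Morris method (a radial OAT variant) is shown below, with a toy stand-in for the dispersion model; the input dimensions, ranges, and response are hypothetical.

```python
# Minimal sketch of one-at-a-time screening in the spirit of the Morris method
# (a radial OAT variant). The three-input toy model below stands in for the
# dispersion model; dimensions, ranges, and the response are hypothetical.

import random

def model(x):
    return 2.0 * x[0] + x[1] ** 2 + 0.1 * x[2]   # hypothetical response

def morris_screening(model, dim, n_traj=20, delta=0.1):
    effects = [[] for _ in range(dim)]
    for _ in range(n_traj):
        x = [random.uniform(0.0, 1.0 - delta) for _ in range(dim)]
        base = model(x)
        for i in range(dim):
            xp = list(x)
            xp[i] += delta
            effects[i].append((model(xp) - base) / delta)
    summary = []
    for ee in effects:
        mu_star = sum(abs(e) for e in ee) / len(ee)
        mean = sum(ee) / len(ee)
        sigma = (sum((e - mean) ** 2 for e in ee) / len(ee)) ** 0.5
        summary.append((mu_star, sigma))
    return summary

for i, (mu_star, sigma) in enumerate(morris_screening(model, dim=3)):
    print(f"input {i}: mu* = {mu_star:.3f}, sigma = {sigma:.3f}")
```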

Girard, Sylvain; Korsakissok, Irène; Mallet, Vivien

2014-05-01

154

Analysis of traffic accidents on rural highways using Latent Class Clustering and Bayesian Networks.  

PubMed

One of the principal objectives of traffic accident analyses is to identify key factors that affect the severity of an accident. However, with the presence of heterogeneity in the raw data used, the analysis of traffic accidents becomes difficult. In this paper, Latent Class Cluster (LCC) is used as a preliminary tool for segmentation of 3229 accidents on rural highways in Granada (Spain) between 2005 and 2008. Next, Bayesian Networks (BNs) are used to identify the main factors involved in accident severity for both the entire database (EDB) and the clusters previously obtained by LCC. The results of these cluster-based analyses are compared with the results of a full-data analysis. The results show that the combined use of both techniques is very interesting as it reveals further information that would not have been obtained without prior segmentation of the data. BN inference is used to obtain the variables that best identify accidents involving people killed or seriously injured. Accident type and sight distance have been identified in all the cases analysed; other variables such as time, occupants involved or age are identified in the EDB and only in one cluster; whereas the variables vehicles involved, number of injuries, atmospheric factors, pavement markings and pavement width are identified in only one cluster. PMID:23182777
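
The sketch below illustrates the segment-then-analyse idea only: a Gaussian mixture is used as a simple stand-in for Latent Class Clustering, and the per-cluster step is reduced to a severity-rate summary rather than the Bayesian Network fitting used in the paper; features and data are synthetic.

```python
# Illustrative sketch of the segment-then-analyse idea only: a Gaussian mixture
# stands in for Latent Class Clustering, and the per-cluster step is reduced to
# a severity-rate summary rather than the Bayesian Network fitting used in the
# paper. Features and data are synthetic. Requires numpy and scikit-learn.

import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Columns: vehicles involved, sight distance [m], hour of day (all hypothetical)
X = np.column_stack([
    rng.integers(1, 4, 300),
    rng.uniform(50, 500, 300),
    rng.integers(0, 24, 300),
])
severe = rng.integers(0, 2, 300)   # 1 = killed or seriously injured

labels = GaussianMixture(n_components=3, random_state=0).fit_predict(X)
for k in range(3):
    mask = labels == k
    print(f"cluster {k}: {mask.sum()} accidents, severe rate = {severe[mask].mean():.2f}")
```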

de Oña, Juan; López, Griselda; Mujalli, Randa; Calvo, Francisco J

2013-03-01

155

Analysis of Kuosheng Station Blackout Accident Using MELCOR 1.8.4  

SciTech Connect

The MELCOR code, developed by Sandia National Laboratories, is a fully integrated, relatively fast-running code that models the progression of severe accidents in commercial light water nuclear power plants (NPPs). A specific station blackout (SBO) accident for Kuosheng (BWR-6) NPP is simulated using the MELCOR 1.8.4 code. The MELCOR input deck for Kuosheng NPP is established based on Kuosheng NPP design data and the MELCOR users' guides. The initial steady-state conditions are generated with a developed self-initialization algorithm. The main severe accident phenomena and the fission product release fractions associated with the SBO accident were simulated. The predicted results are plausible and as expected in light of current understanding of severe accident phenomena. The uncertainty of this analysis is briefly discussed. The important features of the MELCOR 1.8.4 are described. The estimated results provide useful information for the probabilistic risk assessment (PRA) of Kuosheng NPP. This tool will be applied to the PRA, the severe accident analysis, and the severe accident management study of Kuosheng NPP in the near future.

Wang, S.-J.; Chien, C.-S.; Wang, T.-C.; Chiang, K.-S

2000-11-15

156

Uncertainty Analyses Using the MELCOR Severe Accident Analysis Code  

Microsoft Academic Search

The MELCOR code is a detailed system level computer code capable of performing integrated self- consistent analyses of severe accident progression in commercial nuclear power plants, supporting level 2 probablilistic risk assessment (PRA) studies. Originally developed as a fast running tool with simplified models for performing probabilistic safety analyses and sensitivity studies, MELCOR now employs largely best estimate models for

Randall O. Gauntt

157

Extension of ship accident analysis to multiple-package shipments.  

National Technical Information Service (NTIS)

Severe ship accidents and the probability of radioactive material (RAM) release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one ...

G. S. Mills; K. S. Neuhauser

1997-01-01

158

Extension of ship accident analysis to multiple-package shipments  

Microsoft Academic Search

Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO

G. S. Mills; K. S. Neuhauser

1997-01-01

159

Hypothetical accident conditions thermal analysis of the 5320 package.  

National Technical Information Service (NTIS)

An axisymmetric model of the 5320 package was created to perform hypothetical accident conditions (HAC) thermal calculations. The analyses assume the 5320 package contains 359 grams of plutonium-238 (203 Watts) in the form of an oxide powder at a minimum ...

S. J. Hensel; R. J. Gromada

1995-01-01

160

Discourse analysis: an exploration of methodological issues and a call for methodological courage in the field of policy analysis  

Microsoft Academic Search

In light of the frequent, yet often divergent, uses of the term ‘discourse analysis’ and the recurrent misunderstandings associated with it, this article seeks to clarify the central postulates of poststructuralist discourse theory and to raise critical methodological issues associated with it. More specifically, this article explores the usefulness of discourse theory for the analysis of policy (discourses) by way

Katharina T. Paul

2009-01-01

161

NMR methodologies in the analysis of blueberries.  

PubMed

An NMR analytical protocol based on complementary high and low field measurements is proposed for blueberry characterization. Untargeted NMR metabolite profiling of blueberries aqueous and organic extracts as well as targeted NMR analysis focused on anthocyanins and other phenols are reported. Bligh-Dyer and microwave-assisted extractions were carried out and compared showing a better recovery of lipidic fraction in the case of microwave procedure. Water-soluble metabolites belonging to different classes such as sugars, amino acids, organic acids, and phenolic compounds, as well as metabolites soluble in organic solvent such as triglycerides, sterols, and fatty acids, were identified. Five anthocyanins (malvidin-3-glucoside, malvidin-3-galactoside, delphinidin-3-glucoside, delphinidin-3-galactoside, and petunidin-3-glucoside) and 3-O-α-L-rhamnopyranosyl quercetin were identified in solid phase extract. The water status of fresh and withered blueberries was monitored by portable NMR and fast-field cycling NMR. 1H depth profiles, T2 transverse relaxation times and dispersion profiles were found to be sensitive to the withering. PMID:24668393

Capitani, Donatella; Sobolev, Anatoly P; Delfini, Maurizio; Vista, Silvia; Antiochia, Riccarda; Proietti, Noemi; Bubici, Salvatore; Ferrante, Gianni; Carradori, Simone; De Salvador, Flavio Roberto; Mannina, Luisa

2014-06-01

162

Four applications of a software data collection and analysis methodology  

NASA Technical Reports Server (NTRS)

The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

Basili, Victor R.; Selby, Richard W., Jr.

1985-01-01

163

Advanced Power Plant Development and Analysis Methodologies  

SciTech Connect

Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

2006-06-30

164

A Global Sensitivity Analysis Methodology for Multi-physics Applications  

SciTech Connect

Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies both to physical experiments and to computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
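
A schematic sketch of the three-step workflow (credible ranges, screening, quantitative analysis on the reduced set) follows, using a toy response function and crude estimators; it is not the PSUADE implementation referenced above.

```python
# Schematic sketch of the three-step workflow (credible ranges -> screening ->
# quantitative analysis on the reduced set) with a toy response and crude
# estimators; not the PSUADE implementation referenced above.

import random

def simulator(x):
    return 4.0 * x["a"] + 0.05 * x["b"] + x["c"] ** 2   # hypothetical response

ranges = {"a": (0.0, 1.0), "b": (0.0, 1.0), "c": (0.0, 1.0)}          # step 1

def screen(ranges, keep=2):                                            # step 2
    mid = {k: (lo + hi) / 2 for k, (lo, hi) in ranges.items()}
    base = simulator(mid)
    effect = {}
    for k, (lo, hi) in ranges.items():
        pert = dict(mid)
        pert[k] = hi
        effect[k] = abs(simulator(pert) - base)
    return sorted(effect, key=effect.get, reverse=True)[:keep]

def first_order_index(name, ranges, n=2000, n_bins=10):                # step 3
    draws = []
    for _ in range(n):
        x = {k: random.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
        draws.append((x[name], simulator(x)))
    ys = [y for _, y in draws]
    mean_y = sum(ys) / n
    var_y = sum((y - mean_y) ** 2 for y in ys) / n
    lo, hi = ranges[name]
    bins = [[] for _ in range(n_bins)]
    for xv, y in draws:
        bins[min(n_bins - 1, int(n_bins * (xv - lo) / (hi - lo)))].append(y)
    means = [sum(b) / len(b) for b in bins if b]
    grand = sum(means) / len(means)
    return (sum((m - grand) ** 2 for m in means) / len(means)) / var_y

kept = screen(ranges)
print("retained after screening:", kept)
for k in kept:
    print(k, "approximate first-order index:", round(first_order_index(k, ranges), 2))
```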

Tong, C H; Graziani, F R

2007-02-02

165

Thermal-stress analysis of a Fort St. Vrain core-support block under accident conditions  

SciTech Connect

A thermoelastic stress analysis of a graphite core support block in the Fort St. Vrain High Temperature Gas Cooled Reactor is described. The support block is subjected to thermal stresses caused by a loss of forced circulation accident of the reactor system. Two- and three-dimensional finite element models of the core support block are analyzed using the ADINAT and ADINA codes, and results are given that verify the integrity of this structural component under the given accident condition.

Carruthers, L.M.; Butler, T.A.; Anderson, C.A.

1982-01-01

166

'Doing' health policy analysis: methodological and conceptual reflections and challenges  

PubMed Central

The case for undertaking policy analysis has been made by a number of scholars and practitioners. However, there has been much less attention given to how to do policy analysis, what research designs, theories or methods best inform policy analysis. This paper begins by looking at the health policy environment, and some of the challenges to researching this highly complex phenomenon. It focuses on research in middle and low income countries, drawing on some of the frameworks and theories, methodologies and designs that can be used in health policy analysis, giving examples from recent studies. The implications of case studies and of temporality in research design are explored. Attention is drawn to the roles of the policy researcher and the importance of reflexivity and researcher positionality in the research process. The final section explores ways of advancing the field of health policy analysis with recommendations on theory, methodology and researcher reflexivity.

Walt, Gill; Shiffman, Jeremy; Schneider, Helen; Murray, Susan F; Brugha, Ruairi; Gilson, Lucy

2008-01-01

167

Analysis of general-aviation accidents using ATC radar records  

NASA Technical Reports Server (NTRS)

It is pointed out that general aviation aircraft usually do not carry flight recorders, and in accident investigations the only available data may come from the Air Traffic Control (ATC) records. A description is presented of a technique for deriving time-histories of aircraft motions from ATC radar records. The employed procedure involves a smoothing of the raw radar data. The smoothed results, in combination with other available information (meteorological data and aircraft aerodynamic data) are used to derive the expanded set of motion time-histories. Applications of the considered analytical methods are related to different types of aircraft, such as light piston-props, executive jets, and commuter turboprops, as well as different accident situations, such as takeoff, climb-out, icing, and deep stall.

Wingrove, R. C.; Bach, R. E., Jr.

1982-01-01

168

Analysis of containment performance and radiological consequences under severe accident conditions for the Advanced Neutron Source Reactor at the Oak Ridge National Laboratory  

SciTech Connect

A severe accident study was conducted to evaluate conservatively scoped source terms and radiological consequences to support the Advanced Neutron Source (ANS) Conceptual Safety Analysis Report (CSAR). Three different types of severe accident scenarios were postulated with a view of evaluating conservatively scoped source terms. The first scenario evaluates maximum possible steaming loads and associated radionuclide transport, whereas the next scenario is geared towards evaluating conservative containment loads from releases of radionuclide vapors and aerosols with associated generation of combustible gases. The third scenario follows the prescriptions given by the 10 CFR 100 guidelines. It was included in the CSAR for demonstrating site-suitability characteristics of the ANS. Various containment configurations are considered for the study of thermal-hydraulic and radiological behaviors of the ANS containment. Severe accident mitigative design features such as the use of rupture disks were accounted for. This report describes the postulated severe accident scenarios, methodology for analysis, modeling assumptions, modeling of several severe accident phenomena, and evaluation of the resulting source term and radiological consequences.

Kim, S.H.; Taleyarkhan, R.P.

1994-01-01

169

THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT  

SciTech Connect

Surplus plutonium bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of the DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for a long term of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.

Gupta, N.

2011-02-14

170

Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology  

ERIC Educational Resources Information Center

The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

2004-01-01

171

Q-Analysis: A Methodology for Librarianship and Information Science.  

ERIC Educational Resources Information Center

Introduces Q-analysis, a methodology for investigating a wide range of structural phenomena that defines structures in terms of relations between members of sets and reveals their salient features by using techniques of algebraic topology. Applications relevant to librarianship and information science are reviewed and present limitations…

Davies, Roy

1985-01-01

172

Cost prediction using decision/risk analysis methodologies

Microsoft Academic Search

The ability to make good cost predictions is a very important aspect of the construction process. The methods used have historically relied on subjective judgements based on data which have proved satisfactory. The formal methodologies of decision/risk analysis have been little used in construction. This paper looks at how the use of decision trees, utility theory and the Monte Carlo

James Birnie; Alan Yates

1991-01-01

173

Methodology for Analysis of IAI District Level Data Bases.  

ERIC Educational Resources Information Center

Instructional Accomplishment Information (IAI) Systems data bases provide the opportunity for new and powerful studies relevant to educational policy issues at a local and/or national level. This report discusses the methodology for "schooling policy studies." The procedures are illustrated using a yet-to-be-completed analysis of the Los Angeles…

Milazzo, Patricia; And Others

174

Human Error and Commercial Aviation Accidents: A Comprehensive, Fine-Grained Analysis Using HFACS (Human Factors Analysis and Classification System).  

National Technical Information Service (NTIS)

The Human Factors Analysis and Classification System (HFACS) is a theoretically based tool for investigating and analyzing human error associated with accidents and incidents. Previous research has shown that HFACS can be reliably used to identify general...

A. Boquet C. Detwiler C. Hackworth K. Holcomb S. Shappell

2006-01-01

175

Human Error and General Aviation Accidents: A Comprehensive, Fine-Grained Analysis Using HFACS (Human Factors Analysis and Classification System).  

National Technical Information Service (NTIS)

The Human Factors Analysis and Classification System (HFACS) is a theoretically based tool for investigating and analyzing human error associated with accidents and incidents. Previous research performed at both the University of Illinois and the Civil Ae...

A. Boquet C. Detwiler D. Wiegmann K. Holcomb T. Faaborg

2005-01-01

176

Siting MSW landfills with a spatial multiple criteria analysis methodology.  

PubMed

The present work describes a spatial methodology which comprises several methods from different scientific fields such as multiple criteria analysis, geographic information systems, spatial analysis and spatial statistics. The final goal of the methodology is to evaluate the suitability of the study region in order to optimally site a landfill. The initial step is the formation of the multiple criteria problem's hierarchical structure. Then the methodology utilizes spatial analysis processes to create the evaluation criteria, which are mainly based on Greek and EU legislation, but are also based on international practice and practical guidelines. The relative importance weights of the evaluation criteria are estimated using the analytic hierarchy process. With the aid of the simple additive weighting method, the suitability for landfill siting of the study region is finally evaluated. The resulting land suitability is reported on a grading scale of 0-10, which is, respectively, from least to most suitable areas. The last step is a spatial clustering process, which is being performed in order to reveal the most suitable areas, allowing an initial ranking and selection of candidate landfill sites. The application of the presented methodology in the island of Lemnos in the North Aegean Sea (Greece) indicated that 9.3% of the study region is suitable for landfill siting with grading values greater than 9. PMID:15946837
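For readers unfamiliar with the two aggregation steps named in this abstract, the following minimal Python sketch combines an analytic hierarchy process weighting with simple additive weighting on a small hypothetical grid; the pairwise-comparison matrix and criterion scores are invented and do not reproduce the study's criteria.

# Minimal sketch of AHP weighting followed by simple additive weighting (SAW),
# the two aggregation steps named above. The 3x3 pairwise-comparison matrix and
# the per-criterion suitability grids are hypothetical.
import numpy as np

# AHP: principal eigenvector of the pairwise comparison matrix gives the weights.
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])      # e.g. hydrology vs. land use vs. distance to roads
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                        # normalized criterion weights

# Consistency ratio check (random index for n=3 is 0.58).
lam_max = np.real(eigvals).max()
ci = (lam_max - 3) / (3 - 1)
cr = ci / 0.58

# SAW: weighted sum of criterion scores (0-10) on a small raster of candidate cells.
rng = np.random.default_rng(1)
criteria = rng.uniform(0, 10, size=(3, 4, 4))         # 3 criteria on a 4x4 grid
suitability = np.tensordot(w, criteria, axes=1)       # 0-10 composite suitability
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
print(np.round(suitability, 1))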

Kontos, Themistoklis D; Komilis, Dimitrios P; Halvadakis, Constantinos P

2005-01-01

177

Traffic Analysis Toolbox Volume II. Decision Support Methodology for Selecting Traffic Analysis Tools.  

National Technical Information Service (NTIS)

This report provides an overview of the role of traffic analysis tools in the transportation analysis process and provides a detailed decision support methodology for selecting the appropriate type of analysis tool for the job at hand. An introduction to ...

K. Jeannotte A. Chandra V. Alexiadis A. Skabardonis

2004-01-01

178

Methodological Variability Using Electronic Nose Technology For Headspace Analysis  

NASA Astrophysics Data System (ADS)

Since the idea of electronic noses was published, numerous electronic nose (e-nose) developments and applications have been used in analyzing solid, liquid and gaseous samples in the food and automotive industry or for medical purposes. However, little is known about methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, using different filters and changes in mass flow rates are described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

Knobloch, Henri; Turner, Claire; Spooner, Andrew; Chambers, Mark

2009-05-01

179

National Center for Statistics and Analysis Collected Technical Studies, Volume 2. Accident Data Analysis of Occupant Injuries and Crash Characteristics - Eight Papers.  

National Technical Information Service (NTIS)

The eight papers in this volume analyze accident data to address various questions concerning traffic accident injuries. All eight were written by members of the Crashworthiness Group, Mathematical Analysis Division, National Center for Statistics and Ana...

D. Najjar N. Bondy S. Partyka

1981-01-01

180

A posteriori error analysis with applications to finite element methodology and finite difference methodology  

SciTech Connect

A problem of continued interest for many years has been the capacitance of the regular polyhedra. A new numerical value for each of these regular solids is presented, obtained by employing a finite difference technique. In addition, the theory of an established method is presented for computing point-wise truncation error estimates a posteriori. This method is applied to the capacitance solution of two of the regular polyhedra, the tetrahedron and the cube, to give point-wise error estimates for the solution. A new technique for extending the error analysis method to finite element methodology is introduced and applications given for plane structural problems on a clamped plate and a cantilevered plate. The latter of these problems involves mixed boundary conditions, which are rarely addressed by error analysis methods. An important feature of the error analysis method presented is that the results do not require the solution of a higher order h or p version.

Brown, C.S.

1989-01-01

181

Trial application of the worker safety assessment methodology  

SciTech Connect

A Worker Safety Assessment Methodology has been developed to assess the risks to workers from radiological accidents at non-reactor nuclear facilities. The methodology utilizes Process Hazards Analysis, proposed risk goals, and Quantitative Risk Analysis. The first phase of a trial application of the methodology to a nuclear facility has been completed and is reported here.

Marchese, A.R. [USDOE, Washington, DC (United States); Neogy, P. [Brookhaven National Lab., Upton, NY (United States)

1995-12-31

182

Radioactivity analysis following the Fukushima Dai-ichi nuclear accident.  

PubMed

A total of 118 samples were analyzed using HPGe γ-spectrometry. I-131, Cs-134, Cs-137 and Cs-136 were detected in aerosol air samples that were collected 22 days after the accident, with values of 1720 µBq/m³, 247 µBq/m³, 289 µBq/m³ and 23 µBq/m³, respectively. I-131 was detected in rainwater and soil samples and was also measurable in vegetables collected between April 2 and 13, 2011, with values ranging from 0.55 Bq/kg to 2.68 Bq/kg. No I-131 was detected in milk, drinking water, seawater or marine biota samples. PMID:23685724

Tuo, Fei; Xu, Cuihua; Zhang, Jing; Zhou, Qiang; Li, Wenhong; Zhao, Li; Zhang, Qing; Zhang, Jianfeng; Su, Xu

2013-08-01

183

Assessment of groundwater contamination resulting from a major accident in land nuclear power plants (LNPP), I: Concepts and methodology  

NASA Astrophysics Data System (ADS)

Hydrological site suitability is examined on the basis of potential groundwater pollution associated with major hypothetical accidents of reasonable probability. Loss of Coolant Accident (LOCA) is considered here as the Maximum Design Basis Event in nuclear power plants. Two alternative nuclide paths, resulting in groundwater contamination are considered: (a) core penetration through the basement, bringing possibly a major part of the nuclide inventory of the reactor into a direct contact with underlying groundwaters, or alternatively (b) major nuclide releases to the atmosphere, resulting in their wide spread as fallout, thus endangering the exploitability of underlying aquifers over large areas. These are referred to commonly as point-source and diffused-source contamination, respectively. Contamination analyses, related to the point-source scenario, are derived according to known analytical solutions of the convection-dispersion differential equation for absorbable and decaying species.
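As a hedged illustration of the kind of closed-form solution referred to above, the sketch below evaluates one standard 1-D analytical solution of the convection-dispersion equation for an instantaneous (slug) release with linear sorption and first-order decay; the governing equation is stated in the docstring, and all parameter values are hypothetical rather than taken from the paper, whose point-source formulation may differ.

# Sketch of a standard 1-D analytical solution of the convection-dispersion
# equation for an instantaneous (slug) release with linear sorption
# (retardation R) and first-order decay (lambda). Parameter values are
# hypothetical; the paper's own point-source formulation may differ.
import numpy as np

def slug_concentration(x, t, M_per_area, n, v, D, R, lam):
    """C(x,t) [kg/m^3] for mass per aquifer cross-section M_per_area [kg/m^2]
    released at x=0, t=0, governed by
        dC/dt = (D/R) d2C/dx2 - (v/R) dC/dx - lam*C
    with seepage velocity v, dispersion D, porosity n.
    """
    Dr, vr = D / R, v / R
    gauss = np.exp(-(x - vr * t) ** 2 / (4.0 * Dr * t)) / np.sqrt(4.0 * np.pi * Dr * t)
    return (M_per_area / (n * R)) * gauss * np.exp(-lam * t)

# Example: Sr-90 (half-life ~28.8 y) observed 500 m downgradient of the release.
lam = np.log(2) / (28.8 * 365.25 * 86400.0)      # decay constant [1/s]
t = np.array([1, 5, 10, 25]) * 365.25 * 86400.0  # 1, 5, 10, 25 years
c = slug_concentration(x=500.0, t=t, M_per_area=1e-3, n=0.3,
                       v=1e-5, D=1e-4, R=5.0, lam=lam)
for yr, ci in zip([1, 5, 10, 25], c):
    print(f"{yr:>3} y : C = {ci:.3e} kg/m^3")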

Mercado, Abraham

1989-12-01

184

Comparative analysis of EPA cost-benefit methodologies  

SciTech Connect

In recent years, reforming the regulatory process has received much attention from diverse groups such as environmentalists, the government, and industry. A cost-benefit analysis can be a useful way to organize and compare the favorable and unfavorable impacts a proposed action might have on society. Since 1981, two Executive Orders have required the U.S. Environmental Protection Agency (EPA) and other regulatory agencies to perform cost-benefit analyses in support of regulatory decision making. At the EPA, a cost-benefit analysis is published as a document called a regulatory impact analysis (RIA). This report reviews cost-benefit methodologies used by three EPA program offices: Office of Air and Radiation, Office of Solid Waste, and Office of Water. These offices were chosen because they promulgate regulations that affect the policies of this study's sponsor (U.S. Department of Energy, Office of Fossil Energy) and the technologies it uses. The study was conducted by reviewing 11 RIAs recently published by the three offices and by interviewing staff members in the offices. To draw conclusions about the EPA cost-benefit methodologies, their components were compared with those of a standard methodology (i.e., those that should be included in a comprehensive cost-benefit methodology). This study focused on the consistency of the approaches as well as their strengths and weaknesses, since differences in the cost-benefit methodologies themselves or in their application can cause confusion and preclude consistent comparison of regulations both within and among program offices.

Poch, L.; Gillette, J.; Veil, J.

1998-05-01

185

Quantitative analysis of ATM safety issues using retrospective accident data: The dynamic risk modelling project  

Microsoft Academic Search

The “dynamic risk modelling project” was a research activity aimed at developing a simulation approach able to provide a quantitative analysis of some critical activities of air traffic control (ATC) operators considering the organizational context in which they take place, the main cognitive processes underneath, and the possibility to inform the analysis using retrospective accident data. The pilot study presented in

Maria Chiara Leva; Massimiliano De Ambroggi; Daniela Grippa; Randall De Garis; Paolo Trucco; Oliver Sträter

2009-01-01

186

Atmospheric transport analysis used in hazard screening methodology  

SciTech Connect

Simple, but conservative, atmospheric transport models are used in the initial stages of a hazard screening methodology to determine a preliminary hazard rank. The hazard rank is one indicator of the additional effort, if any, that must be applied to determine if a system is safe. Simple methods avoid prolonged calculations at this early stage when details of potential accidents may be poorly defined. The models are used to simulate the consequences resulting from accidental releases of toxic substances. Instantaneous and constant-rate releases are considered. If a release takes place within a relatively small enclosure, the close-in transport is approximated by assuming the airborne material is instantaneously mixed with the volume of air within this enclosure. For all other situations and large distances, the transport is estimated with simple atmospheric dispersion models using published values of dispersion coefficients for large distances, and values based on turbulent diffusion theory for close-in distances. Consequences are assessed by defining exposure levels that are equivalent to negligible, reversible, and irreversible health effects. The hazard rank is related to the number and location of people within each category of health effects.
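The following minimal Python sketch illustrates the two bounding estimates described above: instantaneous mixing into an enclosure volume for close-in receptors, and a standard Gaussian plume for a constant-rate release at larger distances. The sigma-y and sigma-z power laws are rough illustrative values for a single stability class, not the published dispersion coefficients used in the methodology.

# Sketch of the two simple transport estimates described above: (1) instantaneous
# mixing of a release into an enclosure volume and (2) a standard Gaussian plume
# for a continuous release. The sigma-y / sigma-z power laws below are rough
# illustrative fits for one stability class, not the published coefficients.
import numpy as np

def enclosure_concentration(mass_released, room_volume):
    """Airborne concentration assuming instant, uniform mixing [kg/m^3]."""
    return mass_released / room_volume

def gaussian_plume(Q, u, x, y, z, H, a_y=0.08, a_z=0.06):
    """Ground-reflected Gaussian plume concentration [kg/m^3].
    Q: release rate [kg/s], u: wind speed [m/s], H: effective release height [m].
    """
    sig_y, sig_z = a_y * x ** 0.9, a_z * x ** 0.85     # illustrative power laws
    lateral = np.exp(-y ** 2 / (2 * sig_y ** 2))
    vertical = (np.exp(-(z - H) ** 2 / (2 * sig_z ** 2)) +
                np.exp(-(z + H) ** 2 / (2 * sig_z ** 2)))   # ground reflection
    return Q / (2 * np.pi * u * sig_y * sig_z) * lateral * vertical

print("close-in :", enclosure_concentration(0.5, 2000.0), "kg/m^3")
for x in (100.0, 500.0, 2000.0):
    c = gaussian_plume(Q=0.01, u=3.0, x=x, y=0.0, z=1.5, H=10.0)
    print(f"x = {x:6.0f} m : C = {c:.3e} kg/m^3")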

Bloom, S.G.

1992-06-29

187

Indonesian railway accidents--utilizing Human Factors Analysis and Classification System in determining potential contributing factors.  

PubMed

The prevalence of Indonesian railway accidents has not been declining, with hundreds of fatalities reported in the past decade. As an effort to help the National Transportation Safety Committee (NTSC), this study was conducted that aimed at understanding factors that might have contributed to the accidents. Human Factors Analysis and Classification System (HFACS) was utilized for this purpose. A total of nine accident reports (provided by the Indonesian NTSC) involving fatalities were studied using the technique. Results of this study indicated 72 factors that were closely related to the accidents. Of these, roughly 22% were considered as operator acts while about 39% were related to preconditions for operator acts. Supervisory represented 14% of the factors, and the remaining (about 25%) were associated with organizational factors. It was concluded that, while train drivers indeed played an important role in the accidents, interventions solely directed toward train drivers may not be adequate. A more comprehensive approach in minimizing the accidents should be conducted that addresses all the four aspects of HFACS. PMID:22317372

Iridiastadi, Hardianto; Ikatrinasari, Zulfa Fitri

2012-01-01

188

Rail transportation risk and accident severity: A statistical analysis of variables in FRA's accident/incident data base

Microsoft Academic Search

The Federal Railroad Administration (US DOT) maintains a file of carrier-reported railroad accidents and incidents that meet stipulated threshold criteria for damage cost and/or casualties. A thoroughly-cleaned five-year time series of this data base was subjected to unbiased statistical procedures to discover (a) important causative variables in severe (high damage cost) accidents and (b) other key relationships between objective accident

C. L. Saricks; I. Janssen

1991-01-01

189

Application of Latin hypercube sampling to RADTRAN 4 truck accident risk sensitivity analysis  

SciTech Connect

The sensitivity of calculated dose estimates to various RADTRAN 4 inputs is an available output for incident-free analysis because the defining equations are linear and sensitivity to each variable can be calculated in closed mathematical form. However, the necessary linearity is not characteristic of the equations used in calculation of accident dose risk, making a similar tabulation of sensitivity for RADTRAN 4 accident analysis impossible. Therefore, a study of sensitivity of accident risk results to variation of input parameters was performed using representative routes, isotopic inventories, and packagings. It was determined that, of the approximately two dozen RADTRAN 4 input parameters pertinent to accident analysis, only a subset of five or six has significant influence on typical analyses or is subject to random uncertainties. These five or six variables were selected as candidates for Latin Hypercube Sampling applications. To make the effect of input uncertainties on calculated accident risk more explicit, distributions and limits were determined for two variables which had approximately proportional effects on calculated doses: Pasquill Category probability (PSPROB) and link population density (LPOPD). These distributions and limits were used as input parameters to Sandia's Latin Hypercube Sampling code to generate 50 sets of RADTRAN 4 input parameters used together with point estimates of other necessary inputs to calculate 50 observations of estimated accident dose risk. Tabulations of the RADTRAN 4 accident risk input variables and their influence on output plus illustrative examples of the LHS calculations, for truck transport situations that are typical of past experience, will be presented.
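As a minimal illustration of the sampling step described above (and not of RADTRAN 4 itself), the sketch below draws a 50-observation Latin hypercube sample of two uncertain inputs and propagates it through a placeholder risk function; the distributions, bounds, and the stand-in risk function are hypothetical.

# Minimal Latin hypercube sampling sketch for two uncertain inputs, in the spirit
# of the study above. The assumed distributions and the stand-in risk function
# (a placeholder for RADTRAN 4) are hypothetical.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(42)
n = 50  # number of LHS observations, as in the study

def lhs_uniform(n, rng):
    """One stratified-and-shuffled column of U(0,1) values (one per stratum)."""
    return rng.permutation((np.arange(n) + rng.uniform(size=n)) / n)

# Pasquill stability-category probability: uniform between assumed bounds.
pasquill_prob = 0.1 + (0.4 - 0.1) * lhs_uniform(n, rng)
# Link population density [people/km^2]: lognormal via inverse CDF of the strata.
pop_density = lognorm.ppf(lhs_uniform(n, rng), s=0.8, scale=300.0)

def accident_dose_risk(p_stab, rho_pop):
    """Placeholder risk model: proportional to both inputs (notional units)."""
    return 2.0e-6 * p_stab * rho_pop

risk = accident_dose_risk(pasquill_prob, pop_density)
print(f"mean risk = {risk.mean():.2e}, 5th-95th pct = "
      f"{np.percentile(risk, 5):.2e} .. {np.percentile(risk, 95):.2e}")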

Mills, G.S.; Neuhauser, K.S.; Kanipe, F.L.

1994-12-31

190

Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.  

PubMed

Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

2006-09-01

191

Fission product transport analysis in a loss of decay heat removal accident at Browns Ferry  

SciTech Connect

This paper summarizes an analysis of the movement of noble gases, iodine, and cesium fission products within the Mark-I containment BWR reactor system represented by Browns Ferry Unit 1 during a postulated accident sequence initiated by a loss of decay heat removal (DHR) capability following a scram. The event analysis showed that this accident could be brought under control by various means, but the sequence with no operator action ultimately leads to containment (drywell) failure followed by loss of water from the reactor vessel, core degradation due to overheating, and reactor vessel failure with attendant movement of core debris onto the drywell floor.

Wichner, R.P.; Weber, C.F.; Hodge, S.A.; Beahm, E.C.; Wright, A.L.

1984-01-01

192

Transfinite element methodology towards a unified thermal/structural analysis  

NASA Technical Reports Server (NTRS)

The paper describes computational developments towards thermal/structural modeling and analysis via a generalized common numerical methodology for effectively and efficiently interfacing interdisciplinary areas. The proposed formulations use transform methods in conjunction with finite element developments for each of the heat transfer and structural disciplines, respectively, providing avenues for obtaining the structural response due to thermal effects. An alternative methodology for unified thermal/structural analysis is presented. The potential of the approach is outlined in comparison with conventional schemes and existing practices. Highlights and characteristic features of the approach are described via general formulations and applications to several problems. The results obtained are accurate and efficient and demonstrate excellent agreement with analytic and/or conventional finite element solutions.

Tamma, K. K.; Railkar, S. B.

1986-01-01

193

Content Analysis—A Methodological Primer for Gender Research  

Microsoft Academic Search

This article is intended to serve as a primer on methodological standards for gender scholars pursuing content analytic research. The scientific underpinnings of the method are explored, including the roles of theory, past research, population definition, objectivity/intersubjectivity, reliability, validity, generalizability, and replicability. Both human coding and computer coding are considered. The typical process of human-coded content analysis is reviewed, including

Kimberly A. Neuendorf

2011-01-01

194

Methodology study on event-tree analysis for ULOF sequences considering passive safety features.  

National Technical Information Service (NTIS)

In order to establish a method of probabilistic safety analysis for passive safety features, the event-tree (E/T) of ULOF accident sequences in the early stage of accident progression was constructed for a 600 MWe LMFBR model plant equipped with passive ...

T. Mihara H. Niwa

1998-01-01

195

Risk analysis of releases from accidents during mid-loop operation at Surry  

SciTech Connect

Studies and operating experience suggest that the risk of severe accidents during low power operation and/or shutdown (LP/S) conditions could be a significant fraction of the risk at full power operation. Two studies have begun at the Nuclear Regulatory Commission (NRC) to evaluate the severe accident progression from a risk perspective during these conditions: One at the Brookhaven National Laboratory for the Surry plant, a pressurized water reactor (PWR), and the other at the Sandia National Laboratories for the Grand Gulf plant, a boiling water reactor (BWR). Each of the studies consists of three linked, but distinct, components: a Level I probabilistic risk analysis (PRA) of the initiating events, systems analysis, and accident sequences leading to core damage; a Level 2/3 analysis of accident progression, fuel damage, releases, containment performance, source term and consequences-off-site and on-site; and a detailed Human Reliability Analysis (HRA) of actions relevant to plant conditions during LP/S operations. This paper summarizes the approach taken for the Level 2/3 analysis at Surry and provides preliminary results on the risk of releases and consequences for one plant operating state, mid-loop operation, during shutdown.

Jo, J.; Lin, C.C.; Nimnual, S.; Mubayi, V.; Neymotin, L.

1992-11-01

196

Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the "Maximum Credible Accident" concept

SciTech Connect

The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactivities of uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.

Ricci, E.; McLean, R.B.

1988-09-01

197

Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components  

NASA Technical Reports Server (NTRS)

The NASA Aircraft Structural Integrity (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack tip opening angle (CTOA) fracture criterion to characterize the fracture behavior and a material and a geometric nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thickness of 0.04 inches to 0.09 inches. The critical CTOA and the corresponding plane strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane stress analysis, were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch diameter pressurized cylinders, and full scale curved stiffened panels subjected to internal pressure and mechanical loads.

Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.

2000-01-01

198

Development of test methodology for dynamic mechanical analysis instrumentation  

NASA Astrophysics Data System (ADS)

Dynamic mechanical analysis instrumentation was used to develop specific test methodology for the determination of engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperature in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiber-glass supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

Allen, V. R.

1982-08-01

199

Sensitivity analysis of the rod ejection accident for the Beznau reactor.  

National Technical Information Service (NTIS)

The rod ejection accident (REA) of the Beznau (KKB-2) nuclear power plant was investigated. The REA analysis was performed using the RETRAN-02 computer code. Four basic cases were investigated for the cycle 16 conditions. At the beginning-of-life (BOL) th...

D. Saphier M. A. Zimmermann E. Knoglinger P. Jacquemoud

1990-01-01

200

Vapor explosions: A review of experiments for accident analysis  

SciTech Connect

A vapor explosion is a physical event in which a hot liquid (fuel) transfers its internal energy to a colder, more volatile liquid (coolant); thus the coolant vaporizes at high pressures and expands, doing work on its surroundings. In postulated severe accidents in current fission reactors, vapor explosions are considered if this molten "fuel" contacts residual water in-vessel or ex-vessel, because these physical explosions have the potential to contribute to reactor vessel failure and possibly containment failure and release of radioactive fission products. Current safety analyses and probabilistic studies consider this process with the use of explosion models. Eventually these models must be compared with available experimental data to determine their validity. This study provides a comprehensive review of vapor explosion experiments for eventual use in such comparisons. Also, when there are insufficient data, experiments are suggested that can provide the needed information for future comparisons. This review may be useful for light-water-reactor as well as noncommercial reactor safety studies. 115 refs., 6 figs., 3 tabs.

Corradini, M.L.; Taleyarkhan, R.P.

1991-07-01

201

Analysis of fission product release behavior during the TMI-2 accident  

SciTech Connect

An analysis of fission product release during the Three Mile Island Unit 2 (TMI-2) accident has been initiated to provide an understanding of fission product behavior that is consistent with both the best estimate accident scenario and fission product results from the ongoing sample acquisition and examination efforts. "First principles" fission product release models are used to describe release from intact, disrupted, and molten fuel. Conclusions relating to fission product release, transport, and chemical form are drawn. 35 refs., 12 figs., 7 tabs.

Petti, D. A.; Adams, J. P.; Anderson, J. L.; Hobbins, R. R.

1987-01-01

202

Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation  

PubMed Central

The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications.
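The confidence-interval calculation underlying a maximum sampling error of this kind can be sketched in a few lines; the replicate ash-content values below are invented, and the Student-t half-width shown is only one common way to express such an error, not necessarily the exact procedure of the study.

# Sketch of a confidence-interval / maximum-sampling-error calculation for
# replicate thermogravimetric determinations. The replicate ash-content values
# below are invented for illustration.
import numpy as np
from scipy import stats

ash = np.array([5.1, 4.8, 5.3, 5.0, 4.9, 5.2])   # % ash from n replicate TG runs
n = ash.size
mean, s = ash.mean(), ash.std(ddof=1)

conf = 0.95
t_crit = stats.t.ppf(1 - (1 - conf) / 2, df=n - 1)
max_sampling_error = t_crit * s / np.sqrt(n)       # half-width of the CI

print(f"ash = {mean:.2f} +/- {max_sampling_error:.2f} % "
      f"({conf:.0%} CI: {mean - max_sampling_error:.2f} .. {mean + max_sampling_error:.2f})")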

Pazo, Jose A.; Granada, Enrique; Saavedra, Angeles; Eguia, Pablo; Collazo, Joaquin

2010-01-01

203

Safety analysis report for the Galileo Mission. Volume 2, book 2: Accident model document, Appendices  

NASA Astrophysics Data System (ADS)

This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report 2, Volume 2. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

1988-12-01

204

Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices  

SciTech Connect

This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

Not Available

1988-12-15

205

Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing  

NASA Technical Reports Server (NTRS)

Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.
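As a minimal sketch of how worst-case precursor combinations can be tallied from accident records (not the authors' actual dataset or consensus process), the snippet below counts co-occurring precursor pairs across a few invented records.

# Minimal sketch of counting which combinations of causal/contributing factors
# (precursors) co-occur most often across accident records, in the spirit of the
# analysis above. The records and precursor labels are invented.
from collections import Counter
from itertools import combinations

records = [
    {"system failure", "crew action", "vehicle upset"},
    {"inclement weather", "crew action", "vehicle upset"},
    {"system failure", "vehicle impairment", "vehicle upset"},
    {"poor visibility", "crew action"},
    {"system failure", "crew action", "vehicle upset"},
]

pair_counts = Counter()
for rec in records:
    pair_counts.update(combinations(sorted(rec), 2))

for pair, count in pair_counts.most_common(3):
    print(f"{count}x  {pair[0]} + {pair[1]}")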

Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

2014-01-01

206

Statistical Analysis of Death Accidents in Coal Mines from January 2005 to June 2009

Microsoft Academic Search

In this paper, death accidents in coal mines from January 2005 to June 2009 were compiled and analyzed using mathematical-statistical methods. By examining aspects such as accident type, the timing of accidents, provincial statistics, and accident causes, the paper analyzed the characteristics and patterns of coal mine accidents and introduced the particular nature and damage of coal mine accidents, the

Liu Xiaoli; Guo Liwen; Zhang Zhiye

2010-01-01

207

Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 1: Sections 1-9  

SciTech Connect

This report documents the methodology, computational framework, and results of facility accident analyses performed for the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. The methodology is in compliance with the most recent guidance from DOE. It considers the spectrum of accident sequences that could occur in activities covered by the WM PEIS and uses a graded approach emphasizing the risk-dominant scenarios to facilitate discrimination among the various WM PEIS alternatives. Although it allows reasonable estimates of the risk impacts associated with each alternative, the main goal of the accident analysis methodology is to allow reliable estimates of the relative risks among the alternatives. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J. [and others

1995-04-01

208

Defect analysis methodology for contact hole grapho epitaxy DSA  

NASA Astrophysics Data System (ADS)

Next-generation lithography technology is required to meet the needs of advanced design nodes. Directed Self Assembly (DSA) is gaining momentum as an alternative or complementary technology to EUV lithography. We investigate defectivity on a 2xnm patterning of contacts for 25nm or less contact hole assembly by grapho epitaxy DSA technology with guide patterns printed using immersion ArF negative tone development. This paper discusses the development of an analysis methodology for DSA with optical wafer inspection, based on defect source identification, sampling and filtering methods supporting process development efficiency of DSA processes and tools.

Harukawa, Ryota; Aoki, Masami; Cross, Andrew; Nagaswami, Venkat; Kawakami, Shinichiro; Yamauchi, Takashi; Tomita, Tadatoshi; Nagahara, Seiji; Muramatsu, Makoto; Kitano, Takahiro

2014-04-01

209

SOCRAT: The System of Codes for Realistic Analysis of Severe Accidents  

SciTech Connect

A computer code for the analysis of severe accidents has been under development in the Russian Federation for a long time. The main feature distinguishing this code from known severe accident codes such as MELCOR and ASTEC is a consistent realization of the mechanistic approach to modeling melt progression processes, including beyond-design-basis accidents with severe core damage. The motivation for the development is defined by the new design requirements on the safety of nuclear power plants with improved economic factors, by the modernization of existing NPPs, and by the development of guidance for accident management and emergency planning. Realistic assessments of nuclear power plant safety require the use of best-estimate codes that describe the melt progression processes accompanying a severe accident at a nuclear installation and the behavior of the containment under abnormal conditions (in particular, the rates of steam and hydrogen release and the relocation of molten materials to the concrete cavity after failure of the reactor vessel). The developed computer codes were used for the safety justification of NPPs with the new generation of VVER-type reactors, such as the Tianwan NPP in China and the Kudankulam NPP in India. In particular, this code system was used to justify the hydrogen safety system and to analyze core degradation and relocation of the molten core to the core catcher used for guaranteed localization of the melt and prevention of ex-vessel melt progression. The system of codes, recently named SOCRAT, provides a self-consistent analysis of in-vessel processes and of processes running in the containment, including the melt localization device. In the paper the structure of the computer code SOCRAT is presented, the functionality of the separate parts of the code is described, and results of verification of different models of the code are also considered. (authors)

Bolshov, Leonid; Strizhov, Valery [Nuclear Safety Institute, Russian Academy of Sciences, B. Tulskaya, 52 Moscow, 115191 (Russian Federation)

2006-07-01

210

SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES  

SciTech Connect

Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

Coutts, D

2007-04-17

211

Formal Analysis of an Airplane Accident in N{?}-Labeled Calculus  

NASA Astrophysics Data System (ADS)

N{?}-labeled calculus is a formal system for representation, verification and analysis of time-concerned recognition, knowledge, belief and decision of humans or computer programs, together with related external physical or logical phenomena. In this paper, a formal verification and analysis of the JAL near miss accident is presented as an example of cooperating systems controlling continuously changing objects, including human factors with misunderstanding or incorrect recognition.

Mizutani, Tetsuya; Igarashi, Shigeru; Ikeda, Yasuwo; Shio, Masayuki

212

SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident  

NASA Astrophysics Data System (ADS)

On March 11th 2011 a high magnitude earthquake and consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in duration and extent. After scram was initiated at all power stations affected by the earthquake, diesel generators began operation as designed until the tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to station blackout conditions in units 1 to 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules to account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools and we demonstrate how this work can be beneficial to nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study for the alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.

Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

2014-06-01

213

Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
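A minimal sketch of the Latin hypercube sampling plus rank-regression workflow named above is given below, using an invented three-input stand-in for the food-pathway dose model; the stopping threshold and coefficients are arbitrary, and the real study involves 87 inputs and a far richer set of consequence measures.

# Minimal sketch of the LHS / rank-regression sensitivity workflow used above,
# with a hypothetical three-input stand-in for the food-pathway dose model.
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(7)
n, names = 200, ["soil_to_plant_transfer", "deposition_retention", "disposal_threshold"]

def lhs(n, k, rng):
    u = (np.arange(n)[:, None] + rng.uniform(size=(n, k))) / n
    for j in range(k):
        u[:, j] = rng.permutation(u[:, j])
    return u

X = lhs(n, 3, rng)                                   # inputs scaled to (0,1)
dose = 5.0 * X[:, 0] + 2.0 * X[:, 0] * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.1, n)

# Rank-transform, then do a simple forward stepwise regression on the ranks.
Xr, yr = rankdata(X, axis=0), rankdata(dose)
selected, remaining = [], list(range(3))
while remaining:
    r2 = []
    for j in remaining:
        cols = np.column_stack([np.ones(n)] + [Xr[:, i] for i in selected + [j]])
        resid = yr - cols @ np.linalg.lstsq(cols, yr, rcond=None)[0]
        r2.append(1 - resid.var() / yr.var())
    best = remaining[int(np.argmax(r2))]
    if selected and max(r2) - prev_r2 < 0.02:        # stop when the gain is small
        break
    prev_r2 = max(r2)
    selected.append(best)
    remaining.remove(best)

print("variables entered, in order:", [names[j] for j in selected], f"R^2 = {prev_r2:.2f}")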

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D.; Rollstin, J.A. [GRAM, Inc., Albuquerque, NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

214

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; J. Boardman; J. A. Jones; F. T. Harper; M. L. Young; S. C. Hora

1997-01-01

215

A methodology for reliability analysis in health networks.  

PubMed

A reliability model for the health care domain, based on requirement analysis at the early stage of design of a regional health network (RHN), is introduced. RHNs are considered as systems supporting the services provided by health units, hospitals, and the regional authority. Reliability assessment in the health care domain constitutes a field of quality assessment for RHNs. A novel approach for predicting system reliability in the early stage of designing RHN systems is presented in this paper. The primary aim is to identify the critical processes of an RHN system prior to its implementation. In the methodology, Unified Modeling Language activity diagrams are used to identify megaprocesses at the regional level and the customer behavior model graph (CBMG) to describe the state transitions of the processes. The CBMG is annotated with: 1) the reliability of each component state and 2) the transition probabilities between states within the scope of the life cycle of the process. A stochastic reliability model (Markov model) is applied to predict the reliability of the business process as well as to identify the critical states and compare them with other processes to reveal the most critical ones. The ultimate benefit of the applied methodology is the design of more reliable components in an RHN system. The innovation of the approach to reliability modeling lies in the analysis of severity classes of failures and the application of stochastic modeling using discrete-time Markov chains in RHNs. PMID:18693505
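The discrete-time Markov chain step described above can be sketched with a small absorbing-chain calculation; the state graph, transition probabilities, and state names below are invented and serve only to show how completion probability and expected state visits are obtained.

# Minimal sketch of the discrete-time Markov chain step described above: given
# transition probabilities between process states (as in a CBMG), compute the
# probability of ending in the "completed" versus "failed" absorbing state.
# The small state graph below is invented.
import numpy as np

states = ["login", "retrieve record", "update record", "completed", "failed"]
# Rows/columns follow `states`; each transient state has a small failure branch.
P = np.array([
    [0.00, 0.97, 0.00, 0.00, 0.03],   # login
    [0.00, 0.00, 0.95, 0.03, 0.02],   # retrieve record
    [0.00, 0.05, 0.00, 0.91, 0.04],   # update record (may loop back to retrieve)
    [0.00, 0.00, 0.00, 1.00, 0.00],   # completed (absorbing)
    [0.00, 0.00, 0.00, 0.00, 1.00],   # failed (absorbing)
])

transient, absorbing = slice(0, 3), slice(3, 5)
Q, R = P[transient, transient], P[transient, absorbing]
N = np.linalg.inv(np.eye(3) - Q)           # fundamental matrix: expected visits
B = N @ R                                  # absorption probabilities

print("P(complete | start at login) =", round(B[0, 0], 4))
print("expected visits per state    =", np.round(N[0], 2))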

Spyrou, Stergiani; Bamidis, Panagiotis D; Maglaveras, Nicos; Pangalos, George; Pappas, Costas

2008-05-01

216

Overview of Sandia National Laboratories and Khlopin Radium Institute collaborative radiological accident consequence analysis efforts  

SciTech Connect

In January, 1995 a collaborative effort to improve radiological consequence analysis methods and tools was initiated between the V.G. Khlopin Institute (KRI) and Sandia National Laboratories (SNL). The purpose of the collaborative effort was to transfer SNL's consequence analysis methods to KRI and identify opportunities for collaborative efforts to solve mutual problems relating to the safety of radiochemical facilities. A second purpose was to improve SNL's consequence analysis methods by incorporating the radiological accident field experience of KRI scientists (e.g. the Chernobyl and Kyshtym accidents). The initial collaborative effort focused on the identification of: safety criteria that radiochemical facilities in Russia must meet; analyses/measures required to demonstrate that safety criteria have been met; and data required to complete the analyses/measures identified to demonstrate the safety basis of a facility.

Young, M.L.; Carlson, D.D. [Sandia National Labs., Albuquerque, NM (United States); Lazarev, L.N.; Petrov, B.F.; Romanovskiy, V.N. [V.G. Khlopin Radium Inst., St. Petersburg (Russian Federation)

1997-05-01

217

Risk analysis of catastrophes using experts' judgements: An empirical study on risk analysis of major civil aircraft accidents in Europe  

Microsoft Academic Search

We present an empirical study performed on risk analysis of major civil aircraft accidents in Europe by using expert judgements. The main goal is to investigate in practice some theoretical models which have been constructed previously. These models calibrate and aggregate judgements of the experts in order to predict the future risks. The statistical tools which are used in this

Elizabeth Saers Bigün

1995-01-01

218

A flammability and combustion model for integrated accident analysis. [Advanced light water reactors]

SciTech Connect

A model for flammability characteristics and combustion of hydrogen and carbon monoxide mixtures is presented for application to severe accident analysis of Advanced Light Water Reactors (ALWR's). Flammability of general mixtures for thermodynamic conditions anticipated during a severe accident is quantified with a new correlation technique applied to data for several fuel and inertant mixtures and using accepted methods for combining these data. Combustion behavior is quantified by a mechanistic model consisting of a continuity and momentum balance for the burned gases, and considering an uncertainty parameter to match the idealized process to experiment. Benchmarks against experiment demonstrate the validity of this approach for a single recommended value of the flame flux multiplier parameter. The models presented here are equally applicable to analysis of current LWR's. 21 refs., 16 figs., 6 tabs.
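As a rough illustration of how flammability limits of hydrogen/carbon monoxide blends can be combined (the paper develops its own correlation, which this does not reproduce), the sketch below applies the classical Le Chatelier mixing rule with approximate ambient-condition limit values from the general literature.

# Sketch of the classical Le Chatelier mixing rule for the lower flammability
# limit (LFL) of a hydrogen / carbon monoxide fuel blend. This is a standard
# combining method, not necessarily the correlation developed in the paper;
# the single-component LFLs are approximate ambient-condition literature values.
def le_chatelier_lfl(fuel_fractions, pure_lfls):
    """LFL of a fuel blend [vol% in air]; fuel_fractions must sum to 1."""
    return 1.0 / sum(y / lfl for y, lfl in zip(fuel_fractions, pure_lfls))

lfl_h2, lfl_co = 4.0, 12.5          # vol% in air, approximate
for y_h2 in (1.0, 0.7, 0.5, 0.3):
    lfl_mix = le_chatelier_lfl((y_h2, 1.0 - y_h2), (lfl_h2, lfl_co))
    print(f"H2 fraction {y_h2:.1f} -> mixture LFL ~ {lfl_mix:.1f} vol%")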

Plys, M.G.; Astleford, R.D.; Epstein, M. (Fauske and Associates, Inc., Burr Ridge, IL (USA))

1988-01-01

219

A Methodology to Model Human and Organisational Errors on Offshore Risk Analysis  

Microsoft Academic Search

This paper aims to contribute to offshore risk analysis by proposing a methodology to model causal relationships focusing on human and organisational errors. The proposed methodology uses James Reason's

Jun Ren; Jin Wang; Ian Jenkinson; Dongling Xu; Jianbo Yang

2006-01-01

220

Human and organisational factors in maritime accidents: analysis of collisions at sea using the HFACS.  

PubMed

Over the last decade, the shipping industry has implemented a number of measures aimed at improving its safety level (such as new regulations or new forms of team training). Despite this evolution, shipping accidents, and particularly collisions, remain a major concern. This paper presents a modified version of the Human Factors Analysis and Classification System, which has been adapted to the maritime context and used to analyse human and organisational factors in collisions reported by the Marine Accident and Investigation Branch (UK) and the Transportation Safety Board (Canada). The analysis shows that most collisions are due to decision errors. At the precondition level, it highlights the importance of the following factors: poor visibility and misuse of instruments (environmental factors), loss of situation awareness or deficit of attention (conditions of operators), deficits in inter-ship communications or Bridge Resource Management (personnel factors). At the leadership level, the analysis reveals the frequent planning of inappropriate operations and non-compliance with the Safety Management System (SMS). The Multiple Accident Analysis provides an important finding concerning three classes of accidents. Inter-ship communications problems and Bridge Resource Management deficiencies are closely linked to collisions occurring in restricted waters and involving pilot-carrying vessels. Another class of collisions is associated with situations of poor visibility, in open sea, and shows deficiencies at every level of the socio-technical system (technical environment, condition of operators, leadership level, and organisational level). The third class is characterised by non-compliance with the SMS. This study shows the importance of Bridge Resource Management for situations of navigation with a pilot on board in restricted waters. It also points out the necessity to investigate, for situations of navigation in open sea, the masters' decisions in critical conditions as well as the causes of non-compliance with SMS. PMID:23764875

Chauvin, Christine; Lardjane, Salim; Morel, Gaël; Clostermann, Jean-Pierre; Langard, Benoît

2013-10-01

221

ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS  

Microsoft Academic Search

This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, "K Basins Safety Analysis Report" and Revision 4 of HNF-SD-SNF-TSR-001, "Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins". These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94,

2003-01-01

222

Analysis of loss-of-coolant accidents in the High-Flux Isotope Reactor  

Microsoft Academic Search

Loss-of-coolant accident analyses have been completed for the High-Flux Isotope Reactor safety analysis report. More than 100 simulations have been performed using the RELAP5\\/MOD2.5 computer program. The RELAP5 input model used for the simulations is quite detailed, including 17 parallel channels in the core region, the three active heat exchanger cells, the pressurizing system, and the secondary cooling system. Special

M. W. Wendel; D. G. Morris; P. T. Williams

1996-01-01

223

Forensic mtDNA hair analysis excludes a dog from having caused a traffic accident  

Microsoft Academic Search

A dog was suspected of having caused a traffic accident. Three hair fragments were recovered from the damaged car and subjected to DNA sequence analysis of the canine mitochondrial D-loop control region. The results were compared to saliva and hair samples from the alleged dog, as well as to control hair samples from four unrelated dogs of different breeds. Two

P. M. Schneider; Y. Seo; C. Rittner

1999-01-01

224

Uncertainty Analysis of Accident Notification Time and Emergency Medical Service Response Time in Work Zone Traffic Accidents  

Microsoft Academic Search

Taking into account the uncertainty caused by exogenous factors, the accident notification time (ANT) and emergency medical service (EMS) response time are modeled as two random variables following the lognormal distribution. Their mean values and standard deviations are respectively formulated as functions of environmental variables including crash time, road type, weekend, holiday, light condition, weather and work zone type.
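
The record above describes the core modeling idea: notification and response times are lognormal, with mean and standard deviation written as functions of environmental covariates. A minimal Python sketch of that idea follows; the covariates, coefficient values, and helper function are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch: accident notification time (ANT) modeled as a lognormal
# random variable whose mean and standard deviation depend on environmental
# covariates (e.g. night-time, weekend, work-zone type). All coefficients are
# made up; the paper estimates such parameters from crash data.
import numpy as np

rng = np.random.default_rng(42)

def lognormal_params(mean, std):
    """Convert the desired mean/std of the lognormal variable itself
    into the (mu, sigma) parameters of the underlying normal."""
    sigma2 = np.log(1.0 + (std / mean) ** 2)
    mu = np.log(mean) - 0.5 * sigma2
    return mu, np.sqrt(sigma2)

def simulate_ant(night, weekend, work_zone_major, n=10_000):
    # Mean/std (minutes) expressed as linear functions of indicator covariates.
    mean = 8.0 + 4.0 * night + 1.5 * weekend + 2.0 * work_zone_major
    std = 5.0 + 2.5 * night
    mu, sigma = lognormal_params(mean, std)
    return rng.lognormal(mu, sigma, size=n)

ant_day = simulate_ant(night=0, weekend=0, work_zone_major=0)
ant_night = simulate_ant(night=1, weekend=0, work_zone_major=1)
print(f"median ANT, day  : {np.median(ant_day):5.1f} min")
print(f"median ANT, night: {np.median(ant_night):5.1f} min")
```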

Qiang Meng; Jinxian Weng

2012-01-01

225

Human reliability data, human error and accident models—illustration through the Three Mile Island accident analysis  

Microsoft Academic Search

Our first objective is to provide a panorama of Human Reliability data used in EDF's Safety Probabilistic Studies, and then, since these concepts are at the heart of Human Reliability and its methods, to go over the notion of human error and the understanding of accidents. We are not sure today that it is actually possible to provide in this

Pierre Le Bot

2004-01-01

226

Framatome-ANP France UO2 fuel fabrication - criticality safety analysis in the light of the 1999 Tokai Mura accident  

SciTech Connect

In France, the 1999 Tokai Mura criticality accident in Japan had a major impact on the nuclear fuel manufacturing community. Moreover, this accident led to a broad public discussion about all nuclear facilities. The French Safety Authorities required industry to completely revisit its safety analysis files, mainly those concerning nuclear fuel treatment. The Framatome-ANP French low-enriched (5 w/o) UO2 fuel fabrication plant (FBFC/Romans) produces more than 1000 metric tons a year. Special attention was given to the emergency evacuation plan that should be followed in case of a criticality accident. If a criticality accident happens, site internal and external radioprotection requirements call for an emergency evacuation plan showing the routes along which the absorbed doses will be as low as possible. The French Safety Authorities also require an update of the old neutron source term basis to account for state-of-the-art methodology. UO2 blender units contain a large amount of dry powder strictly controlled by moderation; a hypothetical water leakage inside one of these apparatuses is simulated by increasing the water content of the powder. The resulting reactivity insertion is evaluated by several static calculations. The French IRSN/CEA CRISTAL codes are used to perform these static calculations. The kinetic criticality code POWDER simulates the power excursion versus time and determines the consequent total energy source term. MCNP4B performs the source term propagation (including neutrons and gamma) used to determine the isodose curves needed to define the emergency evacuation plan. This paper deals with the approach Framatome-ANP has taken to address the Safety Authorities' demands using the most up-to-date calculation tools and methodology. (authors)

Doucet, M.; Zheng, S. [Framatome-ANP Fuel Technology Service (France); Mouton, J.; Porte, R. [Framatome-ANP Fuel Fabrication Plant - FBFC (France)

2004-07-01

227

Development and application of proton NMR methodology to lipoprotein analysis  

NASA Astrophysics Data System (ADS)

The present thesis describes the development of 1H NMR spectroscopy and its applications to lipoprotein analysis in vitro, utilizing biochemical prior knowledge and advanced lineshape fitting analysis in the frequency domain. A method for absolute quantification of lipoprotein lipids and proteins directly from the terminal methyl-CH3 resonance region of 1H NMR spectra of human blood plasma is described. Then the use of NMR methodology in time course studies of the oxidation process of LDL particles is presented. The function of the cholesteryl ester transfer protein (CETP) in lipoprotein mixtures was also assessed by 1H NMR, which allows for dynamic follow-up of the lipid transfer reactions between VLDL, LDL, and HDL particles. The results corroborated the suggestion that neutral lipid mass transfer among lipoproteins is not an equimolar heteroexchange. A novel method for studying lipoprotein particle fusion is also demonstrated. It is shown that the progression of proteolytically (α-chymotrypsin) induced fusion of LDL particles can be followed by 1H NMR spectroscopy and, moreover, that fusion can be distinguished from aggregation. In addition, NMR methodology was used to study the changes in HDL3 particles induced by phospholipid transfer protein (PLTP) in HDL3 + PLTP mixtures. The 1H NMR study revealed a gradual production of enlarged HDL particles, which demonstrated that PLTP-mediated remodeling of HDL involves fusion of the HDL particles. These applications demonstrated that the 1H NMR approach offers several advantages both in quantification and in time course studies of lipoprotein-lipoprotein interactions and of enzyme/lipid transfer protein function.

Korhonen, Ari Juhani

1998-11-01

228

Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.
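
As a concrete illustration of the workflow this abstract describes (Latin hypercube sampling of imprecisely known inputs, propagation through the consequence model, and regression-based ranking of their importance), the sketch below applies the same steps to a made-up four-parameter surrogate. The variable names echo some of the inputs listed above, but the surrogate model, ranges, and coefficients are purely illustrative and have no connection to MACCS.

```python
# Sketch of a sampling-based uncertainty/sensitivity workflow: Latin hypercube
# sampling of uncertain inputs, propagation through a consequence model, and
# regression-based ranking of input importance. The "consequence model" here
# is a made-up stand-in, not MACCS.
import numpy as np
from scipy.stats import qmc

names = ["horiz_dispersion", "dry_dep_velocity", "inhalation_pf", "shielding_f"]
lower = np.array([0.5, 0.1, 0.3, 0.4])
upper = np.array([2.0, 2.0, 1.0, 1.0])

sampler = qmc.LatinHypercube(d=len(names), seed=1)
X = qmc.scale(sampler.random(n=200), lower, upper)   # 200 LHS samples

def toy_consequence(x):
    # Hypothetical surrogate: dose roughly inversely proportional to dispersion,
    # increasing with deposition velocity, decreasing with protection factors.
    return 100.0 * x[1] / (x[0] * x[2] * x[3])

y = np.apply_along_axis(toy_consequence, 1, X)

# Standardized regression coefficients as a simple importance measure
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in sorted(zip(names, coef), key=lambda t: -abs(t[1])):
    print(f"{name:18s} SRC = {c:+.2f}")
```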

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D. [GRAM, Inc., Albuquerque, NM (United States); McKay, M.D. [Los Alamos National Lab., NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

229

Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of 1-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D.; Rollstin, J.A. [Gram, Inc., Albuquerque, NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

230

Space Shuttle Columbia Post-Accident Analysis and Investigation  

NASA Technical Reports Server (NTRS)

Although the loss of the Space Shuttle Columbia and its crew was tragic, the circumstances offered a unique opportunity to examine a multitude of components which had experienced one of the harshest environments ever encountered by engineered materials: a break up at a velocity in excess of Mach 18 and an altitude exceeding 200,000 feet (63 km), resulting in a debris field 645 miles/1,038 km long and 10 miles/16 km wide. Various analytical tools were employed to ascertain the sequence of events leading to the disintegration of the Orbiter and to characterize the features of the debris. The testing and analyses all indicated that a breach in a left wing reinforced carbon/carbon composite leading edge panel was the access point for hot gases generated during re-entry to penetrate the structure of the vehicle and compromise the integrity of the materials and components in that area of the Shuttle. The analytical and elemental testing utilized such techniques as X-Ray Diffraction (XRD), Energy Dispersive X-Ray (EDX) dot mapping, Electron Micro Probe Analysis (EMPA), and X-Ray Photoelectron Spectroscopy (XPS) to characterize the deposition of intermetallics adjacent to the suspected location of the plasma breach in the leading edge of the left wing, Fig. 1.

McDanels, Steven J.

2006-01-01

231

Disassembly methodology for conducting failure analysis on lithium–ion batteries  

Microsoft Academic Search

To facilitate construction analysis, failure analysis, and research in lithium–ion battery technology, a high quality methodology for battery disassembly is needed. This paper presents a methodology for battery disassembly that considers key factors based on the nature and purpose of post-disassembly analysis. The methodology involves upfront consideration of analysis paths that will be conducted on the exposed internal components to

Nick Williard; Bhanu Sood; Michael Osterman; Michael Pecht

232

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

233

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1998-04-01

234

Thermodynamic analysis of cesium and iodine behavior in severe light water reactor accidents  

NASA Astrophysics Data System (ADS)

In order to understand the release and transport behavior of cesium (Cs) and iodine (I) in severe light water reactor accidents, chemical forms of Cs and I in steam-hydrogen mixtures were analyzed thermodynamically. In the calculations reactions of boron (B) with Cs were taken into consideration. The analysis showed that B plays an important role in determining chemical forms of Cs. The main Cs-containing species are CsBO2(g) and CsBO2(l), depending on temperature. The contribution of CsOH(g) is minor. The main I-containing species are HI(g) and CsI(g) over the wide ranges of the parameters considered. Calculations were also carried out under the conditions of the Three Mile Island Unit 2 accident.

Minato, Kazuo

1991-11-01

235

Human Engineering Analysis of Real World Industrial Accidents: Using Plant-Specific Data to Understand Cultural Aspects of Accidents  

Microsoft Academic Search

Trends in industrial accident reduction in the USA include building safer and more accountable work cultures. These trends are reflected in the Occupational Safety and Health Administration (OSHA) Safety Health Achievement Recognition Program (SHARP). SHARP promotes cultural concepts such as leadership and employee participation. Evaluating the effects of a management system such as SHARP can be difficult for small industrial

R. Conway Underwood; Sri Kumar; Melissa A. Pethel; Glen C. Rains; Paul A. Schlumper; Daniel Strickland

236

A comparison of the whole-body and segmental methodologies of bioimpedance analysis  

Microsoft Academic Search

Theory supports the use of a segmental methodology (SM) for bioimpedance analysis (BIA) of body water (BW). However, previous studies have generally failed to show a significant improvement when the SM is used in place of a whole-body methodology. A pilot study was conducted to compare the two methodologies in control and overweight subjects. BW of each subject was measured

B. J. Thomas; B. H. Cornish; M. J. Pattemore; M. Jacobs; L. C. Ward

2003-01-01

237

Risk assessment for the Waste Technologies Industries (WTI) hazardous waste incinerator facility (east Liverpool, Ohio). Volume 7. Accident analysis: Selection and assessment of potential release scenarios. Draft report  

SciTech Connect

This report constitutes a comprehensive site-specific risk assessment for the WTI incineration facility located in East Liverpool, OH. The Accident Analysis is an evaluation of the likelihood of occurrence and resulting consequences from several general classes of accidents that could potentially occur during operation of the facility. The Accident Analysis also evaluates the effectiveness of existing mitigation measures in reducing off-site impacts. Volume VII describes in detail the methods used to conduct the Accident Analysis and reports the results of evaluations of likelihood and consequence for the selected accident scenarios.

NONE

1995-11-01

238

Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11--14)  

SciTech Connect

This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

Whitehead, D. [Sandia National Labs., Albuquerque, NM (United States); Darby, J. [Science and Engineering Associates, Inc., Albuquerque, NM (United States); Yakle, J. [Science Applications International Corp., Albuquerque, NM (United States)] [and others]

1994-06-01

239

SAS4A: A computer model for the analysis of hypothetical core disruptive accidents in liquid metal reactors  

SciTech Connect

To ensure that the public health and safety are protected under any accident conditions in a Liquid Metal Fast Breeder Reactor (LMFBR), many accidents are analyzed for their potential consequences. The SAS4A code system, described in this paper, provides such an analysis capability, including the ability to analyze low probability events such as the Hypothetical Core Disruptive Accidents (HCDAs). The SAS4A code system has been designed to simulate all the events that occur in an LMFBR core during the initiating phase of a Hypothetical Core Disruptive Accident. During such postulated accident scenarios as the Loss-of-Flow and Transient Overpower events, a large number of interrelated physical phenomena occur during a relatively short time. These phenomena include transient heat transfer and hydrodynamic events, coolant boiling and fuel and cladding melting and relocation. Due to the strong neutronic feedback present in a nuclear reactor, these events can significantly influence the reactor power. The SAS4A code system is used in the safety analysis of nuclear reactors, in order to estimate the energetic potential of very low probability accidents. The results of SAS4A simulations are also used by reactor designers in order to build safer reactors and eliminate the possibility of any accident which could endanger the public safety.

Tentner, A.M.; Birgersson, G.; Cahalan, J.E.; Dunn, F.E.; Kalimullah; Miles, K.J.

1987-01-01

240

Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool  

SciTech Connect

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR.

Madni, I.K. [Brookhaven National Lab., Upton, NY (United States); Eltawila, F. [Nuclear Regulatory Commission, Washington, DC (United States)

1994-01-01

241

Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool  

SciTech Connect

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor (LWR) nuclear power plants and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories. Brookhaven National Laboratory (BNL) has a program with the NRC called MELCOR Verification, Benchmarking, and Applications, the aim of which is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both boiling water reactors and pressurized water reactors. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). A summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR is presented.

Madni, I.K. [Brookhaven National Lab., Upton, NY (United States)

1995-11-01

242

Verification of fire and explosion accident analysis codes (facility design and preliminary results)  

SciTech Connect

For several years, the US Nuclear Regulatory Commission has sponsored the development of methods for improving capabilities to analyze the effects of postulated accidents in nuclear facilities; the accidents of interest are those that could occur during nuclear materials handling. At the Los Alamos National Laboratory, this program has resulted in three computer codes: FIRAC, EXPAC, and TORAC. These codes are designed to predict the effects of fires, explosions, and tornadoes in nuclear facilities. Particular emphasis is placed on the movement of airborne radioactive material through the gaseous effluent treatment system of a nuclear installation. The design, construction, and calibration of an experimental ventilation system to verify the fire and explosion accident analysis codes are described. The facility features a large industrial heater and several aerosol smoke generators that are used to simulate fires. Both injected thermal energy and aerosol mass can be controlled using this equipment. Explosions are simulated with H2/O2 balloons and small explosive charges. Experimental measurements of temperature, energy, aerosol release rates, smoke concentration, and mass accumulation on HEPA filters can be made. Volumetric flow rate and differential pressures also are monitored. The initial experiments involve varying parameters such as thermal and aerosol rate and ventilation flow rate. FIRAC prediction results are presented. 10 figs.

Gregory, W.S.; Nichols, B.D.; Talbott, D.V.; Smith, P.R.; Fenton, D.L.

1985-01-01

243

A comparative analysis of methodologies for database schema integration  

Microsoft Academic Search

One of the fundamental principles of the database approach is that a database allows a nonredundant, unified representation of all data managed in an organization. This is achieved only when methodologies are available to support integration across organizational and application boundaries. Methodologies for database design usually perform the design activity by separately producing several schemas, representing parts of the application,

Carlo Batini; Maurizio Lenzerini; Shamkant B. Navathe

1986-01-01

244

Process improvement methodology based on multivariate statistical analysis methods  

Microsoft Academic Search

A systematic procedure for process improvement methodology is proposed based on multivariate statistical process control methods. To take advantage of a large amount of historical data, the procedure employs a combination of hierarchical clustering method and statistical process control methods to detect and analyze the key factors that significantly affect the performance of processes. This methodology consists of four sequential

Young-Hak Lee; Kwang Gi Min; Chonghun Han; Kun Soo Chang; Tae Hwa Choi

2004-01-01

245

Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)  

SciTech Connect

The purposes of this volume of the LWRHU SAR, the Accident Model Document (AMD), are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; Provide estimates of occurrence probabilities associated with these various accidents; Evaluate the response of the LWRHU (or its components) to the resultant accident environments; and Associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

Johnson, E.W.

1988-10-01

246

Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design  

NASA Astrophysics Data System (ADS)

Previous studies of the safety and environmental aspects of the HYLIFE-II inertial fusion energy power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work, computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) have been applied for simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here we consider a severe loss of coolant accident (LOCA) in conjunction with simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the confinement) and of the two barriers surrounding the chamber (inner shielding and confinement building itself). Even though confinement failure would be a very unlikely event it would be needed in order to produce significant off-site doses. CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics and fusion product transport and release. The results of these calculations show that the estimated off-site dose is less than 5 mSv (0.5 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.

Reyes, S.; Latkowski, J. F.; Gomez del Rio, J.; Sanz, J.

2001-05-01

247

Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design  

SciTech Connect

Previous studies of the safety and environmental (S and E) aspects of the HYLIFE-II inertial fusion energy (IFE) power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work a set of computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) has been applied for simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here the authors consider a severe loss of coolant accident (LOCA) producing simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the containment) and of the two barriers surrounding the chamber (inner shielding and containment building itself). Even though containment failure would be a very unlikely event it would be needed in order to produce significant off-site doses. The CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics and fusion product release and transport. The results of these calculations show that the estimated off-site dose is less than 6 mSv (0.6 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.

Reyes, S; Gomez del Rio, J; Sanz, J

2000-02-23

248

Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications  

SciTech Connect

Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 ("Quality Assurance for Safety-Related Software") identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

Vincent, Andrew

2005-04-25

249

Hazardous materials transportation: a risk-analysis-based routing methodology.  

PubMed

This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparison between the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; then arc costs are defined in order to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining for a specific hazardous substance the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed and finally further research developments are proposed. PMID:10677666
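
The 'minimum cost flow' formulation mentioned in the abstract can be illustrated on a toy network. In the sketch below, the graph, arc costs (transport plus risk-related), capacities, and shipment counts are all invented, and networkx is used as a stand-in solver rather than the OPTIPATH code referenced in the paper.

```python
# Minimal sketch of a minimum cost flow formulation on a tiny made-up road
# network. Arc weights stand for combined out-of-pocket plus risk-related costs
# per shipment, and capacities for the risk-based limits on how many shipments
# an arc may carry.
import networkx as nx

G = nx.DiGraph()
# demand < 0 at the origin (supply), > 0 at the destination
G.add_node("plant", demand=-10)      # 10 shipments to route
G.add_node("city", demand=10)
edges = [
    ("plant", "A", {"capacity": 6, "weight": 4}),   # cheap route, risk-capped
    ("plant", "B", {"capacity": 10, "weight": 7}),  # longer bypass
    ("A", "city", {"capacity": 6, "weight": 3}),
    ("B", "city", {"capacity": 10, "weight": 5}),
]
G.add_edges_from(edges)

flow = nx.min_cost_flow(G)
print(flow)                      # shipments assigned to each arc
print(nx.cost_of_flow(G, flow))  # total (transport + risk) cost
```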

Leonelli, P; Bonvicini, S; Spadoni, G

2000-01-01

250

The methodology of multi-viewpoint clustering analysis  

NASA Technical Reports Server (NTRS)

One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost effective. In order to build and maintain these systems, it is important that the knowledge in the system be suitably abstracted, structured, and otherwise clustered in a manner which facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems will require the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA provides an ability to discover significant structures by providing an automated mechanism to structure both hierarchically (from detail to abstract) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission critical systems.

Mehrotra, Mala; Wild, Chris

1993-01-01

251

ADHD and relative risk of accidents in road traffic: a meta-analysis.  

PubMed

The present meta-analysis is based on 16 studies comprising 32 results. These studies provide sufficient data to estimate relative accident risks of drivers with ADHD. The overall estimate of relative risk for drivers with ADHD is 1.36 (95% CI: 1.18; 1.57) without control for exposure, 1.29 (1.12; 1.49) when correcting for publication bias, and 1.23 (1.04; 1.46) when controlling for exposure. A relative risk (RR) of 1.23 is exactly the same as found for drivers with cardiovascular diseases. The long-lasting assertion that "ADHD-drivers have an almost fourfold risk of accident compared to non-ADHD-drivers", which originated from Barkley et al.'s study of 1993, is rebutted. That estimate was associated with comorbid Oppositional Defiant Disorder (ODD) and/or Conduct Disorder (CD), not with ADHD, but the assertion has incorrectly been maintained for two decades. The present study provides some support for the hypothesis that the relative accident risk of ADHD-drivers with comorbid ODD, CD and/or other conduct problems, is higher than that of ADHD-drivers without these comorbidities. The estimated RRs were 1.86 (1.27; 2.75) in a sample of ADHD-drivers in which a majority had comorbid ODD and/or CD compared to 1.31 (0.96; 1.81) in a sample of ADHD-drivers with no comorbidity. Given that ADHD-drivers most often seem to drive more than controls, and the fact that a majority of the present studies lack information about exposure, it seems more probable that the true RR is lower rather than higher than 1.23. Also the assertion that ADHD-drivers violate traffic laws more often than other drivers should be modified: ADHD-drivers do have more speeding violations, but no more drunk or reckless driving citations than drivers without ADHD. All accident studies included in the meta-analysis fail to acknowledge the distinction between deliberate violations and driving errors. The former are known to be associated with accidents, the latter are not. A hypothesis that ADHD-drivers speed more frequently than controls because it stimulates attention and reaction time is suggested. PMID:24238842
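
For readers unfamiliar with how summary figures such as 1.36 (95% CI: 1.18; 1.57) are produced, the following sketch shows a standard fixed-effect, inverse-variance pooling of study-level relative risks on the log scale. The study values are fabricated for the example and are not the 16 studies in this meta-analysis, which may also have used a different pooling model.

```python
# Illustrative inverse-variance (fixed-effect) pooling of relative risks on the
# log scale. The study-level values below are invented for the example and are
# NOT the studies analysed in the paper.
import numpy as np

rr = np.array([1.5, 1.2, 1.4, 1.1])          # study relative risks
ci_hi = np.array([2.1, 1.6, 2.0, 1.5])       # upper 95% limits

log_rr = np.log(rr)
se = (np.log(ci_hi) - log_rr) / 1.96         # SE of log(RR) from the CI width
w = 1.0 / se**2                              # inverse-variance weights

pooled = np.exp(np.sum(w * log_rr) / np.sum(w))
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = np.exp(np.log(pooled) + np.array([-1.96, 1.96]) * pooled_se)
print(f"pooled RR = {pooled:.2f} (95% CI {lo:.2f}; {hi:.2f})")
```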

Vaa, Truls

2014-01-01

252

The Analysis of PWR SBO Accident with RELAP5 Based on Linux  

NASA Astrophysics Data System (ADS)

RELAP5 is a relatively advanced transient hydraulic and thermal analysis code for light water reactors, and safety analysis and operating simulation performed with RELAP5 are significant for the safe operation of nuclear reactor systems. This paper presents a RELAP5 operating mode based on the Linux operating system, using Linux's powerful text-processing capabilities to extract the valid data directly from the RELAP5 output file and its scripting capabilities to improve the plotting of RELAP5 results. Running under Linux, the precision of the calculated results is preserved and the computing time is shortened. In this work, a PWR Station Blackout (SBO) accident was computed with RELAP5 under both Linux and Windows. Comparison and analysis of the accident response curves of the main parameters, such as reactor power and the average temperature and pressure of the primary loop, show that operating analysis of the nuclear reactor system with RELAP5 based on Linux is safe and reliable.
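
The Linux workflow in this record amounts to script-driven extraction of the valid data rows from the RELAP5 output listing followed by plotting of the accident response curves. A hedged Python sketch of that kind of post-processing is shown below; the file name, column layout, and plotted quantities are hypothetical, since the actual RELAP5 output format is not reproduced in the record.

```python
# Hedged sketch of the post-processing the abstract attributes to the Linux
# workflow: scan an output listing for the quantities of interest and plot
# them. File name, column layout and units are hypothetical.
import re
import matplotlib.pyplot as plt

times, powers, pressures = [], [], []
pattern = re.compile(r"^\s*([\d.Ee+-]+)\s+([\d.Ee+-]+)\s+([\d.Ee+-]+)")

with open("sbo_case.out") as fh:          # hypothetical RELAP5 output listing
    for line in fh:
        m = pattern.match(line)
        if m:                              # keep only the valid data rows
            t, p, pr = map(float, m.groups())
            times.append(t)
            powers.append(p)
            pressures.append(pr)

fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(times, powers)
ax1.set_ylabel("Reactor power (W)")
ax2.plot(times, pressures)
ax2.set_ylabel("Primary pressure (Pa)")
ax2.set_xlabel("Time (s)")
fig.savefig("sbo_response.png")
```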

Xia, Zhimin; Zhang, Dafa

253

A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?  

NASA Technical Reports Server (NTRS)

In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports, and compares the method of analysis proposed therein to the methods used today.

Holloway, C. M.; Johnson, C. W.

2007-01-01

254

Iterative Transport-Diffusion Methodology For LWR Core Analysis  

NASA Astrophysics Data System (ADS)

This paper presents an update on the development of an advanced methodology for core calculations that uses local heterogeneous solutions for on-the-fly nodal cross-section generation. The Iterative Transport-Diffusion Method is an embedded transport approach that is expected to provide results with near 3D transport accuracy for a fraction of the time required by a full 3D transport method. In this methodology, the infinite environment used for homogenized nodal cross-section generation is replaced with a simulated 3D environment of the diffusion calculation. This update focuses on burnup methodology, axial leakage and 3D modeling.

Colameco, David; Ivanov, Boyan D.; Beacon, Daniel; Ivanov, Kostadin N.

2014-06-01

255

An analysis of the risk in the French sea fishing industry. Example of the dockside accident risk.  

PubMed

In many maritime countries, the work of sea fishermen is one of the most hazardous of occupations. The number of accidents is much greater than in other occupations on land or at sea, and the accidents themselves are often more serious. When considering the risks and hazards of fishing, one initially thinks of major risks like collisions or vessels running aground, as well as the work related injuries which are mainly caused by the fishing equipment (otter boards, ropes) and by the motions of the vessel. These accidents and, in a more general sense, the dangers met by fishermen at sea have already been studied. But little research has been undertaken on the problem of accidents of fishermen while the vessel is in port; and in France, these accidents account for about 30% of all registered injuries for the sea fishing industry. The present report takes a look at this category of accidents, on the basis of data on 5074 accidents registered between 1996 and 2005. An examination of statistics therefore points to certain types of risks and dangerous situations, but it also leaves a number of questions pending. One of these is the number of "unknown" causes. Falls, in particular, are usually linked to an outside factor which is not listed on the form the sailors must fill out. To compensate for the limitations of the epidemiological analysis, on-site observation seemed to be the best way of understanding the risks of the activities of fishermen in the port. PMID:17312699

Le bouar, Gilbert; Chauvin, Christine

2006-01-01

256

Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident  

SciTech Connect

The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

Pasichnyk, I.; Perin, Y.; Velkov, K. [Gesellschaft für Anlagen- und Reaktorsicherheit - GRS mbH, Boltzmannstrasse 14, 85748 Garching bei Muenchen (Germany)]

2013-07-01

257

Worldwide Husbanding Process Improvement: Comparative Analysis of Contracting Methodologies.  

National Technical Information Service (NTIS)

This study is designed to support one of three major focus areas in the Naval Supply Systems Command (NAVSUP) Worldwide Husbanding Improvement Process initiative. Existing contracting methodologies were analyzed using the following methods: characteristic...

J. Pitel; M. Gundemir; P. Metzger; R. Manalang

2007-01-01

258

Analysis of Maximum Reasonably Foreseeable Accidents for the Yucca Mountain Draft Environmental Impact Statement (DEIS)  

SciTech Connect

Accidents could occur during the transportation of spent nuclear fuel and high-level radioactive waste. This paper describes the risks and consequences to the public from accidents that are highly unlikely but that could have severe consequences. The impacts of these accidents would include those to a collective population and to hypothetical maximally exposed individuals (MEIs). This document discusses accidents with conditions that have a chance of occurring more often than 1 in 10 million times in a year, called "maximum reasonably foreseeable accidents". Accidents and conditions less likely than this are not considered to be reasonably foreseeable.

S.B. Ross; R.E. Best; S.J. Maheras; T.I. McSweeney

2001-08-17

259

NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug  

NASA Technical Reports Server (NTRS)

A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a cleavage-type failure. For the accident case, the predicted failure load for the right rear lug from the PFA is greater than 1.98 times the limit load of the lugs.

Raju, Ivatury S; Glaessgen, Edward H.; Mason, Brian H; Krishnamurthy, Thiagarajan; Davila, Carlos G

2005-01-01

260

Biomechanical analysis of protective countermeasures in underride motor vehicle accidents - biomed 2009.  

PubMed

Traffic safety has been significantly improved over the past several decades, reducing injury and fatality rates. However, there is a paucity of research effort to address the safety issues in underride accidents, specifically side underride crashes. It is well known that the compromise of occupant space in the vehicle leads to a higher probability of serious or fatal injuries. A better understanding of occupant protection and the mechanism of injuries involved in side underride accidents assists in the advancement of safety measures. The present work evaluates the injury potential to occupants during side underride crashes using the car-to-trailer crash methodology. Four crash tests were conducted into the side of a stationary trailer fitted with the side underride guard system (SURG). The SURG used in these tests is 25% lighter than the previous design. A 5th percentile Hybrid III female dummy was placed in the driver seat and restrained with the three-point lap and shoulder harness. The anthropometric dummy was instrumented with a head triaxial accelerometer, a chest triaxial accelerometer, a load cell to measure neck force and moment, and a load cell to measure the femur force. The vehicle acceleration was measured using a triaxial accelerometer in the rear center tunnel. High speed, standard video and still photos were taken. In all tests, the intrusion was limited to the front structure of the vehicle without any significant compromise to the occupant space. Results indicate that the resultant head and chest accelerations, head injury criterion (HIC), neck force and moment, and femur force were well below the injury tolerance. The present findings support the hypothesis that the SURG not only limits or eliminates the intrusion into the occupant space but also results in biomechanical injury values well below the tolerance limit in motor vehicle crashes. PMID:19369745

Kumar, Sri; Enz, Bruce; Ponder, Perry L; Anderson, Bob

2009-01-01

261

Fire accident analysis modeling in support of non-reactor nuclear facilities at Sandia National Laboratories  

SciTech Connect

The Department of Energy (DOE) requires that fire hazard analyses (FHAs) be conducted for all nuclear and new facilities, with results for the latter incorporated into Title I design. For those facilities requiring safety analysis documentation, the FHA shall be documented in the Safety Analysis Reports (SARs). This paper provides an overview of the methodologies and codes being used to support FHAs at Sandia facilities, with emphasis on SARs.

Restrepo, L.F.

1993-06-01

262

MELCOR Analysis of Steam Generator Tube Creep Rupture in Station Blackout Severe Accident  

SciTech Connect

A pressurized water reactor steam generator tube rupture (SGTR) is of concern because it represents a bypass of the containment for radioactive materials to the environment. In a station blackout accident, tube integrity could be threatened by creep rupture, particularly if cracks are present in the tube walls. Methods are developed herein to improve assessment capabilities for SGTR by using the severe-accident code MELCOR. Best-estimate assumptions based on recent research and computational fluid dynamics calculations are applied in the MELCOR analysis to simulate two-dimensional natural circulation and to determine the relative creep-rupture timing in the reactor coolant pressure boundary components. A new method is developed to estimate the steam generator (SG) hottest tube wall temperature and the tube critical crack size for the SG tubes to fail first. The critical crack size for SG tubes to fail first is estimated to be 20% of the wall thickness larger than by a previous analysis. Sensitivity studies show that the failure sequence would change if some assumptions are modified. In particular, the uncertainty in the countercurrent flow limit model could reverse the failure sequence of the SG tubes and surge line.

Liao, Y.; Vierow, K. [Purdue University (United States)

2005-12-15

263

Safety analysis report for the Galileo Mission. Volume 2, book 1: Accident model document  

NASA Astrophysics Data System (ADS)

The Accident Model Document (AMD) is the second volume of the three volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence.

1988-12-01

264

GO methodology. Volume 2. Application and comparison of the GO methodology and fault-tree analysis  

Microsoft Academic Search

This report compares two methods for the probabilistic analysis of nuclear safety and system availability, namely the GO and fault tree analysis methods. Each method includes a unique system modeling approach as well as a system of computer codes to quantify the model, determine minimal cutsets, and rank the relative importance of the modeled events. To provide a basis for

A. P. Kelley, Jr.; D. W. Stillwell

1983-01-01

265

Accident investigation  

NASA Technical Reports Server (NTRS)

Aircraft accident investigations are discussed with emphasis on those accidents that involved weather as a contributing factor. The organization of the accident investigation board for air carrier accidents is described along with the hearings, and formal report preparation. Statistical summaries of the investigations of general aviation accidents are provided.

Brunstein, A. I.

1979-01-01

266

Natural phenomena risk analysis - an approach for the tritium facilities 5480.23 SAR natural phenomena hazards accident analysis  

SciTech Connect

A Tritium Facilities (TF) Safety Analysis Report (SAR) has been developed which is compliant with DOE Order 5480.23. The 5480.23 SAR upgrades and integrates the safety documentation for the TF into a single SAR for all of the tritium processing buildings. As part of the TF SAR effort, natural phenomena hazards (NPH) were analyzed. A cost effective strategy was developed using a team approach to take advantage of limited resources and budgets. During development of the Hazard and Accident Analysis for the 5480.23 SAR, a strategy was required to allow maximum use of existing analysis and to develop a cost effective graded approach for any new analysis in identifying and analyzing the bounding accidents for the TF. This approach was used to effectively identify and analyze NPH for the TF. The first part of the strategy consisted of evaluating the current SAR for the RTF to determine what NPH analysis could be used in the new combined 5480.23 SAR. The second part was to develop a method for identifying and analyzing NPH events for the older facilities which took advantage of engineering judgment, was cost effective, and followed a graded approach. The second part was especially challenging because of the lack of documented existing analysis considered adequate for the 5480.23 SAR and a limited budget for SAR development and preparation. This paper addresses the strategy for the older facilities.

Cappucci, A.J. Jr.; Joshi, J.R.; Long, T.A.; Taylor, R.P.

1997-07-01

267

Hospital multifactor productivity: a presentation and analysis of two methodologies.  

PubMed

In response to recent discussions regarding the ability of hospitals to achieve gains in productivity, we present two methodologies that attempt to measure multifactor productivity (MFP) in the hospital sector. We analyze each method and conclude that the inconsistencies in their outcomes make it difficult to estimate a precise level of MFP that hospitals have historically achieved. Our goal in developing two methodologies is to inform the debate surrounding the ability of hospitals to achieve gains in MFP, as well as to highlight some of the challenges that exist in measuring hospital MFP. PMID:18435223

Cylus, Jonathan D; Dickensheets, Bridget A

268

Analysis of fatal road traffic accidents in a coastal township of South India.  

PubMed

Road traffic accidents (RTAs) are an important cause of mortality and morbidity owing to the increasing number of vehicles, changes in lifestyle, and risk behaviours among the general population. With the aim of exploring various epidemiological characteristics of RTAs, this retrospective analysis of medico-legal autopsies was conducted between January 2005 and December 2009 in the Department of Forensic Medicine, Kasturba Medical College, Manipal in Karnataka, South India. The information was collected from post-mortem registers and inquest documents received from the investigating police officers and was analysed using SPSS version 11.0. Of the 879 autopsies conducted during the study period, 39% were due to RTAs. Among the victims, 89.8% were males and 10.2% were females. The mean age of victims was 38.7 years and was slightly higher in females than in males. Most of the male victims belonged to the age group 20-29 years. Head injuries were responsible for nearly three-fourths of deaths, followed by abdominal injuries (6.7%). The mean duration of survival following a road traffic accident was 6-7 days. Occupants of motorized two-wheelers (43%) and pedestrians (33%) were the most common victims of RTAs, followed by occupants of light motor vehicles (LMVs). The most common offending agents in road traffic accidents were heavy motor vehicles (35.2%), followed by light motor vehicles (31.7%). In view of these findings, it is apt to conclude that RTAs are an important public health hazard that should be addressed through strengthening of emergency healthcare, stricter enforcement of traffic laws and health education. PMID:23084306

Kanchan, Tanuj; Kulkarni, Vaman; Bakkannavar, Shankar M; Kumar, Nithin; Unnikrishnan, B

2012-11-01

269

Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis  

Microsoft Academic Search

A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected.

M. K. Goldhaber; S. L. Staub; G. K. Tokuhata

1983-01-01

270

A Content Analysis of News Media Coverage of the Accident at Three Mile Island.  

ERIC Educational Resources Information Center

A study was conducted for the President's Commission on the Accident at Three Mile Island to analyze coverage of the accident by ten news organizations: two wire services, three commercial television networks, and five daily newspapers. Copies of all stories and transcripts of news programs during the first week of the accident were examined from…

Stephens, Mitchell; Edison, Nadyne G.

271

A Comparative Analysis of Accident Risks in Fossil, Hydro, and Nuclear Energy Chains  

Microsoft Academic Search

This study presents a comparative assessment of severe accident risks in the energy sector, based on the historical experience of fossil (coal, oil, natural gas, and LPG [Liquefied Petroleum Gas]) and hydro chains contained in the comprehensive Energy-related Severe Accident Database (ENSAD), as well as Probabilistic Safety Assessment (PSA) for the nuclear chain. Full energy chains were considered because accidents

Peter Burgherr; Stefan Hirschberg

2008-01-01

272

Does Music Induce Emotion? A Theoretical and Methodological Analysis  

Microsoft Academic Search

Is music ubiquitous in part because it is causally linked to emotion? In this article, a comprehensive theoretical and methodological reevaluation is presented of a classical problem: the direct induction of emotion by music (M → E). The author's Prototypical Emotion-Episode Model (PEEM) is used in the conceptual critique. A close scrutiny of the major published studies, and the author's new

Vladimir J. Konečni

2008-01-01

273

Methodology for Using Nonlinear Aerodynamics in Aeroservoelastic Analysis and Design.  

National Technical Information Service (NTIS)

A methodology is presented for using the Volterra-Wiener theory of nonlinear systems in aeroservoelastic (ASE) analyses and design. The theory is applied to the development of nonlinear aerodynamic response models that can be defined in state-space form a...

W. A. Silva

1991-01-01

274

A Methodology for Architecture-Level Reliability Risk Analysis  

Microsoft Academic Search

Risk assessment is an essential process of every software risk management plan. Several risk assessment techniques are based on the subjective judgement of domain experts. Subjective risk assessment techniques are human intensive and error-prone. Risk assessment should be based on product attributes that we can quantitatively measure using product metrics. This paper presents a methodology for reliability risk assessment at

Sherif M. Yacoub; Hany H. Ammar

2002-01-01

275

Interferometric data analysis based on Markov nonlinear filtering methodology  

Microsoft Academic Search

In conventional phase-shifting interferometry, Fourier-transform, and least-squares-fitting techniques, a whole interferometric data series is required for processing. We propose a new interferometric data processing methodology based on a recurrent nonlinear procedure. The signal value is predicted from the previous step to the next step, and the prediction error is used for nonlinear correction of an a priori

Igor P. Gurov; Denis V. Sheynihovich

2000-01-01
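The recurrent predict/correct idea described in the record above can be illustrated with a scalar extended Kalman filter that tracks interferometric phase sample by sample. This is a minimal sketch under a simplified, hypothetical signal model (known amplitude and nominal phase step), not the authors' Markov nonlinear filtering algorithm.

# Minimal sketch: scalar extended Kalman filter tracking the phase of a
# noisy cosine signal, one sample at a time. Parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n, amp, step, noise_std = 200, 1.0, 0.25, 0.05
true_phase = np.cumsum(np.full(n, step)) + 0.3
signal = amp * np.cos(true_phase) + rng.normal(scale=noise_std, size=n)

phase_est, P = 0.0, 1.0          # initial phase estimate and its variance
Q, R = 1e-4, noise_std**2        # process and measurement noise variances
estimates = []
for z in signal:
    # Predict: advance the phase by the nominal step.
    phase_pred = phase_est + step
    P_pred = P + Q
    # Correct: linearize the measurement model around the prediction.
    H = -amp * np.sin(phase_pred)            # d(signal)/d(phase)
    innovation = z - amp * np.cos(phase_pred)
    S = H * P_pred * H + R
    K = P_pred * H / S
    phase_est = phase_pred + K * innovation
    P = (1.0 - K * H) * P_pred
    estimates.append(phase_est)

rmse = np.sqrt(np.mean((np.array(estimates) - true_phase) ** 2))
print(f"Phase tracking RMSE: {rmse:.3f} rad")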

276

Analysis of Kuosheng Large-Break Loss-of-Coolant Accident with MELCOR 1.8.4  

SciTech Connect

The MELCOR code, developed by Sandia National Laboratories, is capable of simulating the severe accident phenomena of light water reactor nuclear power plants (NPPs). A specific large-break loss-of-coolant accident (LOCA) for the Kuosheng NPP is simulated with the MELCOR 1.8.4 code. This accident is induced by a double-ended guillotine break of one of the recirculation pipes concurrent with complete failure of the emergency core cooling system. The MELCOR input deck for the Kuosheng NPP is established based on the design data of the plant and the MELCOR users' guides. The initial steady-state conditions are generated with a developed self-initialization algorithm, and the effect of the MELCOR 1.8.4-provided initialization process is demonstrated. The main severe accident phenomena and the corresponding fission product release fractions associated with the large-break LOCA sequences are simulated. MELCOR 1.8.4 predicts a longer time interval between core collapse and vessel failure and a higher source term. This MELCOR 1.8.4 input deck will be applied to the probabilistic risk assessment, the severe accident analysis, and the severe accident management study of the Kuosheng NPP in the near future.

Wang, T.-C.; Wang, S.-J.; Chien, C.-S

2000-09-15

277

Electrical equipment performance under severe accident conditions (BWR/Mark 1 plant analysis): Summary report  

SciTech Connect

The purpose of the Performance Evaluation of Electrical Equipment during Severe Accident States Program is to determine the performance of electrical equipment important to safety under severe accident conditions. In FY85, a method was devised to identify important electrical equipment and the severe accident environments in which the equipment was likely to fail. This method was used to evaluate the equipment and severe accident environments for Browns Ferry Unit 1, a BWR/Mark I. Following this work, a test plan was written in FY86 to experimentally determine the performance of one selected component in two severe accident environments.

Bennett, P.R.; Kolaczkowski, A.M.; Medford, G.T.

1986-09-01

278

Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors  

SciTech Connect

The accident that occurred on board the offshore platform Piper Alpha in July 1988 killed 167 people and cost billions of dollars in property damage. It was caused by a massive fire, which was not the result of an unpredictable 'act of God' but of an accumulation of errors and questionable decisions. Most of them were rooted in the organization, its structure, procedures, and culture. This paper analyzes the accident scenario using the risk analysis framework, determines which human decisions and actions influenced the occurrence of the basic events, and then identifies the organizational roots of these decisions and actions. These organizational factors are generalizable to other industries and engineering systems. They include flaws in the design guidelines and design practices (e.g., tight physical couplings or insufficient redundancies), misguided priorities in the management of the tradeoff between productivity and safety, mistakes in the management of the personnel on board, and errors of judgement in the process by which financial pressures are applied to the production sector (i.e., the oil companies' definition of profit centers), resulting in deficiencies in inspection and maintenance operations. This analytical approach allows identification of risk management measures that go beyond the purely technical (e.g., adding redundancies to a safety system) and also include improvements of management practices. 18 refs., 4 figs.

Pate-Cornell, M.E. (Stanford Univ., CA (United States))

1993-04-01

279

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

280

Analysis of accidents during the mid-loop operating state at a PWR  

SciTech Connect

Studies suggest that the risk of severe accidents during low power operation and/or shutdown conditions could be a significant fraction of the risk at full power operation. The Nuclear Regulatory Commission has begun two risk studies to evaluate the progression of severe accidents during these conditions: one for the Surry plant, a pressurized water reactor (PWR), and the other for the Grand Gulf plant, a boiling water reactor (BWR). This paper summarizes the approach taken for the Level 2/3 analysis at Surry for one plant operating state (POS) during shutdown. The current efforts are focused on evaluating the risk when the reactor is at mid-loop; this particular POS was selected because of the reduced water inventory and the possible isolation of the loops. The Level 2/3 analyses are conditional on core damage having occurred. Initial results indicate that the conditional consequences can indeed be significant; the defense-in-depth philosophy governing the safety of nuclear power plants is to some extent circumvented because the containment provides only a vapor barrier, with no capability for pressure holding, during this POS at Surry. However, the natural decay of the radionuclide inventory provides some mitigation. There are essentially no predicted offsite prompt fatalities even for the most severe releases.

Jo, J.; Lin, C.C.; Mufayi, V.; Neymotin, L.; Nimnual, S.

1992-12-31

281

Analysis of accidents during the mid-loop operating state at a PWR  

SciTech Connect

Studies suggest that the risk of severe accidents during low power operation and/or shutdown conditions could be a significant fraction of the risk at full power operation. The Nuclear Regulatory Commission has begun two risk studies to evaluate the progression of severe accidents during these conditions: one for the Surry plant, a pressurized water reactor (PWR), and the other for the Grand Gulf plant, a boiling water reactor (BWR). This paper summarizes the approach taken for the Level 2/3 analysis at Surry for one plant operating state (POS) during shutdown. The current efforts are focused on evaluating the risk when the reactor is at mid-loop; this particular POS was selected because of the reduced water inventory and the possible isolation of the loops. The Level 2/3 analyses are conditional on core damage having occurred. Initial results indicate that the conditional consequences can indeed be significant; the defense-in-depth philosophy governing the safety of nuclear power plants is to some extent circumvented because the containment provides only a vapor barrier, with no capability for pressure holding, during this POS at Surry. However, the natural decay of the radionuclide inventory provides some mitigation. There are essentially no predicted offsite prompt fatalities even for the most severe releases.

Jo, J.; Lin, C.C.; Mufayi, V.; Neymotin, L.; Nimnual, S.

1992-01-01

282

DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS  

SciTech Connect

Large fuel casks present challenges when evaluating their performance under the Hypothetical Accident Conditions (HAC) specified in the Code of Federal Regulations Title 10 Part 71 (10CFR71). Testing is often limited by cost, difficulty in preparing test units, and the limited availability of facilities which can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the hypothetical accident conditions of a 30-foot lateral drop followed by a 40-inch lateral puncture, as specified in 10CFR71. The structural integrity of the containment vessel is justified based on the analytical results in comparison with the stress criteria specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damage caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture is compared with the package test data. The analytical results are in good agreement with the test results.

Wu, T

2008-04-30

283

Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

1997-12-01

284

RELAP5/MOD3.3 analysis of coolant depletion tests after safety injection failure during a large-break loss-of-coolant accident  

Microsoft Academic Search

This study is concerned with development of a coupled calculation methodology with which to continually and consistently analyze progression of an accident from the design-basis phase via core uncovery to core melting and relocation. Experiments were performed to investigate the core coolant inventory depletion after safety injection failure during a large-break loss-of-coolant accident in a cold leg utilizing the Seoul

Yong-Soo Kim; Byoung-Uhn Bae; Chang-Hwan Park; Goon-Cherl Park; Kune-Yull Suh; Un-Chul Lee

2005-01-01

285

PWR loss-of-coolant accident analysis capability of the WRAP-EM system  

SciTech Connect

The modular computational system known as the Water Reactor Analysis Package (WRAP) has been extended to provide the computational tools required to perform a complete analysis of loss-of-coolant accidents (LOCAs) in pressurized water reactors (PWRs). The new system is known as the WRAP-EM (Evaluation Model) system and will be used by NRC to interpret and evaluate reactor vendor EM methods and computed results. The system for PWR-EM analysis comprises several computer codes which have been developed to analyze a particular phase of a LOCA. These codes include GAPCON for calculation of initial fuel conditions, WRAP (the previously developed SRL analog of RELAP4/MOD5) for analysis of the system blowdown and refill, the FLOOD option in WRAP for analysis of the reflood phase, and FRAP for the calculation of the behavior of the hot fuel pin. In addition, a PWR steady-state initialization procedure has been developed to provide the initial operating state of the reactor system. The PWR-EM system is operational and is being evaluated to determine the adequacy and consistency of the physical models employed for EM analysis.

Gregory, M.V.; Beranek, F.

1980-08-01

286

Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis  

ERIC Educational Resources Information Center

This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

Kover, Sara T.; Atwood, Amy K.

2013-01-01

287

Calculational methodology and associated uncertainties: Sensitivity and uncertainty analysis of reactor performance parameters  

Microsoft Academic Search

This chapter considers the calculational methodology and associated uncertainties both for the design of large LMFBRs and the analysis of critical assemblies (fast critical experiments) as performed by several groups within the US. It discusses cross-section processing; calculational methodology for the design problem; core physics computations; design-oriented approximations; benchmark analyses; and determination of calculational corrections and associated uncertainties for a critical

E. Kujawski; C. R. Weisbin

1982-01-01

288

The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents  

NASA Technical Reports Server (NTRS)

In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities, is actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model comprises causal factors from the domains of human factors, aircraft system component failures, and the atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop the human-related causal factors. The preliminary results from the baseline LOCAF model are also presented.

Ancel, Ersin; Shih, Ann T.

2012-01-01
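The belief-network structure described in the record above can be illustrated by direct enumeration over a toy network. This is a minimal sketch; the node names, prior probabilities, and conditional probability table below are hypothetical placeholders, not values from the LOCAF model.

# Minimal sketch of a loss-of-control (LOC) belief-network calculation by
# direct enumeration. All node names and probabilities are hypothetical.

# Prior probabilities of two causal factors per flight (hypothetical).
p_crew_error = 1.0e-4      # flight-crew causal factor present
p_system_fail = 5.0e-5     # aircraft system component failure present

# Conditional probability of LOC given each combination of parents
# (hypothetical conditional probability table).
p_loc_given = {
    (True, True): 0.30,
    (True, False): 0.02,
    (False, True): 0.01,
    (False, False): 1.0e-7,
}

def prob(p, state):
    """Probability of a Boolean node taking the given state."""
    return p if state else 1.0 - p

# Marginal P(LOC) by summing over all parent states.
p_loc = sum(
    prob(p_crew_error, crew) * prob(p_system_fail, sysf) * p_loc_given[(crew, sysf)]
    for crew in (True, False)
    for sysf in (True, False)
)
print(f"P(LOC per flight) = {p_loc:.3e}")

# Diagnostic query: P(crew factor present | LOC) via Bayes' rule.
p_loc_and_crew = sum(
    prob(p_crew_error, True) * prob(p_system_fail, sysf) * p_loc_given[(True, sysf)]
    for sysf in (True, False)
)
print(f"P(crew factor | LOC) = {p_loc_and_crew / p_loc:.3f}")

A full LOCAF-style model would have many more nodes and would typically be built in a dedicated Bayesian-network tool, but the enumeration above shows the underlying probability calculus.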

289

Analysis of Radionuclide Releases from the Fukushima Dai-Ichi Nuclear Power Plant Accident Part I  

NASA Astrophysics Data System (ADS)

Part I of this publication deals with the analysis of fission product releases following the Fukushima Dai-ichi accident. Reactor core damage is assessed by relying on radionuclide detections performed by the CTBTO radionuclide network, especially at the particulate station located at Takasaki, 210 km away from the nuclear power plant. On the basis of a comparison between the reactor core inventory at the time of reactor shutdowns and the fission product activities measured in air at Takasaki, especially 95Nb and 103Ru, it was possible to show that the reactor cores were exposed to high temperature for a prolonged time. This diagnosis was confirmed by the presence of 113Sn in air at Takasaki. The assessed 133Xe release at the time of reactor shutdown (8 × 10^18 Bq) turned out to be on the order of 80% of the amount deduced from the reactor core inventories. This strongly suggests a broad meltdown of the reactor cores.

Le Petit, G.; Douysset, G.; Ducros, G.; Gross, P.; Achim, P.; Monfort, M.; Raymond, P.; Pontillon, Y.; Jutier, C.; Blanchard, X.; Taffary, T.; Moulin, C.

2014-03-01

290

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B  

SciTech Connect

Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Formal expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)]; Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)]; Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States)]; Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands)]; Paesler-Sauer, J. [Research Center, Karlsruhe (Germany)]; Helton, J.C. [and others]

1995-01-01
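The Gaussian plume model (GPM) referenced in the record above can be illustrated with a minimal ground-reflected plume calculation. This is a sketch only: the release rate, wind speed, and dispersion parameters below are hypothetical and do not reproduce the MACCS or COSYMA implementations.

# Minimal sketch of a Gaussian plume air concentration calculation with
# ground reflection. All inputs are hypothetical illustrations.
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Air concentration (Bq/m^3) for a continuous point release of
    strength Q (Bq/s) in wind speed u (m/s), at crosswind offset y and
    height z, with effective release height H (m); sigma_y and sigma_z
    are the dispersion parameters evaluated at the downwind distance of
    interest."""
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))
    return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Example: ground-level centerline concentration roughly 1 km downwind,
# with illustrative sigma values for a neutral stability class.
chi = gaussian_plume(Q=1.0e10, u=5.0, y=0.0, z=0.0,
                     H=30.0, sigma_y=80.0, sigma_z=50.0)
print(f"Air concentration: {chi:.3e} Bq/m^3")

In the consequence codes themselves, the dispersion parameters are functions of stability class and downwind distance and are among the very input variables whose uncertainty was elicited in this project.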

291

An approach to accidents modeling based on compounds road environments.  

PubMed

The most common approach to studying the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of dividing a sample of roads into segments; grouping them into fairly homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established in which the pavement surface properties significantly influence the occurrence of accidents. The results showed clearly that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. PMID:23376544

Fernandes, Ana; Neves, Jose

2013-04-01
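The two-stage workflow described in the record above, clustering road segments into compound environments and then fitting per-environment generalized linear models, can be sketched as follows. The synthetic data, feature names, and model settings are hypothetical stand-ins for the paper's data, not its actual variables.

# Minimal sketch: cluster segments into road environments, then fit a
# Poisson GLM of accident counts on pavement surface properties.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 400
segments = pd.DataFrame({
    "curvature": rng.gamma(2.0, 0.5, n),
    "gradient": rng.normal(0.0, 2.0, n),
    "traffic_volume": rng.lognormal(8.0, 0.5, n),
    "skid_resistance": rng.uniform(0.3, 0.8, n),
    "texture_depth": rng.uniform(0.4, 1.5, n),
    "length_km": rng.uniform(0.5, 3.0, n),
})
# Hypothetical accident counts that decrease with skid resistance.
lam = segments["length_km"] * np.exp(1.0 - 2.5 * segments["skid_resistance"])
segments["accidents"] = rng.poisson(lam)

# Stage 1: group segments into quasi-homogeneous road environments.
env_features = ["curvature", "gradient", "traffic_volume"]
X_env = StandardScaler().fit_transform(segments[env_features])
segments["environment"] = KMeans(n_clusters=3, n_init=10,
                                 random_state=0).fit_predict(X_env)

# Stage 2: within each environment, relate accident counts to the surface
# properties with a Poisson GLM, using segment length as exposure.
for env, grp in segments.groupby("environment"):
    X = sm.add_constant(grp[["skid_resistance", "texture_depth"]])
    fit = sm.GLM(grp["accidents"], X, family=sm.families.Poisson(),
                 exposure=grp["length_km"]).fit()
    print(f"Environment {env}: skid_resistance coef = "
          f"{fit.params['skid_resistance']:.2f}")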

292

Defense In-Depth Accident Analysis Evaluation of Tritium Facility Bldgs. 232-H, 233-H, and 234-H  

SciTech Connect

The primary purpose of this report is to document a Defense-in-Depth (DID) accident analysis evaluation for Department of Energy (DOE) Savannah River Site (SRS) Tritium Facility Buildings 232-H, 233-H, and 234-H. The purpose of a DID evaluation is to provide a more realistic view of facility radiological risks to the offsite public than the bounding deterministic analysis documented in the Safety Analysis Report, which credits only Safety Class items in the offsite dose evaluation.

Blanchard, A.

1999-05-10

293

14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures  

Code of Federal Regulations, 2012 CFR

This appendix provides methodologies for performing toxic release hazard analysis for the flight of a launch vehicle as required by § 417.229 and for launch processing at a launch site in the United States as required by §...

2014-01-01

294

Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology  

SciTech Connect

This report, "Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology", contains a summary of the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The objective of this report is to present a summary of the LCE analyses' results. These results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel with different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation have been discussed in the "Disposal Criticality Analysis Methodology Topical Report" (CRWMS M&O 1998a).

J. Scaglione

1999-09-09

295

Methodology for the Analysis of Investment Alternatives to Stimulate Development and Technology Transfer for Energy Technologies.  

National Technical Information Service (NTIS)

A methodology is presented for the analysis of incentives that could be offered by the Government to encourage the development and commercialization of selected energy technologies. A decision-oriented approach has been adopted in characterizing a typical...

1978-01-01

296

Verification of the three-dimensional thermal-hydraulic models of the TRAC accident-analysis code. [PWR  

Microsoft Academic Search

The Transient Reactor Analysis Code (TRAC) being developed at Los Alamos National Laboratory provides a best-estimate prediction of the response of light water reactors or test facilities to postulated accident sequences. One of the features of the code is the ability to analyze the vessel and its heated core in three dimensions. The code is being used to analyze the

Motley

1982-01-01

297

A DIGITAL COMPUTER ANALYSIS OF LOSS-OF-COOLANT ACCIDENT FOR A MULTICIRCUIT CORE NUCLEAR POWER PLANT  

Microsoft Academic Search

A digital computer analysis of the loss-of-coolant accident in the primary system of a multi-circuit core nuclear power plant in the event of a complete severance of a pressure or jumper tube is presented. The time-dependent mass, momentum, and energy balance differential equations are expressed in finite difference form and solved numerically on an IBM-7090 digital

Nahavandi

1962-01-01

298

Analysis of loss-of-coolant accident for MURR 30 MW power-upgrade project using RELAP5/MOD2  

Microsoft Academic Search

This study is part of the preliminary safety analysis for the new power expansion project on the University of Missouri Research Reactor (MURR). The loss of coolant accident (LOCA), which is initiated by hypothetical pipe ruptures at the most adverse positions (V507 A B) in both the hot and cold legs of the primary coolant loop, is analyzed with the

1987-01-01

299

A strategic partnering framework analysis methodology for public-private partnerships  

Microsoft Academic Search

Purpose – The purpose of this paper is to view and analyse public-private partnerships (PPPs) under a strategic partnering approach between the key parties involved, i.e. public sector, private sector and lenders, and their business environment. Design/methodology/approach – A strategic partnering framework analysis methodology has been devised based on existing and well-known business strategic analysis tools (the political-economic-social-technological (PEST) and

Athena Roumboutsos; Nicola Chiara

2010-01-01

300

RASCAL (Radiological Assessment System for Consequence AnaLysis): A screening model for estimating doses from radiological accidents  

SciTech Connect

The Radiological Assessment System for Consequence AnaLysis (RASCAL) is a new MS-DOS-based dose assessment model which has been written for the US Nuclear Regulatory Commission for use during response to radiological emergencies. RASCAL is designed to provide crude estimates of the effects of an accident while the accident is in progress and only limited information is available. It has been designed to be very simple to use and to run quickly. RASCAL is unique in that it estimates the source term based on fundamental plant conditions and does not rely solely on release rate estimation (e.g., Ci/sec of I-131). Therefore, it can estimate consequences of accidents involving unmonitored pathways or projected failures. RASCAL will replace the older model, IRDAM. 6 refs.

Sjoreen, A.L.; Athey, G.F.; Sakenas, C.A.; McKenna, T.J.

1988-01-01

301

Methodology for Computer-Aided Fault Tree Analysis  

Microsoft Academic Search

Fault tree analysis is a systematic, deductive and probabilistic risk assessment tool which elucidates the causal relations leading to a given undesired event. Quantitative fault tree (failure) analysis requires a fault tree and failure data of basic events. Development of a fault tree and subsequent analysis require a great deal of expertise, which may not be available all the time.

R. Ferdous; F. I. Khan; B. Veitch; P. R. Amyotte

2007-01-01
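Quantitative fault tree evaluation of the kind referred to in the record above can be sketched from minimal cut sets, assuming independent basic events. The event names and probabilities below are hypothetical.

# Minimal sketch of fault tree quantification from minimal cut sets,
# assuming independent basic events with hypothetical probabilities.
from math import prod

basic_events = {          # annual failure probabilities (hypothetical)
    "pump_fails": 1e-3,
    "valve_sticks": 5e-4,
    "power_loss": 2e-4,
    "operator_error": 1e-2,
}

# Minimal cut sets: the top event occurs if every event in any one set occurs.
minimal_cut_sets = [
    {"pump_fails", "operator_error"},
    {"valve_sticks"},
    {"power_loss", "operator_error"},
]

# Probability of each cut set (product of its independent basic events).
cut_set_probs = [prod(basic_events[e] for e in mcs) for mcs in minimal_cut_sets]

# Minimal cut set upper bound on the top event probability:
# P(top) <= 1 - prod(1 - P(MCS_i)), close to the sum of cut set
# probabilities when all probabilities are small.
p_top = 1.0 - prod(1.0 - p for p in cut_set_probs)
print(f"Top event probability (upper bound): {p_top:.3e}")

# Fussell-Vesely style importance: fraction of the top-event probability
# attributable to cut sets containing each basic event.
for event in basic_events:
    contrib = sum(p for mcs, p in zip(minimal_cut_sets, cut_set_probs) if event in mcs)
    print(f"  {event}: FV importance ~ {contrib / p_top:.2f}")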

302

Aerodynamic configuration design using response surface methodology analysis  

NASA Technical Reports Server (NTRS)

An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.

Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; Mcmillin, Mark M.; Unal, Resit

1993-01-01
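The central composite design step named in the record above can be sketched for a two-factor case: build the design, fit a quadratic response surface to the observed responses, and locate its stationary point. The response function below is a hypothetical stand-in for the vehicle analysis runs, not the study's actual objective.

# Minimal sketch of response surface methodology with a two-factor
# central composite design; the "response" is a hypothetical placeholder.
import numpy as np

def response(x1, x2):
    # Hypothetical dry-weight-like response evaluated at each design point.
    return 5.0 + 1.2*x1 - 0.8*x2 + 2.0*x1**2 + 1.5*x2**2 + 0.5*x1*x2

alpha = np.sqrt(2.0)  # rotatable axial distance for two factors
design = np.array(
    [[-1, -1], [1, -1], [-1, 1], [1, 1],                 # factorial points
     [-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha],   # axial points
     [0, 0], [0, 0], [0, 0]]                             # center replicates
)
y = np.array([response(x1, x2) for x1, x2 in design])

# Quadratic model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = design[:, 0], design[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b11, b22, b12 = coef

# Stationary point of the fitted surface (candidate optimum in coded units).
A = np.array([[2*b11, b12], [b12, 2*b22]])
x_star = np.linalg.solve(A, -np.array([b1, b2]))
print("Fitted coefficients:", np.round(coef, 3))
print("Stationary point (coded units):", np.round(x_star, 3))

In a real study the design-point responses come from the analysis codes rather than a closed-form function, and constraints on trim, stability, and landing speed are handled alongside the weight objective.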

303

A methodology for photovoltaic system reliability and economic analysis  

NASA Astrophysics Data System (ADS)

It is pointed out that the operation of large terrestrial photovoltaic (PV) power systems (over 50 peak kilowatts) is a fairly recent event. The present investigation provides a review of the characteristics of current and future PV systems and selects a methodology to allow the system designer to consider system reliability and maintenance parameters early in the design. The goal is to minimize the cost of system output energy over system life. Thus, cost per kW-hour is the objective function which PV system designers should minimize. Attention is given to an integrated reliability, availability, maintainability (RAM) model, which represents failure and repair, and provides the corrective maintenance portion of the yearly maintenance cost. The RAM model and a life-cycle energy cost model are combined to provide the information the designer needs for system optimization.

Stember, L. H.; Huss, W. R.; Bridgman, M. S.

1982-08-01
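The coupling of a reliability/availability/maintainability (RAM) estimate with a cost-per-kWh objective function, as described in the record above, can be sketched with simple arithmetic. All input values are hypothetical, and a full analysis would also discount future costs and model failure/repair in more detail.

# Minimal sketch of combining a RAM estimate with a life-cycle cost model
# to obtain cost per kWh. All numbers are hypothetical.

# RAM inputs (hypothetical).
mtbf_hours = 8000.0          # mean time between failures
mttr_hours = 60.0            # mean time to repair
availability = mtbf_hours / (mtbf_hours + mttr_hours)

# System and economic inputs (hypothetical).
rated_power_kw = 100.0       # peak output
capacity_factor = 0.22       # average fraction of rated output over a year
system_life_years = 20
capital_cost = 350_000.0     # installed cost, $
annual_om_cost = 4_000.0     # scheduled O&M, $/yr
repairs_per_year = 8760.0 / mtbf_hours
cost_per_repair = 1_500.0    # corrective maintenance, $

# Annual energy delivered, derated by availability.
annual_energy_kwh = rated_power_kw * capacity_factor * 8760.0 * availability

# Undiscounted life-cycle cost per kWh.
life_cycle_cost = capital_cost + system_life_years * (
    annual_om_cost + repairs_per_year * cost_per_repair)
cost_per_kwh = life_cycle_cost / (system_life_years * annual_energy_kwh)
print(f"Availability: {availability:.4f}")
print(f"Energy cost: ${cost_per_kwh:.3f}/kWh")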

304

Analysis of changes in the pilot population and general aviation accidents.  

PubMed

General aviation pilots have been involved in a steadily decreasing number of accidents over the past 20 years. Changes in the age distribution, certification, and flying habits of these pilots make direct comparison of accident statistics inaccurate. This study reviews changes in the pilot population over the past 20 years to analyze their impact on accident statistics. Pilot age and certificate distributions from 1968 to 1987 were assembled from annual Federal Aviation Administration (FAA) surveys. Information about pilots involved in accidents was collected from annual National Transportation Safety Board (NTSB) reports. Trends in pilot age distribution, certification, aircraft use, flight planning, and weather were reviewed. The accident experience from the first 5 years of the study period was used to construct an adjusted plot of expected aircraft accidents. From 1968 to 1987, the mean pilot age increased from 35 to 40 years and the number of pilots over the age of 60 increased five-fold. The number of pilots with Air Transport Pilot (ATP) certification tripled and instrument certification increased 80%. Accidents in which an Instrument Flight Rules (IFR) flight plan was filed increased from 3.6% to 6.6%, without a corresponding increase in the number of accidents in weather at or below instrument meteorological conditions (IMC). The accident experience from 1968 to 1973 predicted 116,000 accidents from 1968 to 1987. The actual number of accidents was 40% less than predicted. The average pilot age has increased both because of more pilots over the age of 50 and fewer young student pilots. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:1550539

Bruckart, J E

1992-01-01

305

TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events  

SciTech Connect

The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest, a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

2013-11-10

306

Analysis of the Loss of Forced Reactor Coolant Flow Accident in SMART using RETRAN-03/INT  

SciTech Connect

Small and medium integral-type nuclear reactors are attracting much attention for the peaceful use of nuclear energy in non-electric areas such as district heating, seawater desalination and ship propulsion. An integral-type nuclear co-generation reactor, SMART (System-integrated Modular Advanced ReacTor, 330 MWt), has been developed by KAERI (Korea Atomic Energy Research Institute) since 1996. In this study, a safety analysis of SMART is performed using a modified RETRAN-03 code, named RETRAN-03/INT, to examine the applicability of the code. For the safety analysis of an integral reactor with helical-coiled steam generators, RETRAN-03 has been modified and verified using experimental results. New heat transfer coefficients are added for the helical-coiled steam generator, and the steam generator heat transfer model is modified because the primary- and secondary-side heat flow differs from that of a U-tube steam generator. The loss of forced reactor coolant flow accident is selected for the safety analysis in this study, with the failure of one of the three trains of the passive residual heat removal system considered as the single failure. The results from the MARS/SMR code and the RETRAN-03/INT code are compared. (authors)

Kim, Tae-Wan; Suh, Kune-Yull; Lee, Un-Chul; Park, Goon-Cherl [Department of Nuclear Engineering, Seoul National University, San 56-1, Shinlim-dong, Kwanak-gu, Seoul 151-742 (Korea, Republic of); Kim, Jae-Hak [Future and Challenge Co., LTD., 130-202, San 56-1, Shinlim-dong, Kwanak-gu, Seoul 151-742 (Korea, Republic of)

2002-07-01

307

Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology  

NASA Technical Reports Server (NTRS)

The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods with a degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.

Atkins, H. L.

1997-01-01

308

An Analysis on the Characteristics of Boiling Liquid Expanding Vapor Explosion Accidents in Marine Transportation  

Microsoft Academic Search

A boiling liquid expanding vapor explosion (BLEVE) is a kind of disaster that may have serious consequences in the maritime transportation of liquefied petroleum gas and liquefied natural gas. Analyzing the accident characteristics, both the external environment and the internal causes, is basic to preventing, controlling and researching this type of accident. This paper analyzes the overall process by which a BLEVE occurs under the interaction of

Chen Sining; Duo Yinquan; Wei Lijun

2010-01-01

309

An exploratory multinomial logit analysis of single-vehicle motorcycle accident severity  

Microsoft Academic Search

Most previous research on motorcycle accident severity has focused on univariate relationships between severity and an explanatory variable of interest (e.g., helmet use). The potential ambiguity and bias that univariate analyses create in identifying the causality of severity have generated the need for multivariate analyses in which the effects of all factors that influence accident severity are considered. This paper

Venkataraman Shankar; Fred Mannering

1996-01-01
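A multinomial logit model of accident severity of the kind described in the record above can be sketched with statsmodels. The covariates and the data-generating mechanism below are hypothetical, chosen only to make the example self-contained.

# Minimal sketch of a multinomial logit model of crash severity on
# synthetic data with hypothetical covariates.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
crashes = pd.DataFrame({
    "helmet_use": rng.integers(0, 2, n),
    "speed_limit": rng.choice([40, 60, 80, 100], n),
    "alcohol_involved": rng.integers(0, 2, n),
})

# Hypothetical severity mechanism: 0 = no injury, 1 = injury, 2 = fatality.
risk = (-1.0 + 0.01 * crashes["speed_limit"]
        + 0.8 * crashes["alcohol_involved"] - 0.6 * crashes["helmet_use"])
probs = np.column_stack([np.ones(n), np.exp(risk), np.exp(risk - 1.5)])
probs /= probs.sum(axis=1, keepdims=True)
severity = np.array([rng.choice(3, p=p) for p in probs])

X = sm.add_constant(crashes)
model = sm.MNLogit(severity, X).fit(disp=False)
print(model.summary())

# Marginal effects show how each factor shifts the severity probabilities.
print(model.get_margeff().summary())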

310

Debris interactions in reactor vessel lower plena during a severe accident II. Integral analysis  

Microsoft Academic Search

The integral physico-numerical model for the reactor vessel lower head response has been exercised for the TMI-2 accident and possible severe accident scenarios in PWR and BWR designs. The proposed inherent cooling mechanism of the reactor material creep and subsequent water ingression implemented in this predictive model provides a consistent representation of how the debris was finally cooled in the

Kune Y. Suh; Robert E. Henry

1996-01-01

311

Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices  

Microsoft Academic Search

This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of the European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This

J. Brown; L. H. J. Goossens; B. C. P. Kraan

1997-01-01

312

Analysis of Control Rod Ejection Accidents in Large Boiling Water Reactors.  

National Technical Information Service (NTIS)

Expressions are evaluated for the control rod velocity as a function of the hydraulic pressure inside the control rod guide tube during a rod ejection accident and a rod drop accident. For the rod ejection transients, the effectiveness of the control rod velocity limit...

B. Thorlaksen

1976-01-01

313

Spatial Effects and Uncertainty Analysis for Rod Ejection Accidents in a PWR. International Agreement Report.  

National Technical Information Service (NTIS)

A rod ejection accident is a design-basis event for a pressurized water reactor. It is well known that spatial effects play a very important role in this accident. In this study, four cases using a model of Three Mile Island Unit 1 are considered: ejection...

A. A. Avvakumov; V. Malofeev; V. Sidorov

2007-01-01

314

Vergelijkende Analyse van Ongevallen met Zware Voertuigen (Comparative Analysis of Accidents with Heavy Vehicles).  

National Technical Information Service (NTIS)

The nature and extent of accidents involving heavy vehicles (trucks and buses) are studied. Two earlier reports describe how heavy vehicles function in traffic, and present global data on accidents with heavy vehicles in the Netherlands, Europe and the United S...

J. P. M. Tromp

1989-01-01

315

Scale, population, and spatial analysis: a methodological investigation  

Microsoft Academic Search

The analysis of aggregated data introduces challenges in spatial analysis that have been described as the modifiable areal unit problem (MAUP). MAUP consists of two distinct but related research problems: scale effects and zoning effects. Various challenges accompany the science of aggregating phenomena into units that can be analyzed; the most salient of these challenges encompass theory, data sources, and

Darren M. Ruddell; Elizabeth A. Wentz

2007-01-01

316

On the Application of Syntactic Methodologies in Automatic Text Analysis.  

ERIC Educational Resources Information Center

Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…

Salton, Gerard; And Others

1990-01-01

317

Maneuver analysis methodology to predict vehicle impacts on training lands  

Microsoft Academic Search

Tactical mobility analysis techniques were merged with land management strategies to assess potential impacts of vehicle operations on training areas for rangeland planning and management. A vehicle mobility analysis was performed for a suite of vehicle types using the NATO Reference Mobility Model (NRMM II). Input parameters include terrain information (soil type, slope, vegetation, surface roughness, soil strength), terrain surface

S. Shoop; R. Affleck; C. Collins; G. Larsen; L. Barna; P. Sullivan

2005-01-01

318

Methodology for computer aided fuzzy fault tree analysis  

Microsoft Academic Search

Probabilistic risk assessment (PRA) is a comprehensive, structured and logical analysis method aimed at identifying and assessing risks of complex process systems. PRA uses fault tree analysis (FTA) as a tool to identify basic causes leading to an undesired event, to represent logical dependency of these basic causes in leading to the event, and finally to calculate the probability of

Refaul Ferdous; Faisal Khan; Brian Veitch; Paul R. Amyotte

2009-01-01

319

Fault Tree Analysis: An Emerging Methodology for Instructional Science.  

ERIC Educational Resources Information Center

Describes Fault Tree Analysis, a tool for systems analysis which attempts to identify possible modes of failure in systems to increase the probability of success. The article defines the technique and presents the steps of FTA construction, focusing on its application to education. (RAO)

Wood, R. Kent; And Others

1979-01-01

320

A multivariate tobit analysis of highway accident-injury-severity rates.  

PubMed

Relatively recent research has illustrated the potential that tobit regression has in studying factors that affect vehicle accident rates (accidents per distance traveled) on specific roadway segments. Tobit regression has been used because accident rates on specific roadway segments are continuous data that are left-censored at zero (they are censored because accidents may not be observed on all roadway segments during the period over which data are collected). This censoring may arise from a number of sources, one of which being the possibility that less severe crashes may be under-reported and thus may be less likely to appear in crash databases. Traditional tobit-regression analyses have dealt with the overall accident rate (all crashes regardless of injury severity), so the issue of censoring by the severity of crashes has not been addressed. However, a tobit-regression approach that considers accident rates by injury-severity level, such as the rate of no-injury, possible injury and injury accidents per distance traveled (as opposed to all accidents regardless of injury-severity), can potentially provide new insights, and address the possibility that censoring may vary by crash-injury severity. Using five-year data from highways in Washington State, this paper estimates a multivariate tobit model of accident-injury-severity rates that addresses the possibility of differential censoring across injury-severity levels, while also accounting for the possible contemporaneous error correlation resulting from commonly shared unobserved characteristics across roadway segments. The empirical results show that the multivariate tobit model outperforms its univariate counterpart, is practically equivalent to the multivariate negative binomial model, and has the potential to provide a fuller understanding of the factors determining accident-injury-severity rates on specific roadway segments. PMID:22269492

Anastasopoulos, Panagiotis Ch; Shankar, Venky N; Haddock, John E; Mannering, Fred L

2012-03-01
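A univariate tobit regression, left-censored at zero, is the building block of the multivariate model described in the record above and can be sketched as a maximum-likelihood fit; the paper's multivariate version additionally correlates errors across severity levels. The data below are synthetic and the covariates hypothetical.

# Minimal sketch of a univariate tobit (left-censored at zero) regression
# of accident rates, estimated by maximum likelihood on synthetic data.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # const + 2 covariates
beta_true, sigma_true = np.array([0.5, 1.0, -0.7]), 1.0
y_star = X @ beta_true + rng.normal(scale=sigma_true, size=n)
y = np.maximum(y_star, 0.0)          # observed rates, censored at zero

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    mu = X @ beta
    ll = np.where(
        y > 0,
        stats.norm.logpdf(y, loc=mu, scale=sigma),   # uncensored observations
        stats.norm.logcdf((0.0 - mu) / sigma),       # latent rate below zero
    )
    return -ll.sum()

start = np.zeros(X.shape[1] + 1)
res = optimize.minimize(neg_loglik, start, method="BFGS")
beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
print("beta:", np.round(beta_hat, 3), " sigma:", round(sigma_hat, 3))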

321

Spatial analysis of image registration methodologies for fusion applications  

NASA Astrophysics Data System (ADS)

Data registration is the foundational step for fusion applications such as change detection, data conflation, ATR, and automated feature extraction. The efficacy of data fusion products can be limited by inadequate selection of the transformation model, or characterization of uncertainty in the registration process. In this paper, three components of image-to-image registration are investigated: 1) image correspondence via feature matching, 2) selection of a transformation function, and 3) estimation of uncertainty. Experimental results are presented for photogrammetric versus non-photogrammetric transfer of point features for four different sensor types and imaging geometries. The results demonstrate that a photogrammetric transfer model is generally more accurate at point transfer. Moreover, photogrammetric methods provide a reliable estimation of accuracy through the process of error propagation. Reliable local uncertainty derived from the registration process is particularly desirable information to have for subsequent fusion processes. To that end, uncertainty maps are generated to demonstrate global trends across the test images. Recommendations for extending this methodology to non-image data types are provided.

Doucette, Peter J.; Theiss, Henry J.; Mikhail, Edward M.; Motsko, Dennis J.

2011-05-01
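One non-photogrammetric piece of the registration workflow discussed in the record above, estimating an affine transformation from matched tie points and characterizing the fit uncertainty, can be sketched as follows. The tie points are synthetic and the error model is a simplification of the rigorous error propagation the paper advocates.

# Minimal sketch of affine image-to-image registration by least squares,
# with residual-based uncertainty, on synthetic tie points.
import numpy as np

rng = np.random.default_rng(3)
n = 12
src = rng.uniform(0, 1000, size=(n, 2))               # reference image points
A_true = np.array([[1.02, 0.05], [-0.04, 0.98]])
t_true = np.array([15.0, -8.0])
dst = src @ A_true.T + t_true + rng.normal(scale=0.5, size=(n, 2))

# Design matrix for x' = a*x + b*y + c and y' = d*x + e*y + f.
X = np.column_stack([src, np.ones(n)])
params, *_ = np.linalg.lstsq(X, dst, rcond=None)

residuals = dst - X @ params
dof = 2 * n - 6                             # observations minus parameters
sigma2 = np.sum(residuals**2) / dof         # reference variance of unit weight
cov_params = sigma2 * np.linalg.inv(X.T @ X)  # per-axis parameter covariance

rmse = np.sqrt(np.mean(np.sum(residuals**2, axis=1)))
print("Affine parameters (per output axis):\n", np.round(params.T, 4))
print(f"Tie-point RMSE: {rmse:.3f} px")
print("Parameter covariance (shared design):\n", np.round(cov_params, 6))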

322

Development of methodology for horizontal axis wind turbine dynamic analysis  

NASA Technical Reports Server (NTRS)

Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbine; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.

Dugundji, J.

1982-01-01

323

Progressive Failure Analysis Methodology for Laminated Composite Structures  

NASA Technical Reports Server (NTRS)

A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data, except in structural applications where interlaminar stresses are important, since these may cause failure mechanisms such as debonding or delamination.

Sleight, David W.

1999-01-01

324

Operator error and system deficiencies: analysis of 508 mining incidents and accidents from Queensland, Australia using HFACS.  

PubMed

Historically, mining has been viewed as an inherently high-risk industry. Nevertheless, the introduction of new technology and a heightened concern for safety have yielded marked reductions in accident and injury rates over the last several decades. In an effort to further reduce these rates, the human factors associated with incidents/accidents need to be addressed. A modified version of the Human Factors Analysis and Classification System was used to analyze incident and accident cases from across the state of Queensland to identify human factor trends and system deficiencies within mining. An analysis of the data revealed that skill-based errors were the most common unsafe act and showed no significant differences across mine types. However, decision errors did vary across mine types. Findings for unsafe acts were consistent across the time period examined. By illuminating human causal factors in a systematic fashion, this study has provided mine safety professionals the information necessary to further reduce mine incidents/accidents. PMID:20441855

Patterson, Jessica M; Shappell, Scott A

2010-07-01
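A cross-tabulation test of the kind used to ask whether an HFACS category (here, decision errors) varies across mine types can be sketched as follows. The counts are hypothetical illustrations, not the Queensland data.

# Minimal sketch of a chi-square test of independence between mine type
# and the presence of a coded decision error, using hypothetical counts.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: mine types (e.g., underground coal, open-cut coal, metalliferous).
# Columns: incidents with / without a decision error coded.
table = np.array([
    [34, 120],
    [21, 160],
    [45, 128],
])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value suggests decision-error involvement differs by mine type.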

325

Use of principal components analysis for reactor accident consequence evaluation and a comparison with other techniques  

SciTech Connect

The consequences of a potential reactor accident are normally characterized in terms of frequency distributions for exceeding specified surface air concentrations and deposition levels since these may be directly related to individual or population radiation exposures. Since an accidental release of radioactivity could occur at any time, the frequency distributions are determined by performing a large number of calculations that include a variety of possible release characteristics and meteorological situations. Performing such a large number of calculations is generally only feasible with relatively simple analytical models that utilize only the meteorological observations from the reactor site to describe the transport and dispersion of the radioactive material out to distances of about 100 km from the reactor. The purpose of this work was to investigate the possibility of utilizing three-dimensional models for consequence analysis, since these are capable of including meteorological data from multiple sites and the effects of topography on the transport and dispersion of airborne radioactivity over the region of concern. The approach to this problem was to investigate the feasibility of using the principal components analysis (PCA) technique for identifying wind patterns and their frequencies and temporal variations.

Gudiksen, P.H.; Walton, J.J.; Alpert, D.J.; Johnson, J.D.

1981-04-01

326

Overview of the Aerothermodynamics Analysis Conducted in Support of the STS-107 Accident Investigation  

NASA Technical Reports Server (NTRS)

A graphic presentation of the aerothermodynamics analysis conducted in support of the STS-107 accident investigation. Investigation efforts were conducted as part of an integrated AATS team (Aero, Aerothermal, Thermal, Stress) directed by OVEWG. Graphics presented are: STS-107 Entry trajectory and timeline (1st off-nominal event to Post-LOS); Indications from OI telemetry data; Aero/aerothermo/thermal analysis process; Selected STS-107 side fuselage/OMS pod off-nominal temperatures; Leading edge structural subsystem; Relevant forensics evidence; External aerothermal environments; STS-107 Pre-entry EOM3 heating profile; Surface heating and temperatures; Orbiter wing leading edge damage survey; Internal aerothermal environments; Orbiter wing CAD model; Aerodynamic flight reconstruction; Chronology of aerodynamic/aerothermodynamic contributions; Acreage TPS tile damage; Larger OML perturbations; Missing RCC panel(s); Localized damage to RCC panel/missing T-seal; RCC breach with flow ingestion; and Aero-aerothermal closure. NAIT served as the interface between the CAIB and NASA investigation teams, and CAIB requests for study were addressed.

Campbell, Charles H.

2004-01-01

327

Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter  

NASA Technical Reports Server (NTRS)

Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

2010-01-01

328

Homicide or accident off the coast of Florida: trauma analysis of mutilated human remains.  

PubMed

In the many years Dr. William R. Maples served as a forensic anthropologist, he saw diverse sources of trauma presented in the victims of violent crime, accident and suicide in the state of Florida. In 1996 the District 18 Medical Examiner's Office of Florida requested the assistance of Dr. Maples in the analysis of human remains recovered by the U.S. Coast Guard. The deceased was in an advanced state of decomposition characterized by skin slippage and discoloration. The torso bore multiple lacerations, including nearly parallel lacerations in the skin of the back. Specimens were carefully macerated and the fractures reconstructed. The skeletal trauma was caused by a device capable of delivering robust cuts and blunt trauma in linear paths, as is consistent with propeller trauma. Unusual in this case were blows to the ventral and dorsal surfaces of the body. Based on the anthropological analysis and interviews with the family of the deceased, the F.B.I. proceeded with the case as a homicide investigation. PMID:10432605

Stubblefield, P R

1999-07-01

329

Analysis on tank truck accidents involved in road hazardous materials transportation in china.  

PubMed

Objective: Due to the sheer size and capacity of the tanker and the properties of cargo transported in the tank, hazmat tanker accidents are more disastrous than other types of vehicle accidents. The aim of this study was to provide a current survey on the situation of accidents involving tankers transporting hazardous materials in China. Methods: Detailed descriptions of 708 tanker accidents associated with hazmat transportation in China from 2004 to 2011 were analyzed to identify causes, location, types, time of occurrence, hazard class for materials involved, consequences, and the corresponding probability. Results: Hazmat tanker accidents mainly occurred in eastern (38.1%) and southwest China (12.3%). The most frequent hazmat tanker accidents involved classes 2, 3, and 8. The predominant accident types were rollover (29.10%), run-off-the-road (16.67%), and rear-end collisions (13.28%), with a high likelihood of a large spill occurring. About 55.93% of the accidents occurred on freeways and class 1 roads, with the spill percentage reaching 75.00% and the proportion of spills that occurred in the total accidents amounting to 77.82%, of which 61.72% are considered large spills. The month with the highest accident probability was July (12.29%), and most crashes occurred during the early morning (4:00-6:00 a.m.) and midday (10:00 a.m.-12:00 p.m.) hours, 19.63% versus 16.10%. Human-related errors (73.8%) and vehicle-related defects (19.6%) were the primary reasons for hazmat tanker crashes. The most common outcome of a hazmat tanker accident was a spill without further events (55.51%), followed by a release with fire (7.77%) and a release with an explosion (2.54%). Conclusions: The safety situation of China's hazmat tanker transportation is grim. Such accidents not only have high spill percentages and consistently large spills but they can also cause serious consequences, such as fires and explosions. Improving the training of drivers and the quality of vehicles, deploying roll stability aids, enhancing vehicle inspection and maintenance, and developing good delivery schedules may all be considered effective measures for mitigating hazmat tanker accidents, especially severe crashes. PMID:24380669
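
A small, hypothetical tabulation of the kind used to derive such percentages: accident records grouped by type, with spill rates and shares of the total. The field names and rows are invented for illustration and do not reproduce the study's dataset.

```python
import pandas as pd

# Illustrative records only; the fields and rows are assumptions, not the study's dataset
df = pd.DataFrame({
    "type":  ["rollover", "run-off-road", "rear-end", "rollover", "other", "rollover"],
    "spill": [True, True, False, True, False, True],
})

summary = (df.groupby("type")
             .agg(n=("type", "size"), spill_rate=("spill", "mean"))
             .assign(share=lambda t: t["n"] / t["n"].sum()))
print(summary)
```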

Shen, Xiaoyan; Yan, Ying; Li, Xiaonan; Xie, Chenjiang; Wang, Lihua

2014-10-01

330

Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications  

PubMed Central

Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. The concentrations of volatile organic compounds (VOCs) in breath are assessed with acceptable accuracy using analytical techniques offering high sensitivity, accuracy, and precision together with low response times and low detection limits, characteristics that are desirable for detecting VOCs in human breath. "Breath fingerprinting", indicative of a specific clinical status, relies on the use of multivariate statistics methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles.

Lourenco, Celia; Turner, Claire

2014-01-01

331

The impact of methodological factors on child psychotherapy outcome research: A meta-analysis for researchers  

Microsoft Academic Search

Two recent meta-analyses have generated evidence for child and adolescent psychotherapy effects. However, critics note that such meta-analyses often include studies with methodological shortcomings which might invalidate their results. In the present study, we explored whether the results of the most extensive child/adolescent meta-analysis might have been influenced by such methodological variables, focusing on internal validity and external validity factors.

Bahr Weiss; John R. Weisz

1990-01-01

332

The AECL's research reactor analysis methodology  

Microsoft Academic Search

As the cost of developing completely new computer codes becomes prohibitive, designers of nuclear facilities are turning to more cost-effective approaches for meeting increasingly strict regulatory requirements applied to safety-related analysis. For designing and licensing the MAPLE family of research reactors, Atomic Energy of Canada Ltd. (AECL) is employing the strategy of adapting major existing codes by linking them together

Wilkin

1995-01-01

333

A fault injection analysis of Virtex FPGA TMR design methodology  

Microsoft Academic Search

This paper presents the meaningful results of a single bit upset fault injection analysis performed in Virtex FPGA triple modular redundancy (TMR) design. Each programmable bit upset able to cause an error in the TMR design has been investigated. Final conclusion using the TMR

F. Lima; C. Carmichael; J. Fabula; R. Padovani; R. Reis

2001-01-01

334

Application of path analysis methodology to transit system maintenance performance  

Microsoft Academic Search

This paper develops a conceptual framework for bus maintenance based on path analysis and applies it to forty-eight bus transit systems. The application determines the total, direct, and indirect effects of the variables identified as having significant causal links with maintenance cost per mile. These variables are identified using the stepwise regression method. The findings are that the wage rate

Kofi Obeng

1988-01-01

335

A methodology for stock market analysis utilizing rough set theory  

Microsoft Academic Search

Quants are aiding brokers and investment managers for stock market analysis and prediction. The Quant's black magic stems from many of the evolving artificial intelligence (AI) techniques. Extensive literature exists describing attempts to use AI techniques, and in particular neural networks, for analyzing stock market variations. The main problem with neural networks, however, is the tremendous difficulty in interpreting the

Robert H. Golan; W. Ziarko

1995-01-01

336

Fault Tree Analysis: An emerging methodology for instructional science  

Microsoft Academic Search

Fault Tree Analysis is a systematic approach to improving the probability of success in any system. FTA was first developed as part of the U.S. space industry and was applied to such programs as the Minuteman missile evaluations. Kent G. Stephens has successfully applied the technique to instructional and administrative programs, the latest program being the development of an

R. Kent Wood; Kent G. Stephens; Bruce O. Barker

1979-01-01

337

Methodological Approach for Conducting a Business Case Analysis for the Advanced Technology Ordnance Surveillance (ATOS) Advanced Concept Technology Demonstration (ACTD).  

National Technical Information Service (NTIS)

The purpose of this thesis is to provide a methodological approach for conducting a Business Case Analysis (BCA) for the Advanced Technology Ordnance Surveillance (ATOS) Advanced Concept Technology Demonstration (ACTD). This study provides a methodology f...

G. E. Kratzer

2005-01-01

338

Routes to failure: analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system.  

PubMed

The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents; however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents shows great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model that suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions in upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents. PMID:18329391
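
Associations between adjacent HFACS levels, such as unsafe supervision and preconditions for unsafe acts, are commonly quantified with a 2x2 association test; the sketch below uses Fisher's exact test on hypothetical counts (not the paper's data).

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table (illustrative counts, not the paper's data):
# rows = unsafe supervision present/absent,
# columns = precondition for unsafe act present/absent
table = [[18,  5],
         [ 7, 11]]
odds_ratio, p = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.3f}")
# A significant association of this kind is the evidence behind the 'routes to failure' pattern.
```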

Li, Wen-Chin; Harris, Don; Yu, Chung-San

2008-03-01

339

Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology  

NASA Technical Reports Server (NTRS)

The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures; and development of an accurate, efficient analysis, design, and optimization tool for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

Knight, Norman F.

1998-01-01

340

Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis  

NASA Technical Reports Server (NTRS)

This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

Babcock, P.; Schor, A.; Rosch, G.

1998-01-01

341

Lower head creep rupture failure analysis associated with alternative accident sequences of the Three Mile Island Unit 2  

SciTech Connect

The objective of this lower head creep rupture analysis is to assess the current version of MELCOR 1.8.5-RG against SCDAP/RELAP5 MOD 3.3kz. The purpose of this assessment is to investigate the current MELCOR in-vessel core damage progression phenomena, including the model for the formation of a molten pool. The model for stratified molten pool natural heat transfer will be included in the next MELCOR release. Presently, MELCOR excludes the gap heat-transfer model for the cooling associated with the narrow gap between the debris and the lower head vessel wall. All these phenomenological models are already treated in SCDAP/RELAP5 using the COUPLE code to model the heat transfer of the relocated debris with the lower head based on a two-dimensional finite-element method. The assessment should determine if current MELCOR capabilities adequately cover core degradation phenomena appropriate for the consolidated MELCOR code. Inclusion of these features should bring MELCOR much closer to a state of parity with SCDAP/RELAP5 and is an element of the ongoing MELCOR code consolidation effort. This assessment deals with the following analysis of the Three Mile Island Unit 2 (TMI-2) alternative accident sequences. The TMI-2 alternative accident sequence-1 includes the continuation of the base case of the TMI-2 accident with the Reactor Coolant Pumps (RCP) tripped, and the High Pressure Injection System (HPIS) throttled after approximately 6000 s accident time, while in the TMI-2 alternative accident sequence-2, the reactor coolant pumps are tripped after 6000 s and the HPIS is activated after 12,012 s. The lower head temperature distributions calculated with SCDAP/RELAP5 are visualized and animated with open source visualization freeware 'OpenDX'. (author)

Sang Lung, Chan [Swiss Federal Institute of Technology Zurich and Swiss Federal Nuclear Safety Inspectorate, Zurich, Switzerland, 8001 (Switzerland)

2004-07-01

342

Improved finite element methodology for integrated thermal structural analysis  

NASA Technical Reports Server (NTRS)

An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.
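
For context, a minimal conventional 1-D linear steady-state conduction finite element solve is sketched below, the baseline case for which the elements described above reproduce exact nodal temperatures; the mesh, conductivity, and boundary values are arbitrary illustrations.

```python
import numpy as np

def solve_1d_conduction(nodes, k, q, t_left, t_right):
    """Assemble standard linear 1-D conduction elements (unit area) with a uniform
    volumetric source q, apply Dirichlet temperatures at both ends, and solve."""
    n = len(nodes)
    K = np.zeros((n, n))
    f = np.zeros(n)
    for e in range(n - 1):
        length = nodes[e + 1] - nodes[e]
        K[e:e + 2, e:e + 2] += (k / length) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        f[e:e + 2] += q * length / 2.0
    for idx, t in ((0, t_left), (n - 1, t_right)):   # Dirichlet boundary conditions
        K[idx, :] = 0.0
        K[idx, idx] = 1.0
        f[idx] = t
    return np.linalg.solve(K, f)

x = np.linspace(0.0, 1.0, 5)
print(solve_1d_conduction(x, k=1.0, q=0.0, t_left=100.0, t_right=0.0))
```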

Dechaumphai, P.; Thornton, E. A.

1982-01-01

343

Thermoluminescence Dose Response: Experimental Methodology, Data Analysis, Theoretical Interpretation  

NASA Astrophysics Data System (ADS)

The parameters, Dth, Dc, Dm, f(D) and f(D)max., describing the characteristics of TL dose response are defined and a short survey of the literature concerning the dose response of the major TL glow peaks in LiF:Mg,Ti (TLD-100), peaks 5, 7 and 8 is presented. The experimental parameters and details of the analysis affecting the dose response are outlined. The importance of theoretical interpretation of the dose response in the determination of the dose response parameters is demonstrated and an in-depth introduction to the Unified Interaction Model is described. The dose response as a function of photon energy is analysed for peaks 5, 7 and 8 and the impact of the method of data analysis on the description of f(D) and especially the determination of Dc is emphasized.

Horowitz, Yigal S.; Datz, Hanan

2011-05-01

344

Market analysis methodology: a utility case study. Final report  

SciTech Connect

The case study described in this report was conducted as part of EPRI Project RP1634 - Analytic Methods Used Outside the Electric Utility Industry. The primary objectives of the project were to: (1) explore planning and analysis techniques in use outside the utility industry, (2) identify those techniques which show promise for addressing utility issues, and (3) test them in actual utility situations to understand their real value, and the issues associated with adapting them to utility use.

Diamond, M.

1985-02-01

345

Analysis methodology for the post-trip return to power steam line break event.  

National Technical Information Service (NTIS)

An analysis for Steam Line Break (SLB) events which result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. Analysis methodology for post-trip RTP SLB is quite different f...

C. S. Lee C. W. Kim H. K. You

1996-01-01

346

CPAM: a common power analysis methodology for high-performance VLSI design  

Microsoft Academic Search

A common power analysis methodology has been developed to estimate on-chip thermal power, identify potential electromigration problems, and minimize power supply noise. Comprehensive analysis results from CPAM provide the critical data needed to improve system performance and reliability for high-end processor design

J. S. Neely; H. H. Chen; S. G. Walker; J. Venuto; T. J. Bucelot

2000-01-01

347

Seismic damage hazard analysis for requalification of nuclear power plant structures: methodology and application  

Microsoft Academic Search

A methodology for the evaluation of the annual probability of occurrence of post-elastic seismic damage in realistic structures is presented. The seismic damage hazard analysis (SDHA) is carried out here by coupling conventional seismic hazard analysis (SHA) for the site and the structural response to earthquakes of different intensities. The structural performance is statistically investigated by conducting appropriate non-linear dynamic

P. Bazzurro; C. A. Cornell; D. Diamantidis; G. M. Manfredini

1996-01-01

348

Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool.  

National Technical Information Service (NTIS)

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SN...

I. K. Madni F. Eltawila

1994-01-01

349

Comprehensive Analysis of General Aviation Accidents Volume 2: Pilot Experience and Aircraft Complexity.  

National Technical Information Service (NTIS)

The research team at Embry-Riddle Aeronautical University (ERAU) conducted a series of analyses to find patterns and associations among general aviation (GA) accidents. This research is intended to provide the Federal Aviation Administration (FAA) with an...

A. Singh H. Kosalim M. Bazargan M. Williams

2012-01-01

350

Comprehensive Analysis of General Aviation Accidents Volume 1: Trends, Distributions, and Causes.  

National Technical Information Service (NTIS)

Embry-Riddle Aeronautical University (ERAU) conducted a series of analyses to find patterns and associations among general aviation (GA) accidents. This research is intended to provide the Federal Aviation Administration (FAA) with analyses of Fatal, Seri...

A. Singh H. Kosalim M. Bazargan M. Williams

2012-01-01

351

Hypothetical accident condition thermal analysis and testing of a Type B drum package.  

National Technical Information Service (NTIS)

A thermophysical property model developed to analytically determine the thermal response of cane fiberboard when exposed to temperatures and heat fluxes associated with the 10 CFR 71 hypothetical accident condition (HAC) has been benchmarked against two T...

S. J. Hensel M. N. Alstine R. J. Gromada

1995-01-01

352

Application of Control Chart Techniques to the Analysis of Traffic Accident Data for Selective Enforcement Purposes.  

National Technical Information Service (NTIS)

The report contains a development of specialized control chart techniques together with various illustrations of the techniques utilizing traffic accident data from a section of California Interstate 5. More specifically, in Chapter I statistical control ...
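
A Shewhart-style count chart of the kind such selective-enforcement screening builds on might look like the sketch below; the monthly counts are invented, and the c-chart limits are the textbook Poisson-based form rather than necessarily the report's exact formulation.

```python
import numpy as np

# Hypothetical monthly accident counts for one highway section (illustrative only)
counts = np.array([4, 6, 3, 5, 7, 4, 5, 14, 4, 6, 5, 3])

c_bar = counts.mean()
ucl = c_bar + 3 * np.sqrt(c_bar)           # Shewhart c-chart limits for Poisson-like counts
lcl = max(c_bar - 3 * np.sqrt(c_bar), 0.0)

flagged = np.where(counts > ucl)[0]
print(f"mean = {c_bar:.1f}, UCL = {ucl:.1f}, months above UCL: {flagged.tolist()}")
# Months above the upper control limit would be candidates for selective enforcement attention.
```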

G. R. Fisher W. W. Mosher

1968-01-01

353

A QUALITATIVE APPROACH TO UNCERTAINTY ANALYSIS FOR THE PWR ROD EJECTION ACCIDENT  

SciTech Connect

In order to understand best-estimate calculations of the peak local fuel enthalpy during a rod ejection accident, an assessment of the uncertainty has been completed. The analysis took into account point kinetics parameters which would be available from a three-dimensional core model and engineering judgment as to the uncertainty in those parameters. Sensitivity studies to those parameters were carried out using the best-estimate code PARCS. The results showed that the uncertainty (corresponding to one standard deviation) in local fuel enthalpy would be determined primarily by the uncertainty in ejected rod worth and delayed neutron fraction. For an uncertainty in the former of 8% and the latter of 5%, the uncertainty in fuel enthalpy varied from 51% to 69% for control rod worth varying from $1.2 to $1.0. Also considered in the uncertainty were the errors introduced by uncertainties in the Doppler reactivity coefficient, the fuel pellet specific heat, and assembly and fuel pin peaking factors.
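
A sketch of the kind of first-order uncertainty propagation implied above, combining parameter uncertainties through assumed sensitivity coefficients in quadrature; the coefficients are illustrative stand-ins and are not taken from the PARCS sensitivity studies.

```python
import math

# Illustrative 1-sigma parameter uncertainties (percent) and assumed sensitivity coefficients
# (percent change in peak fuel enthalpy per percent change in the parameter). These numbers
# are stand-ins, not results of the PARCS sensitivity studies.
uncertainty = {"ejected_rod_worth": 8.0, "delayed_neutron_fraction": 5.0, "doppler_coeff": 10.0}
sensitivity = {"ejected_rod_worth": 6.5, "delayed_neutron_fraction": 5.0, "doppler_coeff": 1.0}

# Propagate independent uncertainties in quadrature to a 1-sigma enthalpy uncertainty
variance = sum((sensitivity[p] * uncertainty[p]) ** 2 for p in uncertainty)
print(f"1-sigma uncertainty in peak fuel enthalpy: {math.sqrt(variance):.1f}%")
```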

Diamond, D.J.; Aronson, A.; Yang, C.

2000-06-19

354

Analysis of Kuosheng Station Blackout Accident Using MELCOR 1.8.4  

Microsoft Academic Search

The MELCOR code, developed by Sandia National Laboratories, is a fully integrated, relatively fast-running code that models the progression of severe accidents in commercial light water nuclear power plants (NPPs). A specific station blackout (SBO) accident for Kuosheng (BWR-6) NPP is simulated using the MELCOR 1.8.4 code. The MELCOR input deck for Kuosheng NPP is established based on Kuosheng NPP design

S.-J. Wang; C.-S. Chien; T.-C. Wang; K.-S Chiang

2000-01-01

355

Sleep, watchkeeping and accidents: a content analysis of incident at sea reports  

Microsoft Academic Search

The unique profession of seafaring involves rest and sleep in a 24-h-a-day work environment that usually involves time-zone crossings, noise, heat, cold and motion. Sleep under such conditions is often difficult to obtain, and sleeping and sleep loss are often related to fatigue and contributory to accidents. This study aims to determine how accident investigators report sleep in Incident at

Richard Phillips

2000-01-01

356

Accident analysis of large-scale technological disasters applied to an anaesthetic complication  

Microsoft Academic Search

The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of

Chris J. Eagle; Jan M. Davies; J. Reason

1992-01-01

357

Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications  

NASA Technical Reports Server (NTRS)

An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions, this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
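
A minimal sketch of the incremental ('delta' or correction) form of an iterative linear solve: each pass solves an approximate operator for a correction driven by the residual of the exact system. The diagonal stand-in for the approximate factorization and the test matrix are assumptions for illustration only, not the CFD sensitivity equations themselves.

```python
import numpy as np

def delta_form_solve(A, b, M, tol=1e-10, max_iter=200):
    """Incremental ('delta'/correction) iteration: each pass solves the approximate
    operator M for a correction driven by the residual of the exact system A x = b."""
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - A @ x                 # residual of the exact (standard-form) equations
        if np.linalg.norm(r) < tol:
            break
        x += np.linalg.solve(M, r)    # correction from the approximate operator
    return x

rng = np.random.default_rng(1)
A = 4.0 * np.eye(50) + rng.normal(scale=0.02, size=(50, 50))
M = np.diag(np.diag(A))               # crude stand-in for an approximate factorization
b = rng.normal(size=50)
x = delta_form_solve(A, b, M)
print("final residual norm:", np.linalg.norm(A @ x - b))
```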

Taylor, Arthur C., III; Hou, Gene W.

1996-01-01

358

The AECL's research reactor analysis methodology  

SciTech Connect

As the cost of developing completely new computer codes becomes prohibitive, designers of nuclear facilities are turning to more cost-effective approaches for meeting increasingly strict regulatory requirements applied to safety-related analysis. For designing and licensing the MAPLE family of research reactors, Atomic Energy of Canada Ltd. (AECL) is employing the strategy of adapting major existing codes by linking them together within networks of custom-built interface software. This approach builds on the international investment in developing, maintaining, and verifying existing primary codes and focuses on the less onerous development of interface codes. The resultant code systems are then validated for the new applications of interest.

Wilkin, G.B. [Atomic Energy of Canada Ltd., Manitoba (Canada)

1995-12-31

359

Finite element methodology for integrated flow-thermal-structural analysis  

NASA Technical Reports Server (NTRS)

Papers entitled, An Adaptive Finite Element Procedure for Compressible Flows and Strong Viscous-Inviscid Interactions, and An Adaptive Remeshing Method for Finite Element Thermal Analysis, were presented at the June 27 to 29, 1988, meeting of the AIAA Thermophysics, Plasma Dynamics and Lasers Conference, San Antonio, Texas. The papers describe research work supported under NASA/Langley Research Grant NsG-1321, and are submitted in fulfillment of the progress report requirement on the grant for the period ending February 29, 1988.

Thornton, Earl A.; Ramakrishnan, R.; Vemaganti, G. R.

1988-01-01

360

Biomechanical analysis of occupant kinematics in rollover motor vehicle accidents: dynamic spit test.  

PubMed

A better understanding of occupant kinematics in rollover accidents helps to advance biomechanical knowledge and to enhance the safety features of motor vehicles. While many rollover accident simulation studies have adopted the static approach to delineate the occupant kinematics in rollover accidents, very few studies have attempted the dynamic approach. The present work was designed to study the biomechanics of restrained occupants during rollover accidents using the steady-state dynamic spit test and to address the importance of keeping the lap belt fastened. Experimental tests were conducted using an anthropometric 50% Hybrid III dummy in a vehicle. The vehicle was rotated at 180 degrees/second and the dummy was restrained using a standard three-point restraint system. The lap belt of the dummy was fastened either by using the cinching latch plate or by locking the retractor. Three configurations of shoulder belt harness were simulated: shoulder belt loose on chest with cinch plate, shoulder belt under the left arm and shoulder belt behind the chest. In all tests, the dummy stayed within the confinement of the vehicle indicating that the securely fastened lap belt holds the dummy with dynamic movement of 3 1/2" to 4". The results show that occupant movement in rollover accidents is least affected by various shoulder harness positions with a securely fastened lap belt. The present study forms a first step in delineating the biomechanics of occupants in rollover accidents. PMID:15850090

Sances, Anthony; Kumaresan, Srirangam; Clarke, Richard; Herbst, Brian; Meyer, Steve

2005-01-01

361

Integrated modeling and analysis methodology for precision pointing applications  

NASA Astrophysics Data System (ADS)

Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

Gutierrez, Homero L.

2002-07-01

362

Probabilistic design analysis using Composite Loads Spectra (CLS) coupled with Probabilistic Structural Analysis Methodologies (PSAM)  

NASA Technical Reports Server (NTRS)

Composite loads spectra (CLS) were applied to generate probabilistic loads for use in the PSAM nonlinear evaluation of stochastic structures under stress (NESSUS) finite element code. The CLS approach allows for quantifying loads as mean values and distributions around a central value rather than maximum or enveloped values typically used in deterministic analysis. NESSUS uses these loads to determine mean and perturbation responses. These results are probabilistically evaluated with the distributional information from CLS using a fast probabilistic integration (FPI) technique to define response distributions. The main example discussed describes a method of obtaining load descriptions and stress response of the second-stage turbine blade of the Space Shuttle Main Engine (SSME) high-pressure fuel turbopump (HPFTP). Additional information is presented on the on-going analysis of the high pressure oxidizer turbopump discharge duct (HPOTP) where probabilistic dynamic loads have been generated and are in the process of being used for dynamic analysis. Example comparisons of load analysis and engine data are furnished for partial verification and/or justification for the methodology.
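
A toy Monte Carlo version of the idea of propagating load distributions, rather than enveloped values, through a response model to obtain a response distribution; the load parameters, surrogate stress relation, and limit below are invented and unrelated to the CLS/NESSUS models.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Loads described by means and scatter rather than enveloped maxima (values are invented)
pressure = rng.normal(loc=20.0, scale=1.5, size=n)        # MPa
temperature = rng.normal(loc=900.0, scale=30.0, size=n)   # K

# Toy surrogate response: stress as a linear perturbation about the mean operating point
stress = 150.0 + 4.0 * (pressure - 20.0) + 0.2 * (temperature - 900.0)   # MPa

limit = 165.0   # assumed allowable, MPa
print(f"mean stress = {stress.mean():.1f} MPa, P(stress > limit) = {(stress > limit).mean():.4f}")
```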

Newell, J. F.; Rajagopal, K. R.; Ho, H.

1989-01-01

363

An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.  

SciTech Connect

This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.

Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

2003-09-01

364

Accident investigation  

NASA Technical Reports Server (NTRS)

The National Transportation Safety Board (NTSB) has cited wind shear as a cause or contributing factor in 15 accidents involving transport-category airplanes since 1970. Nine of these were nonfatal, but the other six accounted for 440 lives. Five of the fatal accidents and seven of the nonfatal accidents involved encounters with convective downbursts or microbursts. Of the remaining accidents, two nonfatal ones were encounters with frontal-system shear, and one fatal accident was the result of terrain-induced wind shear. These accidents are discussed with reference to helping the pilot avoid the wind shear or, where avoidance is impossible, to get through it.

Laynor, William G. Bud

1987-01-01

365

A faster reactor transient analysis methodology for PCs  

SciTech Connect

The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the "quadratic dynamics equation." This model forms the basis for the GW-BASIC program LTC (LMR Transient Calculation), which can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report.
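
The prompt jump approximation mentioned above has a simple closed form for a step reactivity change below prompt critical; the sketch below evaluates it for assumed, illustrative values and is not an excerpt of the LTC program.

```python
# Prompt jump approximation for a step reactivity change well below prompt critical:
# neglecting the prompt-neutron time derivative gives  P+ = P0 * beta / (beta - rho).
# The numbers below are illustrative and unrelated to the LTC program.
beta = 0.0065    # effective delayed neutron fraction
rho = -0.002     # step reactivity (negative, as with strong negative feedback)
p0 = 1.0         # relative power before the step

p_plus = p0 * beta / (beta - rho)
print(f"relative power immediately after the prompt jump: {p_plus:.3f}")
```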

Ott, K.O. (Purdue Univ., Lafayette, IN (United States). School of Nuclear Engineering)

1991-10-01

366

Methodology for analysis and simulation of large multidisciplinary problems  

NASA Technical Reports Server (NTRS)

The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

1989-01-01

367

Improved finite element methodology for integrated thermal structural analysis  

NASA Technical Reports Server (NTRS)

An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analyses is presented. New thermal finite elements which yield exact nodal and element temperature for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal-structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

Dechaumphai, P.; Thornton, E. A.

1982-01-01

368

Analysis methodology and recent results of the IGS network combination  

NASA Astrophysics Data System (ADS)

A working group of the International GPS Service (IGS) was created to look after Reference Frame (RF) issues and contribute to the densification and improvement of the International Terrestrial Reference Frame (ITRF). One important objective of the Reference Frame Working Group is to generate consistent IGS station coordinates and velocities, Earth Rotation Parameters (ERP) and geocenter estimates along with the appropriate covariance information. These parameters have a direct impact on other IGS products such as the estimation of GPS satellite ephemerides, as well as satellite and station clocks. The information required is available weekly from the Analysis Centers (AC) (cod, emr, esa, gfz, jpl, ngs, sio) and from the Global Network Associate Analysis Centers (GNAAC) (JPL, mit, ncl) using a "Software Independent Exchange Format" (SINEX). The AC are also contributing daily ERPs as part of their weekly submission. The procedure in place simultaneously combines the weekly station coordinates, geocenter and daily ERP estimates. A cumulative solution containing station coordinates and velocity is also updated with each weekly combination. This provides a convenient way to closely monitor the quality of the estimated station coordinates and to have an up to date cumulative solution available at all times. To provide some necessary redundancy, the weekly station coordinates solution is compared against the GNAAC solutions. Each of the 3 GNAAC uses its own software, allowing independent verification of the combination process. The RMS of the coordinate differences in the north, east and up components between the AC/GNAAC and the ITRF97 Reference Frame Stations are 4-10 mm, 5-20 mm and 6-25 mm. The station velocities within continental plates are compared to the NNR-NUVEL1A plate motion model (DeMets et al., 1994). The north, east and up velocity RMS are 2 mm/y, 3 mm/y and 8 mm/y. Note that NNR-NUVEL1A assumes a zero vertical velocity.

Ferland, R.; Kouba, J.; Hutchison, D.

2000-11-01

369

Accident analysis and control options in support of the sludge water system safety analysis  

Microsoft Academic Search

A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container.

2003-01-01

370

Acid rain research: a review and analysis of methodology  

SciTech Connect

The acidic deposition phenomenon, when implicated as a factor potentially responsible for crop and forest yield losses and destruction of aquatic life, has gained increasing attention. The widespread fear that acid rain is having or may have devastating effects has prompted international debates and legislative proposals. An analysis of research on the effects of acid rain, however, reveals serious questions concerning the applicability and validity of the conclusions of much of the work, and thus conclusive estimations of impacts are lacking. In order to establish cause-effect relationships between rain acidity and the response of a receptor, controlled studies are necessary to verify observations in the field, since there are many natural processes that produce and consume acidity and because numerous other environmental variables affect ecosystem response. Only when the response of an entire system is understood (i.e., interactions between plant, soil, soil microbes, and groundwater) can economic impacts be assessed and tolerance thresholds established for the wet deposition of acids. 14 references, 5 figures, 1 table.

Irving, P.M.

1983-01-01

371

Robust methodology for fractal analysis of the retinal vasculature.  

PubMed

We have developed a robust method to perform retinal vascular fractal analysis from digital retina images. The technique preprocesses the green-channel retina images with Gabor wavelet transforms to enhance the retinal images. The Fourier fractal dimension is computed on these preprocessed images and does not require any segmentation of the vessels. This novel technique requires human input only at a single step: the allocation of the optic disk center. We have tested this technique on 380 retina images from healthy individuals aged 50+ years, randomly selected from the Blue Mountains Eye Study population. To assess its reliability against different allocations of the optic disk center, we performed pair-wise Pearson correlations between the fractal dimension estimates from 100 simulated regions of interest for each of the 380 images, with Gaussian variation in the optic center allocation in each simulation. The resulting mean correlation coefficient (standard deviation) was 0.93 (0.005). The repeatability of this method was found to be better than the earlier box-counting method. Using this method to assess retinal vascular fractals, we have also confirmed a reduction in retinal vascular complexity with aging, consistent with observations from other human organ systems. PMID:20851791
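
A sketch of a Fourier-based fractal dimension estimate of the general kind described, computed from the slope of the radially averaged power spectrum. The slope-to-dimension constant follows the fractional-Brownian-surface convention and, like the random test image, is an assumption rather than the paper's exact procedure (which also includes Gabor wavelet preprocessing).

```python
import numpy as np

def fourier_fractal_dimension(img):
    """Estimate a fractal dimension from the slope of the radially averaged 2-D power
    spectrum. The slope-to-dimension relation used here (D = (7 + slope)/2, i.e. the
    fractional-Brownian-surface convention with power ~ k**(-beta)) is an assumption."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    ny, nx = img.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
    radial = np.bincount(r.ravel(), power.ravel()) / np.maximum(np.bincount(r.ravel()), 1)
    k = np.arange(1, min(nx, ny) // 2)
    slope, _ = np.polyfit(np.log(k), np.log(radial[1:len(k) + 1]), 1)
    return (7.0 + slope) / 2.0

rng = np.random.default_rng(0)
print(fourier_fractal_dimension(rng.random((256, 256))))   # demo on a random test image
```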

Azemin, M Z Che; Kumar, D K; Wong, T Y; Kawasaki, R; Mitchell, P; Wang, J J

2011-02-01

372

The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications  

NASA Technical Reports Server (NTRS)

The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

Evers, Ken H.; Bachert, Robert F.

1987-01-01

373

Accident analysis of large-scale technological disasters applied to an anaesthetic complication.  

PubMed

The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of failures, both of which must be sought. The first group, active failures, consists of mistakes made by practitioners in the provision of care. The second group, latent failures, represents flaws in the administrative and productive system. The model emphasizes the search for latent failures and shows that prevention of active failures alone is insufficient to avoid further accidents if latent failures persist unchanged. These key features and the utility of this model are illustrated by application to a case of aspiration of gastric contents. While four active failures were recognized, an equal number of latent failures also became apparent. The identification of both types of failures permitted the formulation of recommendations to avoid further occurrences. Thus this model of accident causation can provide a useful mechanism to investigate and possibly prevent anaesthetic accidents. PMID:1544192

Eagle, C J; Davies, J M; Reason, J

1992-02-01

374

Methodology for object-oriented real-time systems analysis and design: Software engineering  

NASA Technical Reports Server (NTRS)

Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification and perhaps high-level design are non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
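
A minimal sketch of a 'systems-analysis object' in the sense described above: an entity whose time behaviour is a set of states plus state-transition rules. The class, states, and events are invented purely for illustration and are not part of the cited methodology's notation.

```python
class AnalysisObject:
    """Sketch of a 'systems-analysis object': states plus state-transition rules."""

    def __init__(self, name, transitions, state):
        self.name = name
        self.transitions = transitions   # {(state, event): next_state}
        self.state = state

    def on_event(self, event):
        # Unknown (state, event) pairs leave the state unchanged in this sketch
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

valve = AnalysisObject(
    "coolant_valve",
    {("closed", "open_cmd"): "opening",
     ("opening", "limit_switch"): "open",
     ("open", "close_cmd"): "closed"},
    state="closed",
)
print(valve.on_event("open_cmd"), valve.on_event("limit_switch"))
```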

Schoeffler, James D.

1991-01-01

375

Analysis of an unmitigated large break loss of coolant accident (LBLOCA) with the non-mechanistic failure of passive cooling for the APT Spallation Target  

Microsoft Academic Search

In order to support the Programmatic Environmental Impact Statement, an accident analysis has been performed for the Accelerator Production of Tritium (APT) Spallation-Induced Lithium Conversion (SILC) Source. This report presents a lumped-parameter analysis that predicts the thermal response of the source to a large-break LOCA. The accident scenario assumes the break to occur in the cold leg outside the source

N. K. Tutu; G. A. Greene; R. W. Youngblood

1995-01-01

376

Analysis of main steam isolation valve leakage in design basis accidents using MELCOR 1.8.6 and RADTRAD.  

SciTech Connect

Analyses were performed using MELCOR and RADTRAD to investigate main steam isolation valve (MSIV) leakage behavior under design basis accident (DBA) loss-of-coolant (LOCA) conditions that are presumed to have led to a significant core melt accident. Doses to the control room, site boundary, and LPZ are examined using both approaches described in current regulatory guidelines as well as analyses based on best estimate source term and system response. At issue is the current practice of using containment airborne aerosol concentrations as a surrogate for the in-vessel aerosol concentration that exists in the near vicinity of the MSIVs. This study finds current practice using the AST-based containment aerosol concentrations for assessing MSIV leakage is non-conservative and conceptually in error. A methodology is proposed that scales the containment aerosol concentration to the expected vessel concentration in order to preserve the simplified use of the AST in assessing containment performance under assumed DBA conditions. This correction is required during the first two hours of the accident while the gap and early in-vessel source terms are present. It is general practice to assume that at approximately 2 hours, recovery actions to reflood the core will have been successful and that further core damage can be avoided. The analyses performed in this study determine that, after two hours, assuming vessel reflooding has taken place, the containment aerosol concentration can then conservatively be used as the effective source to the leaking MSIVs. Recommendations are provided concerning typical aerosol removal coefficients that can be used in the RADTRAD code to predict source attenuation in the steam lines, and on robust methods of predicting MSIV leakage flows based on measured MSIV leakage performance.

Salay, Michael (United States Nuclear Regulatory Commission, Washington, D.C.); Kalinich, Donald A.; Gauntt, Randall O.; Radel, Tracy E.

2008-10-01

377

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

378

Analysis of loss-of-coolant and loss-of-flow accidents in the first wall cooling system of NET/ITER  

NASA Astrophysics Data System (ADS)

This paper presents the thermal-hydraulic analysis of potential accidents in the first wall cooling system of the Next European Torus or the International Thermonuclear Experimental Reactor. Three ex-vessel loss-of-coolant accidents, two in-vessel loss-of-coolant accidents, and three loss-of-flow accidents have been analyzed using the thermal-hydraulic system analysis code RELAP5/MOD3. The analyses deal with the transient thermal-hydraulic behavior inside the cooling systems and the temperature development inside the nuclear components during these accidents. The analysis of the different accident scenarios has been performed without operation of emergency cooling systems. The results of the analyses indicate that a loss of forced coolant flow through the first wall rapidly causes dryout in the first wall cooling pipes. Following dryout, melting in the first wall starts within about 130 s in case of ongoing plasma burning. In case of large break LOCAs and ongoing plasma burning, melting in the first wall starts about 90 s after accident initiation.

Komen, E. M. J.; Koning, H.

1994-03-01

379

What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?  

PubMed

Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source it should be possible to combine the available information sets to facilitate data from one source to compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey and the second data source insurance claims documents consisting predominantly of insurance claims completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provide more reliable and detailed pre-crash information than survey variables alone. However, driving related distraction appears to be more difficult to capture. In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such as in-depth accident investigations and pre-crash data recordings. PMID:23314359

Tivesten, Emma; Wiberg, Henrik

2013-03-01

380

Development of a new methodology for stability analysis in BWR NPP  

SciTech Connect

In this work, a new methodology to reproduce power oscillations in BWR NPPs is presented. The methodology combines modal analysis techniques, signal analysis techniques and simulation with the coupled code RELAP5/PARCSv2.7. Macroscopic cross sections are obtained using the SIMTAB methodology, which is fed with CASMO-4/SIMULATE-3 data. The input files for the neutronic and thermal-hydraulic codes are generated automatically, and the thermal-hydraulic-to-neutronic representation (mapping) is based on the fundamental, first and second harmonic shapes of the reactor power, calculated with the VALKIN code (developed at UPV). This mapping was chosen so as not to condition the oscillation pattern. To introduce power oscillations into the simulation, a new capability for generating density perturbations (for the whole core and for chosen axial levels) according to the power mode shapes has been implemented in the coupled code. The purpose of the methodology is to reproduce the driving mechanism of the out-of-phase oscillations that appear in BWR-type reactors. In this work, the methodology is applied to the Record 9 point from the NEA benchmark of the Ringhals 1 NPP. A set of different perturbations is induced in the first active axial level and the resulting LPRM signals are analyzed. (authors)

Garcia-Fenoll, M.; Abarca, A.; Barrachina, T.; Miro, R.; Verdu, G. [Inst. for Industrial, Radiophysical and Environmental Safety ISIRYM, Universitat Politecnica de Valencia, Cami de Vera s/n, 46021 Valencia (Spain)

2012-07-01

381

HPLC analysis of plant DNA methylation: a study of critical methodological factors  

Microsoft Academic Search

HPLC analysis of nucleosides is important for determining total DNA methylation in plants and can be used to help characterise epigenetic changes during stress, growth and development. This is of particular interest for in vitro plant cultures as they are highly susceptible to genetic change. HPLC methodologies have been optimised for mammalian and microbial DNA, but not for plants. This

Jason W. Johnston; Keith Harding; David H. Bremner; Graham Souch; Jon Green; Paul T. Lynch; Brian Grout; Erica E. Benson

2005-01-01

382

Success story in software engineering using NIAM (Natural language Information Analysis Methodology)  

SciTech Connect

To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goal of having both the customer and the analyst completely understand the information. We use the customer's own vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

Eaton, S.M.; Eaton, D.S.

1995-10-01

383

Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys  

Cancer.gov

Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys. Cristine Delnevo, PhD, MPH, UMDNJ-School of Public Health. Why is methods research in tobacco surveillance important? Measuring individual behavior over time is crucial

384

Measuring Team Shared Understanding Using the Analysis-Constructed Shared Mental Model Methodology  

ERIC Educational Resources Information Center

Teams are an essential part of successful performance in learning and work environments. Analysis-constructed shared mental model (ACSMM) methodology is a set of techniques where individual mental models are elicited and sharedness is determined not by the individuals who provided their mental models but by an analytical procedure. This method…

Johnson, Tristan E.; O'Connor, Debra L.

2008-01-01

385

Artificial neural networks classification and clustering of methodologies and applications - literature analysis from 1995 to 2005  

Microsoft Academic Search

Based on a scope of 10,120 articles on ANNs, this paper uses data mining, including association rules and cluster analysis, to survey these ANN papers through keyword classification and clustering of articles from 1995 to 2005, exploring ANN methodologies and application developments during that period. The four decision variables of keywords, author's nationality, research category, and year of publication,

Shu-hsien Liao; Chih-hao Wen

2007-01-01

386

Instrumental Texture Profile Analysis (TPA) of Shelled Sunflower Seed Caramel Snack Using Response Surface Methodology  

Microsoft Academic Search

Models capable of predicting product quality of shelled sunflower seed caramel snack have been developed using response surface methodology. The textural profile analysis was conducted on the snacks using a texture analyzer. The quality attributes measured were hardness, cohesiveness, springiness, chewiness, and resilience as a function of sugar and sunflower kernels content. The sugar and shelled seed proportions affect the

R. K. Gupta; Alka Sharma; R. Sharma

2007-01-01

387

The State of Public Management Research: An Analysis of Scope and Methodology  

Microsoft Academic Search

In this article we examine the state of public management research, specifically focusing on the scope of research and variety of methodologies pursued in the field. We use a sample of manuscripts from three successive meetings of the Public Management Research Association to explore these issues. Our analysis is organized along four themes that have been central to public management's

David W. Pitts; Sergio Fernandez

2009-01-01

388

Task and Skill Analysis: A Methodology of Curricula Development for the Disadvantaged.  

ERIC Educational Resources Information Center

This document outlines training and educational problems confronting the trainee in private business and industry and recommends a methodology which can be used to develop the training/educational approach. Phase I is a labor market analysis, using Boston's Standard Metropolitan Statistical Area as an example of an area and of the kinds of data…

Centanni, Frederick A.

389

Human-Automated Judge Learning: A Methodology for Examining Human Interaction With Information Analysis Automation  

Microsoft Academic Search

Human-automated judge learning (HAJL) is a methodology providing a three-phase process, quantitative measures, and analytical methods to support design of information analysis automation. HAJL's measures capture the human and automation's judgment processes, relevant features of the environment, and the relationships between each. Specific measures include achievement of the human and the automation, conflict between them, compromise and adaptation by the

Ellen J. Bass; Amy R. Pritchett

2008-01-01

390

A comparison of segmental and wrist-to-ankle methodologies of bioimpedance analysis  

Microsoft Academic Search

The common approach of bioelectrical impedance analysis to estimate body water uses a wrist-to-ankle methodology which, although not indicated by theory, has the advantage of ease of application particularly for clinical studies involving patients with debilitating diseases. A number of authors have suggested the use of a segmented protocol in which the impedances of the trunk and limbs are measured

B. J. Thomas; B. H. Cornish; L. C. Ward; M. A. Patterson

1998-01-01

391

A fuzzy logic methodology for fault-tree analysis in critical safety systems  

Microsoft Academic Search

A new approach for fault-tree analysis in critical safety systems employing fuzzy sets for information representation is presented in this paper. The methodology is based on the utilization of the extension principle for mapping crisp measurements to various degrees of membership in the fuzzy set of linguistic Truth. Criticality alarm systems are used in miscellaneous nuclear fuel processing, handling, and

A. Erbay; A. Ikonomopoulos

1993-01-01

392

A 13-Year Content Analysis of Survey Methodology in Communication-Related Journals

Microsoft Academic Search

Using content analysis, this study quantitatively analyzes 13 years (1990–2002) of data about survey research methodology in communication journals. Very little research has examined this important method since Yu and Cooper's 1983 study. Of the 54 journals included in our sample, 565 surveys were published in 46 journals. Public relations, marketing, public opinion, advertising and mass communication journals were among

Wendy Macias; Jeffrey K. Springston; Ruth Ann Lariscy Weaver; Benjamin Neustifter

2008-01-01

393

Probabilistic methodology for estimation of undiscovered petroleum resources in play analysis of the United States  

Microsoft Academic Search

A geostochastic system called FASPF was developed by the U.S. Geological Survey for their 1989 assessment of undiscovered petroleum resources in the United States. FASPF is a fast appraisal system for petroleum play analysis using a field-size geological model and an analytic probabilistic methodology. The geological model is a particular type of probability model whereby the volumes of oil and

Robert A. Crovelli

1992-01-01

394

State-of-the-art sustainability analysis methodologies for efficient decision support in green production operations  

Microsoft Academic Search

Over the last three decades, new concepts, strategies, frameworks and systems have been developed to tackle the sustainable development issue. This paper reviews the challenges, perspectives and recent advances in support of sustainable production operations decision-making. The aim of this review is to provide a holistic understanding of advanced scientific analysis methodologies for the evaluation of sustainability, to provide efficient

Shaofeng Liu; Mike Leat; Melanie Hudson Smith

2011-01-01

395

Identifying the Literacy Requirements of Jobs and Job Literacy Analysis: A New Methodology. Summary Version.  

ERIC Educational Resources Information Center

The Job Literacy Project used the same adult literacy framework as the National Assessment for Educational Progress (NAEP) research and began the process of applying the framework to the world of work. The NAEP framework consisted of three scales: prose, document, and quantitative. The project tested a new methodology called Job Literacy Analysis,…

Norback, Judith; And Others

396

Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design  

ERIC Educational Resources Information Center

Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

Tajino, Akira; James, Robert; Kijima, Kyoichi

2005-01-01

397

Applications of Tutoring Systems in Specialized Subject Areas: An Analysis of Skills, Methodologies, and Results.  

ERIC Educational Resources Information Center

This article reviews how tutoring systems have been applied across specialized subject areas (e.g., music, horticulture, health and safety, social interactions). It summarizes findings, provides an analysis of skills learned within each tutoring system, identifies the respective methodologies, and reports relevant findings, implications, and…

Heron, Timothy E.; Welsch, Richard G.; Goddard, Yvonne L.

2003-01-01

398

The effects of aircraft certification rules on general aviation accidents  

NASA Astrophysics Data System (ADS)

The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed-methods methodology which involves both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. The Chi-Square test indicated that there was no significant difference in the number of accidents among the different certification categories when either Controlled Flight into Terrain or Structural Failure was listed as cause. However, there was a significant difference in the frequency of accidents with regard to Loss of Control and Engine Failure accidents. The results of the ANCOVA test indicated that there was no significant difference in the accident rate with regard to Loss of Control, Controlled Flight into Terrain, or Structural Failure accidents. There was, however, a significant difference in Engine Failure accidents between Experimental-Amateur Built and the other categories. The text mining analysis of the narrative causes of Loss of Control accidents indicated that only the Civil Air Regulations 3 category airplanes had clusters of words associated with visual flight into instrument meteorological conditions. Civil Air Regulations 3 airplanes were designed and manufactured prior to the 1960s and in most cases have not been retrofitted to take advantage of newer technologies that could help prevent Loss of Control accidents. The study indicated that General Aviation aircraft certification rules do not have a statistically significant effect on aircraft accidents except for Loss of Control and Engine Failure. According to the literature, government oversight could have become an obstacle in the implementation of safety-enhancing equipment that could reduce Loss of Control accidents.
Oversight should focus on ensuring that Experimental-Amateur Built aircraft owners perform a functional test that could prevent some of the Engine Failure accidents.
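
As an illustration of the quantitative step described above, a chi-square test of independence can compare accident counts across certification categories. The sketch below uses SciPy with invented counts; it is not the study's data, only the form of the test.

```python
# Chi-square test of accident counts across certification categories.
# The counts are hypothetical placeholders; only the procedure mirrors the study design.
from scipy.stats import chi2_contingency

# Rows: accident cause (Loss of Control, Engine Failure)
# Columns: Part 23, CAR 3, Light Sport, Experimental-Amateur Built
observed = [
    [120, 210, 35, 95],   # Loss of Control (hypothetical)
    [40,  60, 15, 85],    # Engine Failure  (hypothetical)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject independence: accident frequency differs across certification categories.")
```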

Anderson, Carolina Lenz

399

Launch Vehicle Fire Accident Preliminary Analysis of a Liquid-Metal Cooled Thermionic Nuclear Reactor: TOPAZ-II  

NASA Astrophysics Data System (ADS)

In this paper, a launch vehicle propellant fire accident analysis of the TOPAZ-II reactor has been performed with TATRHG(A), a thermionic reactor core analysis code developed by the author. When a rocket explodes on a launch pad, its payload, TOPAZ-II, can be subjected to a severe thermal environment from the resulting fireball. The extreme temperatures associated with propellant fires can create a destructive environment in or near the fireball. Different kinds of propellants, liquid and solid, which lead to different fire temperatures, are considered. Preliminary analysis shows that solid propellant fires can melt the entire toxic beryllium radial reflector.

Hu, G.; Zhao, S.; Ruan, K.

2012-01-01

400

Methodology of a combined ground based testing and numerical modelling analysis of supersonic combustion flow paths  

NASA Astrophysics Data System (ADS)

In the framework of the European Commission co-funded LAPCAT (Long-Term Advanced Propulsion Concepts and Technologies) project, the methodology of a combined ground-based testing and numerical modelling analysis of supersonic combustion flow paths was established. The approach is based on free jet testing of complete supersonic combustion ramjet (scramjet) configurations consisting of intake, combustor and nozzle in the High Enthalpy Shock Tunnel Göttingen (HEG) of the German Aerospace Center (DLR) and computational fluid dynamics studies utilising the DLR TAU code. The capability of the established methodology is demonstrated by applying it to the flow path of the generic HyShot II scramjet flight experiment configuration.

Hannemann, Klaus; Karl, Sebastian; Martinez Schramm, Jan; Steelant, Johan

2010-10-01

401

A substrate noise analysis methodology for large-scale mixed-signal ICs  

Microsoft Academic Search

A substrate noise analysis methodology is described that simulates substrate noise waveforms at sensitive locations of large-scale mixed-signal ICs. Simulation results for a 7.3 mm×7.3 mm chip with 500 k devices, obtained in a few hours on an engineering server, show good correlation with silicon measurements as testing conditions are varied. An analysis of the substrate and package reveals the

Wen Kung Chu; Nishath Verghese; Heayn-Jun Chol; Kenji Shimazaki; Hiroyuki Tsujikawa; Shouzou Hirano; Shirou Doushoh; Makoto Nagata; Atsushi Iwata; Takafumi Ohmoto

2003-01-01

402

A Practical Global Sensitivity Analysis Methodology for Multi-Physics Applications  

Microsoft Academic Search

This paper describes a global sensitivity analysis methodology for general multi-physics applications that are characterized by strong nonlinearities and interactions in their input-output relationships, expensive simulation runs, and a large number of input parameters. We present a four-step approach consisting of (1) prescription of credible input ranges, (2) parameter screening, (3) construction of response surfaces, and (4) quantitative sensitivity analysis on

C. Tong; F. Graziani

403

Supplementary documentation for an Environmental Impact Statement regarding the Pantex Plant: dispersion analysis for postulated accidents  

Microsoft Academic Search

This report documents work performed in support of preparation of an Environmental Impact Statement (EIS) regarding the Department of Energy (DOE) Pantex Plant near Amarillo, Texas. The report covers the calculation of atmospheric dispersion and deposition of plutonium following postulated nonnuclear detonations of nuclear weapons. Downwind total integrated air concentrations and ground deposition values for each postulated accident are presented.

J. M. Dewart; B. M. Bowen; J. C. Elder

1982-01-01

404

Containment accident analysis using CONTEMPT4/MOD2 compared with experimental data. [PWR]

Microsoft Academic Search

CONTEMPT4/MOD2 is a new computer program developed to predict the long-term thermal hydraulic behavior of light-water reactor and experimental containment systems during postulated loss-of-coolant accident (LOCA) conditions. Improvements over previous containment codes include multicompartment capability and ice condenser analytical models. A program description and comparisons of calculated results with experimental data are presented.

L. J. Metcalfe; D. W. Hargroves; R. A. Wells

1978-01-01

405

Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

Microsoft Academic Search

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to

I. K. Madni; F. Eltawila

1994-01-01

406

Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

Microsoft Academic Search

MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor (LWR) nuclear power plants and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories. Brookhaven National Laboratory (BNL) has a program with the NRC called MELCOR Verification, Benchmarking, and Applications, the aim of which

Madni

1995-01-01

407

MELCOR Analysis of Steam Generator Tube Creep Rupture in Station Blackout Severe Accident  

Microsoft Academic Search

A pressurized water reactor steam generator tube rupture (SGTR) is of concern because it represents a bypass of the containment for radioactive materials to the environment. In a station blackout accident, tube integrity could be threatened by creep rupture, particularly if cracks are present in the tube walls. Methods are developed herein to improve assessment capabilities for SGTR by using

Y. Liao; K. Vierow

2005-01-01

408

Analysis of Molten Fuel-Coolant Interaction During a Reactivity-Initiated Accident Experiment.  

National Technical Information Service (NTIS)

The results of a reactivity-initiated accident experiment, designated RIA-ST-4, are discussed and analyzed with regard to molten fuel-coolant interaction (MFCI). In this experiment, extensive amounts of molten UO2 fuel and zircaloy cladding were prod...

M. S. El-Genk; R. R. Hobbins

1981-01-01

409

Traffic accident in Cuiabá-MT: an analysis through the data mining technology.  

PubMed

Road traffic accidents (ATT) are non-intentional events of considerable magnitude worldwide, mainly in urban centers. This article analyzes data on ATT victims recorded by the Justice and Public Security Secretariat (SEJUSP) and in hospital morbidity and mortality databases in the city of Cuiabá-MT during 2006, using data mining technology. An observational, retrospective and exploratory study of the secondary databases was carried out. The three selected databases were linked using a probabilistic method implemented in the free software RecLink, yielding one hundred and thirty-nine (139) true pairs of ATT victims. Data mining was then applied to the linked database with the WEKA software, using the Apriori algorithm. The run generated the 10 best rules, six of which met the established parameters and provided useful and comprehensible knowledge to characterize accident victims in Cuiabá. Finally, the association rules revealed peculiarities of road traffic accident victims in Cuiabá and highlight the need for preventive measures targeting collision accidents involving males. PMID:20841739
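
As a rough sketch of the Apriori step described above (the study itself used WEKA on the linked databases), the mlxtend Python package can mine association rules from one-hot records; the attributes and values below are invented.

```python
# Minimal Apriori sketch with mlxtend; the records and attribute names are invented,
# only the mining flow (frequent itemsets -> association rules) is illustrated.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

records = pd.DataFrame(
    [
        {"male": 1, "collision": 1, "motorcycle": 1, "hospitalized": 1},
        {"male": 1, "collision": 1, "motorcycle": 0, "hospitalized": 0},
        {"male": 0, "collision": 0, "motorcycle": 0, "hospitalized": 1},
        {"male": 1, "collision": 1, "motorcycle": 1, "hospitalized": 1},
    ],
    dtype=bool,
)

itemsets = apriori(records, min_support=0.5, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```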

Galvão, Noemi Dreyer; de Fátima Marin, Heimar

2010-01-01

410

Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents  

Microsoft Academic Search

This report summarizes analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes and erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests

S. Majumdar; D. R. Diercks; W. J. Shack

2002-01-01

411

An Analysis of Incident/Accident Reports from the Texas Secondary School Science Safety Survey, 2001  

ERIC Educational Resources Information Center

This study investigated safety in Texas secondary school science laboratory, classroom, and field settings. The Texas Education Agency (TEA) drew a random representative sample consisting of 199 secondary public schools in Texas. Eighty-one teachers completed Incident/Accident Reports. The reports were optional, anonymous, and open-ended; thus,…

Stephenson, Amanda L.; West, Sandra S.; Westerlund, Julie F.; Nelson, Nancy C.

2003-01-01

412

A systemic approach to accident analysis: A case study of the Stockwell shooting  

Microsoft Academic Search

This paper uses a systemic approach to accident investigation, based upon AcciMaps, to model the events leading up to the shooting of Jean Charles de Menezes at Stockwell Underground station in July 2005. The model captures many of the findings of the Independent Police Complaints Commission's report in a single representation, modelling their interdependencies and the causal flow. Furthermore, by

Daniel P. Jenkins; Paul M. Salmon; Neville A. Stanton; Guy H. Walker

2010-01-01

413

Analysis of general aviation accidents during operations under instrument flight rules  

NASA Technical Reports Server (NTRS)

A report is presented to describe some of the errors that pilots make during flight under IFR. The data indicate that there is less risk during the approach and landing phase of IFR flights, as compared to VFR operations. Single-pilot IFR accident rates continue to be higher than two-pilot IFR incident rates, reflecting the high work load of IFR operations.

Bennett, C. T.; Schwirzke, Martin; Harm, C.

1990-01-01

414

Analysis of loss-of-coolant accidents in the advanced neutron source reactor.  

National Technical Information Service (NTIS)

The RELAP5 computer code and a model of the Advanced Neutron Source (ANS) were used to simulate system response to hypothetical loss-of-coolant accidents (LOCAs). The computer code was modified to represent the thermal-hydraulic phenomena expected within ...

C. D. Fletcher; L. S. Ghan

1990-01-01

415

Analysis of the Loss of Coolant Accident for LEU Cores of Pakistan Research Reactor-1.

National Technical Information Service (NTIS)

The response of LEU cores of PARR-1 to a Loss of Coolant Accident (LOCA) has been studied. It has been assumed that pool water drains out due to a double-ended rupture of the primary coolant pipe or complete shearing off of an experimental beam tube. Results show...

L. A. Khan; I. H. Bokhari; S. S. Raza

1993-01-01

416

General purpose heat source radioisotope thermoelectric generator. Book 2: Accident analysis, appendices  

NASA Astrophysics Data System (ADS)

The purpose is to document the accident scenarios and failure probabilities defined by NASA for the Space Transportation System (the Shuttle Data Book NSTS-08116 and supporting documentation) and used in the Ulysses Mission Safety Status Report (SSR). NASA utilized a systematic approach to identify the credible accident scenarios that might pose a threat to the radioisotope thermoelectric generator (RTG). First, the Shuttle system was divided into the following seven elements: (1) Launch Support Equipment (LSE); (2) Payload; (3) Orbiter; (4) External Tank (ET); (5) Solid Rocket Boosters (SRB); (6) Space Shuttle Main Engines (SSME); and (7) Range Safety System (RSS). Each element was further divided into its major components, and these components were then subdivided until all known failure modes were identified. The approach used to develop the different accident scenarios was to divide the mission into phases and subphases as necessary. The phases were keyed to specific events that resulted in significant changes in vehicle configuration and/or in the potential consequences to the RTG. After the phases were defined, the accident scenarios for each phase were analyzed by developing detailed fault trees for each of the seven major systems as applicable.
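
The fault-tree development summarized above can be reduced, at its simplest, to rolling basic-event probabilities up through AND/OR gates. The sketch below uses a deliberately tiny, hypothetical tree and invented probabilities; it is not the NASA model.

```python
# Tiny fault-tree roll-up: OR and AND gates over independent basic events.
# Structure and probabilities are invented placeholders, not values from the safety analysis.

def gate_or(probs):
    """Probability that at least one independent input event occurs."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def gate_and(probs):
    """Probability that all independent input events occur."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical top event = (booster case breach OR external tank rupture) AND failed abort
p_booster, p_tank, p_abort = 1.0e-4, 5.0e-5, 1.0e-2
p_top = gate_and([gate_or([p_booster, p_tank]), p_abort])
print(f"Top event probability ~ {p_top:.2e}")
```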

1990-01-01

417

Analysis of the TMI-2 source range monitor during the TMI (Three Mile Island) accident

Microsoft Academic Search

The source range monitor (SRM) data recorded during the first 4 hours of the Three Mile Island Unit No. 2 (TMI-2) accident following reactor shutdown were analyzed. An effort to simulate the actual SRM response was made by performing a series of neutron transport calculations. Primary emphasis was placed on simulating the changes in SRM response to various system events

Horng-Yu Wu; A. J. Baratta; Ming-Yuan Hsiao; B. R. Bandini

1987-01-01

418

Risk Analysis for Public Consumption: Media Coverage of the Ginna Nuclear Reactor Accident.  

ERIC Educational Resources Information Center

Researchers have determined that the lay public makes risk judgments in ways that are very different from those advocated by scientists. Noting that these differences have caused considerable concern among those who promote and regulate health and safety, a study examined media coverage of the accident at the Robert E. Ginna nuclear power plant…

Dunwoody, Sharon; And Others

419

Two codes used in analysis of rod ejection accident for Qinshan Nuclear Power Plant.  

National Technical Information Service (NTIS)

Two codes were developed to analyse the rod ejection accident for the Qinshan Nuclear Power Plant. One was based on a point model with temperature reactivity feedback. In this code, the worth of the ejected rod was obtained under an 'adiabatic' approximation. In the other ...

X. Zhu

1987-01-01

420

Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes  

SciTech Connect

As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked by the Advanced Neutron Source dynamic model (ANSDM) and PRSDYN models. RELAP5 is a one-dimensional, two-phase transient code, developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations were included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant-accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

1993-12-01

421

A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)  

NASA Technical Reports Server (NTRS)

Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings presented over ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure and other high-level classifications in longitudinal studies of accident reports. Our results suggest the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.

Johnson, C. W.; Holloway, C, M.

2007-01-01

422

Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications  

NASA Technical Reports Server (NTRS)

In this study involving advanced fluid flow codes, an incremental iterative formulation (also known as the delta or correction form) together with the well-known spatially-split approximate factorization algorithm, is presented for solving the very large sparse systems of linear equations which are associated with aerodynamic sensitivity analysis. For smaller 2D problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods are needed for larger 2D and future 3D applications, however, because direct methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioning of the coefficient matrix; this problem can be overcome when these equations are cast in the incremental form. These and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two sample airfoil problems: (1) subsonic low Reynolds number laminar flow; and (2) transonic high Reynolds number turbulent flow.
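
For a generic linear system, the incremental (correction) form referred to above amounts to driving corrections from the current residual with a cheaper approximate operator, instead of attacking the full system directly. The NumPy sketch below is a generic defect-correction iteration under that reading; it is not the authors' flow-solver implementation.

```python
# Defect-correction sketch of the incremental ("delta") form:
# solve A x = b by repeated corrections M dx = r, with M a cheap approximation of A.
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = 4.0 * np.eye(n) + 0.5 * rng.standard_normal((n, n)) / np.sqrt(n)
b = rng.standard_normal(n)

M = np.diag(np.diag(A))            # crude approximate operator (diagonal of A)
x = np.zeros(n)
for k in range(200):
    r = b - A @ x                  # residual of the current iterate
    dx = np.linalg.solve(M, r)     # incremental form: solve only for the correction
    x += dx
    if np.linalg.norm(r) < 1e-10 * np.linalg.norm(b):
        break

print(f"iterations = {k}, final residual = {np.linalg.norm(b - A @ x):.2e}")
```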

Taylor, Arthur C., III; Hou, Gene W.

1993-01-01

423

Performance analysis of complex repairable industrial systems using PSO and fuzzy confidence interval based methodology.  

PubMed

The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and related costs using the available resources and uncertain data. For this purpose, an availability-cost optimization model has been constructed to determine the optimal design parameters for improving system design efficiency. The uncertainties in the data related to each component of the system are estimated with fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters that affect system performance are obtained in the form of fuzzy membership functions by the proposed confidence interval based fuzzy Lambda-Tau (CIBFLT) methodology. The results computed by CIBFLT are compared with the existing fuzzy Lambda-Tau methodology. Sensitivity analysis on the system MTBF has also been addressed. The methodology is illustrated through a case study of the washing unit, a main component in the paper industry. PMID:23098922
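
A very small piece of the fuzzy machinery mentioned above can be shown with alpha-cut interval arithmetic on triangular fuzzy MTBF and MTTR and the monotone availability formula A = MTBF/(MTBF + MTTR). The numbers are invented and the sketch is not the CIBFLT procedure itself.

```python
# Alpha-cut interval arithmetic for availability from triangular fuzzy MTBF/MTTR.
# Triangular number (a, m, b): alpha-cut is [a + alpha*(m - a), b - alpha*(b - m)].
def alpha_cut(tri, alpha):
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

def availability_cut(mtbf_tri, mttr_tri, alpha):
    mtbf_lo, mtbf_hi = alpha_cut(mtbf_tri, alpha)
    mttr_lo, mttr_hi = alpha_cut(mttr_tri, alpha)
    # A = MTBF/(MTBF + MTTR) increases with MTBF and decreases with MTTR,
    # so the interval endpoints pair up as below.
    a_min = mtbf_lo / (mtbf_lo + mttr_hi)
    a_max = mtbf_hi / (mtbf_hi + mttr_lo)
    return a_min, a_max

mtbf = (900.0, 1000.0, 1100.0)   # hours, hypothetical triangular fuzzy number
mttr = (4.0, 5.0, 7.0)           # hours, hypothetical triangular fuzzy number

for alpha in (0.0, 0.5, 1.0):
    print(alpha, availability_cut(mtbf, mttr, alpha))
```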

Garg, Harish

2013-03-01

424

Risk of road accident associated with the use of drugs: a systematic review and meta-analysis of evidence from epidemiological studies.  

PubMed

This paper is a corrigendum to a previously published paper where errors were detected. The errors have been corrected in this paper. The paper is otherwise identical to the previously published paper. A systematic review and meta-analysis of studies that have assessed the risk of accident associated with the use of drugs when driving is presented. The meta-analysis included 66 studies containing a total of 264 estimates of the effects on accident risk of using illicit or prescribed drugs when driving. Summary estimates of the odds ratio of accident involvement are presented for amphetamines, analgesics, anti-asthmatics, anti-depressives, anti-histamines, benzodiazepines, cannabis, cocaine, opiates, penicillin and zopiclone (a sleeping pill). For most of the drugs, small or moderate increases in accident risk associated with the use of the drugs were found. Information about whether the drugs were actually used while driving and about the doses used was often imprecise. Most studies that have evaluated the presence of a dose-response relationship between the dose of drugs taken and the effects on accident risk confirm the existence of a dose-response relationship. Use of drugs while driving tends to have a larger effect on the risk of fatal and serious injury accidents than on the risk of less serious accidents (usually property-damage-only accidents). The quality of the studies that have assessed risk varied greatly. There was a tendency for the estimated effects of drug use on accident risk to be smaller in well-controlled studies than in poorly controlled studies. Evidence of publication bias was found for some drugs. The associations found cannot be interpreted as causal relationships, principally because most studies do not control very well for potentially confounding factors. PMID:22785089
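
The pooling step behind such summary odds ratios can be sketched with inverse-variance weighting of log odds ratios (the fixed-effect form is shown for brevity; the review itself combines many more estimates and addresses study quality). The study values below are invented.

```python
# Fixed-effect inverse-variance pooling of log odds ratios (illustrative only).
import math

# (odds ratio, standard error of the log odds ratio) for hypothetical studies
studies = [(1.3, 0.15), (1.1, 0.20), (1.6, 0.25), (0.9, 0.30)]

weights = [1.0 / se ** 2 for _, se in studies]
log_ors = [math.log(or_) for or_, _ in studies]

pooled_log_or = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

lo = math.exp(pooled_log_or - 1.96 * pooled_se)
hi = math.exp(pooled_log_or + 1.96 * pooled_se)
print(f"Summary OR = {math.exp(pooled_log_or):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```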

Elvik, Rune

2013-11-01

425

Type Airman Certification as Related to Accidents.  

National Technical Information Service (NTIS)

An analysis of 1964 aircraft accidents, using type of airman certificate as a measure of pilot proficiency, is presented. Data show that student pilots generally have a better accident record than any other of the certification groups. Analysis confirmed ...

E. J. Veregge

1967-01-01

426

Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129

NASA Astrophysics Data System (ADS)

Among the various radioactive nuclides emitted from the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident, Iodine-131 displayed high radioactivity just after the accident. Moreover, if taken into the human body, Iodine-131 concentrates in the thyroid and may cause thyroid cancer. The recognition of the risk posed by the Iodine-131 dose originated from the experience of the Chernobyl accident, based on epidemiological studies [1]. It is thus important to investigate the detailed deposition distribution of I-131 to evaluate the radiation dose due to I-131 and to monitor the influence on human health. However, I-131 decays so rapidly (half-life = 8.02 d) that it can no longer be detected several months after the accident. By the time the risk of I-131 was recognized after the Chernobyl accident, several years had passed. Reconstruction of the I-131 distribution from the Cs-137 distribution was not successful because iodine and cesium behave differently owing to their different chemical properties. The long-lived radioactive isotope I-129 (half-life = 1.57E+7 yr), which, like I-131, is a fission product, is an ideal proxy for I-131 because the two are chemically identical. Several studies tried to quantify I-129 in the 1990s, but the analytical techniques, especially AMS (Accelerator Mass Spectrometry), were not yet well developed and the available AMS facilities were limited. Moreover, because of the lack of sufficient data on I-131 just after the accident, the isotopic ratio I-129/I-131 of the Chernobyl-derived iodine could not be estimated precisely [2], and calculated estimates of the ratio were scattered. For the FDNPP accident, in contrast, the detailed I-131 distribution is being successfully reconstructed through systematic I-129 measurements by our group. We measured soil samples selected from a collection taken on a mesh of sampling points every 2 km (or 5 km in more distant areas) around FDNPP, conducted by the Japanese Ministry of Science and Education in June 2011. So far, more than 500 samples have been measured and their I-129 deposition determined by AMS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. The measurement error from AMS is less than 5%, typically 3%. The overall uncertainty is estimated to be less than 30%, including the uncertainty of the nominal value of the standard reference material used, of the I-129/I-131 ratio estimation, and of the "representativeness" of the analyzed sample for its region, among others. The isotopic ratio I-129/I-131 from the reactor was estimated to be 22.3 +- 6.3 as of March 11, 2011 [3], from a series of samples collected by a group of The University of Tokyo on April 20, 2011, for which I-131 had been determined by gamma-ray spectrometry with good precision. Complementarily, we investigated the depth profile of the accident-derived I-129 in soil and its migration speed after deposition, and found that more than 90% of the I-129 was concentrated within the top 5 cm layer and that the downward migration speed was less than 1 cm/yr [4]. From the set of I-129 data, the corresponding I-131 depositions were calculated and a distribution map is being constructed. Various fine structures of the distribution have come into view. [1] Y. Nikiforov and D. R. Gnepp, 1994, Cancer, Vol. 47, pp748-766. [2] T. Straume, et al., 1996, Health Physics, Vol. 71, pp733-740. [3] Y. Miyake, H. Matsuzaki et al., 2012, Geochem. J., Vol. 46, pp327-333. [4] M. Honda, H. Matsuzaki et al., under submission.
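
The reconstruction step described above can be written as a short worked calculation: a measured I-129 deposition is converted to the I-131 deposition at shutdown via the quoted ratio, then decay-corrected. The deposition value below is an invented placeholder, and the conversion assumes the quoted 22.3 is an atom ratio.

```python
# Reconstruct I-131 deposition at reactor shutdown from a measured I-129 deposition,
# assuming the quoted I-129/I-131 atom ratio of 22.3 at shutdown (March 11, 2011).
# The I-129 value is a hypothetical placeholder, not a measurement from the study.
import math

N129 = 2.0e12                     # atoms/m^2 of I-129 (hypothetical measurement)
RATIO_129_131 = 22.3              # I-129/I-131 atom ratio at shutdown
T_HALF_131_DAYS = 8.02

N131_shutdown = N129 / RATIO_129_131               # atoms/m^2 of I-131 at shutdown
lam = math.log(2) / (T_HALF_131_DAYS * 86400.0)    # I-131 decay constant, 1/s
A131_shutdown = lam * N131_shutdown                # Bq/m^2 at shutdown

# Activity remaining 40 days later, illustrating why direct I-131 measurement
# becomes impossible within a few months:
A131_40d = A131_shutdown * math.exp(-lam * 40.0 * 86400.0)
print(f"{A131_shutdown:.3e} Bq/m^2 at shutdown, {A131_40d:.3e} Bq/m^2 after 40 days")
```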

Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

2014-05-01

427

Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents.  

SciTech Connect

This report summarizes analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes and erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests on tubes with machined flaws. A three-dimensional computational fluid dynamics code was used to calculate the jet velocity impinging on neighboring tubes as a function of tube spacing and crack-opening area. Erosion tests were conducted in a high-temperature, high-velocity erosion rig at the University of Cincinnati, using micrometer-sized nickel particles mixed in with high-temperature gas from a burner. The erosion results, together with analytical models, were used to estimate the erosive effects of superheated steam with entrained aerosols from the core during severe accidents.

Majumdar, S.; Diercks, D. R.; Shack, W. J.; Energy Technology

2002-05-01

428

Methodology for Risk Analysis of Dam Gates and Associated Operating Equipment Using Fault Tree Analysis.  

National Technical Information Service (NTIS)

With limited maintenance dedicated to aging dam spillway gate structures, there is an increased risk of gate inoperability and corresponding dam failure due to malfunction or inadequate design. This report summarizes research on methodologies to assist in...

R. C. Patev; C. Putcha; S. D. Foltz

2005-01-01

429

Methodological considerations for the harmonization of non-cholesterol sterol bio-analysis.  

PubMed

Non-cholesterol sterols (NCS) are used as surrogate markers of cholesterol metabolism which can be measured from a single blood sample. Cholesterol precursors are used as markers of endogenous cholesterol synthesis and plant sterols are used as markers of cholesterol absorption. However, most aspects of NCS analysis show wide variability among researchers within the area of biomedical research. This variability in methodology is a significant contributor to variation between reported NCS values and hampers the confidence in comparing NCS values across different research groups, as well as the ability to conduct meta-analyses. This paper summarizes the considerations and conclusions of a workshop where academic and industrial experts met to discuss NCS measurement. Highlighted is why each step in the analysis of NCS merits critical consideration, with the hopes of moving toward more standardized and comparable NCS analysis methodologies. Alkaline hydrolysis and liquid-liquid extraction of NCS followed by parallel detection on GC-FID and GC-MS is proposed as an ideal methodology for the bio-analysis of NCS. Furthermore the importance of cross-comparison or round robin testing between various groups who measure NCS is critical to the standardization of NCS measurement. PMID:24674990

Mackay, Dylan S; Jones, Peter J H; Myrie, Semone B; Plat, Jogchum; Lütjohann, Dieter

2014-04-15

430

Lung fiber analysis in accident victims: A biological assessment of general environmental exposures  

Microsoft Academic Search

Samples of lung tissue from 81 male, Canadian accident victims were examined for asbestos bodies by light microscopy. A subset of 65 cases was further analyzed for concentration (mean number of fibers longer than 5 µm per mg dry lung) of 14 fiber types using transmission electron microscopy and energy dispersive spectrometry of x-rays at a detection limit of 0.062 fibers/mg. The preliminary report

B. W. Case; P. Sebastien; J. C. McDonald

2008-01-01

431

Analysis of Severe Accident Management Strategy for a BWR-4 Nuclear Power Plant  

SciTech Connect

The Chinshan nuclear power plant (NPP) is a Mark-I boiling water reactor (BWR) NPP located in northern Taiwan. The Chinshan NPP severe accident management guidelines (SAMGs) were developed, based on the BWR Owners Group Emergency Procedure Guidelines/Severe Accident Guidelines, at the end of 2003. The MAAP4 code has been used as a tool to validate the SAMG strategies. The development process and characteristics of the Chinshan SAMGs are described. The T5UtXC sequence, which has the highest core damage frequency in the probabilistic risk assessment of the Chinshan NPP, is cited as a reference case for SAMG validation. Not all safety injection systems are operated in the T5UtXC sequence. The severe accident progression is simulated, and the entry condition of the SAMGs is described. Then, the T5UtXC sequence is simulated with reactor pressure vessel (RPV) depressurization. Mitigation actions based on the Chinshan NPP SAMGs are then applied to demonstrate the effectiveness of the SAMGs. Sensitivity studies on RPV depressurization with respect to the reactor water level and the minimum RPV injection flow rate are also investigated. Based on MAAP4 calculations with the default values of the parameters governing the severe accident phenomena, the results show that RPV depressurization before the reactor water level reaches one-fourth of the core water level can prevent core damage in the T5UtXC sequence. The flow rate of two control rod drive pumps is enough to maintain the reactor water level above the top of active fuel and cool down the core in the T5UtXC sequence without operator action.

Wang, T.-C.; Wang, S.-J.; Teng, J.-T

2005-12-15

432

Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures  

NASA Technical Reports Server (NTRS)

A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
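
As a small illustration of one of the ply-level checks named above, the maximum strain criterion simply compares each in-plane strain component with its allowable. The sketch below uses invented allowables and does not reproduce Hashin's or Christensen's criteria or the degradation scheme of the paper.

```python
# Maximum strain failure check for a single ply (allowable strains are hypothetical).
# Strains are given in the ply material coordinate system: (eps1, eps2, gamma12).

ALLOWABLES = {
    "eps1_t": 0.0105, "eps1_c": 0.0085,   # fiber direction, tension / compression
    "eps2_t": 0.0050, "eps2_c": 0.0140,   # transverse direction, tension / compression
    "gamma12": 0.0200,                    # in-plane shear
}

def max_strain_check(eps1, eps2, gamma12, a=ALLOWABLES):
    indices = [
        eps1 / a["eps1_t"] if eps1 >= 0.0 else -eps1 / a["eps1_c"],
        eps2 / a["eps2_t"] if eps2 >= 0.0 else -eps2 / a["eps2_c"],
        abs(gamma12) / a["gamma12"],
    ]
    worst = max(indices)
    return worst >= 1.0, worst   # (ply failed?, governing failure index)

failed, index = max_strain_check(0.004, -0.002, 0.011)
print(failed, round(index, 3))
```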

Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

1997-01-01

433

Development of reliable modeling methodologies for engine fan blade out containment analysis. Part II: Finite element analysis  

Microsoft Academic Search

In the first part of the paper [Naik D, Sankaran S, Mobasher B, Rajan SD, Pereira M. Development of reliable modeling methodologies for fan blade-out containment analysis. Part I: experimental studies. Int J Impact Eng, in press], details of the experiments to characterize the behavior of dry fabrics including Kevlar®49, and ballistic tests involving the fabric were presented. In this

Z. Stahlecker; B. Mobasher; S. D. Rajan; J. M. Pereira

2009-01-01

434

SiC MODIFICATIONS TO MELCOR FOR SEVERE ACCIDENT ANALYSIS APPLICATIONS  

SciTech Connect

The Department of Energy (DOE) Office of Nuclear Energy (NE) Light Water Reactor (LWR) Sustainability Program encompasses strategic research focused on improving reactor core economics and safety margins through the development of an advanced fuel cladding system. The Fuels Pathway within this program focuses on fuel system components outside of the fuel pellet, allowing for alteration of the existing zirconium-based clad system through coatings, addition of ceramic sleeves, or complete replacement (e.g. fully ceramic cladding). The DOE-NE Fuel Cycle Research & Development (FCRD) Advanced Fuels Campaign (AFC) is also conducting research on materials for advanced, accident tolerant fuels and cladding for application in operating LWRs. To aid in this assessment, a silicon carbide (SiC) version of the MELCOR code was developed by substituting SiC in place of Zircaloy in MELCOR’s reactor core oxidation and material property routines. The purpose of this development effort is to provide a numerical capability for estimating the safety advantages of replacing Zr-alloy components in LWRs with SiC components. This modified version of the MELCOR code was applied to the Three Mile Island (TMI-2) plant accident. While the results are considered preliminary, SiC cladding showed a dramatic safety advantage over Zircaloy cladding during this accident.

Brad J. Merrill; Shannon M Bragg-Sitton

2013-09-01

435

Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II  

NASA Astrophysics Data System (ADS)

The present part of the publication (Part II) deals with long range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear Test Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131 and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq, emitted during the explosions of units 1, 2 and 3. The estimated total source term represents a large fraction of the core inventory, which was about 8 × 10^18 Bq at the time of reactor shutdown. This result suggests that at least 80% of the core inventory was released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. Neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) is estimated to be 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10% of the Chernobyl accident releases for I-131 and Cs-137.

Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

2014-03-01

436

Improved methodology for integral analysis of advanced reactors employing passive safety  

NASA Astrophysics Data System (ADS)

After four decades of experience with pressurized water reactors, a new generation of nuclear plants is emerging. These advanced designs employ passive safety, which relies on natural forces such as gravity and natural circulation. The new concept of passive safety also necessitates improvement of the computational tools available for best-estimate analyses. System codes originally designed for high-pressure conditions in the presence of strong momentum sources such as pumps are challenged in many ways. Increased interaction of the primary system with the containment necessitates a tool for integral analysis. This study addresses some of these concerns and presents an improved tool for integral analysis coupling the primary system with the containment calculation. The code package is based on the RELAP5 and CONTAIN programs, a best-estimate thermal-hydraulics code for primary system analysis and a containment code for containment analysis, respectively. The suitability is demonstrated with a postulated small break loss of coolant accident analysis of the Westinghouse AP600 plant. The thesis explains the details of the analysis, including the coupling model.

Muftuoglu, A. Kursad

437

Methodology for the characterization of water quality: Analysis of time-dependent variability  

NASA Astrophysics Data System (ADS)

The general methodology for characterization of water quality presented here was applied, after elimination of spatial effects, to the analysis of time-dependent variability of physico-chemical parameters measured on eighteen dates during the summer months of 1976 at 112 sampling stations on the Saint Lawrence River between Cornwall and Quebec City. Two aspects of water utilization are considered: domestic water supply and the capacity to sustain balanced aquatic life. The methodology, based on the use and adaptation of classical multivariate statistical methods (correspondence analysis, hierarchical classification), leads, for a given type of water utilization, to the determination of the most important parameters and their essential interrelations, and shows the relative importance of their variations. Rationalization of network operations is thus obtained through the identification of homogeneous behaviour periods as well as of critical dates for the measurement of parameters characterizing a given use.
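
The time-dependent classification step described above (grouping sampling dates with similar behaviour) can be sketched with standard hierarchical clustering. The SciPy example below uses an invented dates-by-parameters matrix and omits the correspondence analysis stage.

```python
# Hierarchical classification of sampling dates from standardized water-quality
# parameters; the data matrix is invented and only illustrates the clustering step.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
X = rng.normal(size=(18, 5))                       # 18 dates x 5 parameters (hypothetical)
X = (X - X.mean(axis=0)) / X.std(axis=0)           # standardize each parameter

Z = linkage(X, method="ward")                      # agglomerative clustering of dates
periods = fcluster(Z, t=3, criterion="maxclust")   # cut into 3 homogeneous behaviour periods
print(periods)
```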

Lachance, Marius; Bobée, Bernard

1982-11-01

438

A methodology for global-sensitivity analysis of time-dependent outputs in systems biology modelling  

PubMed Central

One of the main challenges in the development of mathematical and computational models of biological systems is the precise estimation of parameter values. Understanding the effects of uncertainties in parameter values on model behaviour is crucial to the successful use of these models. Global sensitivity analysis (SA) can be used to quantify the variability in model predictions resulting from the uncertainty in multiple parameters and to shed light on the biological mechanisms driving system behaviour. We present a new methodology for global SA in systems biology which is computationally efficient and can be used to identify the key parameters and their interactions which drive the dynamic behaviour of a complex biological model. The approach combines functional principal component analysis with established global SA techniques. The methodology is applied to a model of the insulin signalling pathway, defects of which are a major cause of type 2 diabetes and a number of key features of the system are identified.

Sumner, T.; Shephard, E.; Bogle, I. D. L.

2012-01-01

439

A methodology for global-sensitivity analysis of time-dependent outputs in systems biology modelling.  

PubMed

One of the main challenges in the development of mathematical and computational models of biological systems is the precise estimation of parameter values. Understanding the effects of uncertainties in parameter values on model behaviour is crucial to the successful use of these models. Global sensitivity analysis (SA) can be used to quantify the variability in model predictions resulting from the uncertainty in multiple parameters and to shed light on the biological mechanisms driving system behaviour. We present a new methodology for global SA in systems biology which is computationally efficient and can be used to identify the key parameters and their interactions which drive the dynamic behaviour of a complex biological model. The approach combines functional principal component analysis with established global SA techniques. The methodology is applied to a model of the insulin signalling pathway, defects of which are a major cause of type 2 diabetes and a number of key features of the system are identified. PMID:22491976
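
A rough sketch of the variance-based part of such a global SA is shown below with the SALib Python package, using a toy scalar model in place of the signalling-pathway model and omitting the functional principal component reduction of time-dependent outputs.

```python
# Variance-based global sensitivity analysis (Sobol indices) with SALib.
# The model is a toy scalar function, not the insulin-signalling model; the
# functional-PCA treatment of time-dependent outputs is omitted here.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["k1", "k2", "k3"],                   # hypothetical rate parameters
    "bounds": [[0.1, 1.0], [0.1, 1.0], [0.1, 1.0]],
}

X = saltelli.sample(problem, 1024)                 # Saltelli sampling plan
Y = np.array([x[0] * x[1] + 0.1 * np.sin(5.0 * x[2]) for x in X])  # toy model output

Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], np.round(Si["S1"], 3))))  # first-order indices
```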

Sumner, T; Shephard, E; Bogle, I D L

2012-09-01

440

Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis  

PubMed Central

Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition.
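
The two error-propagation approaches mentioned above can be contrasted on a toy derived quantity, here a simple quotient of two measured values: once with the first-order (linear) formula and once by Monte Carlo. The values and uncertainties are invented.

```python
# Compare first-order (linear) error propagation with Monte Carlo for q = a / b.
# Measured values and uncertainties are invented; only the two approaches are illustrated.
import numpy as np

a, sigma_a = 18.5, 0.6      # e.g. measured energy content (MJ), hypothetical
b, sigma_b = 0.95, 0.03     # e.g. measured dry mass (kg), hypothetical

q = a / b
# Linear (first-order Taylor) propagation for a quotient:
sigma_q_linear = q * np.sqrt((sigma_a / a) ** 2 + (sigma_b / b) ** 2)

# Monte Carlo propagation, assuming independent normal measurement errors:
rng = np.random.default_rng(42)
samples = rng.normal(a, sigma_a, 100_000) / rng.normal(b, sigma_b, 100_000)
sigma_q_mc = samples.std()

print(f"q = {q:.2f}, linear sigma = {sigma_q_linear:.3f}, Monte Carlo sigma = {sigma_q_mc:.3f}")
```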

Pazo, Jose A.; Granada, Enrique; Saavedra, Angeles; Patino, David; Collazo, Joaquin

2010-01-01

441

Highchair accidents.  

PubMed

In order to establish guidelines for highchair accident prevention we investigated causes, mode and complications of highchair accidents by the following methods: The charts of 103 children attending our Accident & Emergency department for highchair related injuries were studied retrospectively. Questionnaires were sent to the parents to obtain detailed information about the mode of accident. They were also asked to suggest preventive measures. In addition, a random sample survey was performed with 163 families inquiring about the rate of highchair use and the incidence of highchair related accidents. Of the 103 infants, 15.5% had sustained a skull fracture, 13.6% a brain concussion, 2.0% limb fractures and 68.9% a simple contusion of the head or lacerations to the scalp or face. The questionnaires were fully completed by 61.2% of parents. Every second family reported that their infant had tried to stand up in the highchair before falling off (only one child had been wearing a restraint). In a further 14.3% of accidents the highchair tipped over. Eighty-seven percent of parents would appreciate a pre-installation of restraints, 54.0% requested more informative instructions for users, and 33.3% asked for products with better stability. The random sample survey revealed a highchair use rate of 92%; 18% of families used highchairs equipped with restraints, and 6% reported highchair accidents sustained by their children. We conclude that most highchair accidents occur when unrestrained infants try to stand up. Pre-installed child restraints, better manuals for users and increased highchair stability should be recommended as promising accident prevention strategies. PMID:10229045

Mayr, J M; Seebacher, U; Schimpl, G; Fiala, F

1999-03-01

442

Classification and regression tree analysis in public health: Methodological review and comparison with logistic regression  

Microsoft Academic Search

Background: Audience segmentation strategies are of increasing interest to public health professionals who wish to identify easily defined, mutually exclusive population subgroups whose members share similar characteristics that help determine participation in a health-related behavior as a basis for targeted interventions. Classification and regression tree (C&RT) analysis is a nonparametric decision tree methodology that has the ability to efficiently segment

Stephenie C. Lemon; Jason Roy; Melissa A. Clark; Peter D. Friedmann; William Rakowski

2003-01-01

443

A multi-scale segmentation\\/object relationship modelling methodology for landscape analysis  

Microsoft Academic Search

Natural complexity can best be explored using spatial analysis tools based on concepts of landscape as process continuums that can be partially decomposed into objects or patches. We introduce a five-step methodology based on multi-scale segmentation and object relationship modelling. Hierarchical patch dynamics (HPD) is adopted as the theoretical framework to address issues of heterogeneity, scale, connectivity and quasi-equilibriums in

C. Burnett; Thomas Blaschke

2003-01-01