These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade  

SciTech Connect

The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants,” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

Gregg L. Sharp; R. T. McCracken

2003-06-01

2

Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade  

SciTech Connect

The regulatory requirement to develop an upgraded safety basis for a DOE Nuclear Facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants,” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

Sharp, G.L.; McCracken, R.T.

2003-05-13

3

SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.  

PubMed

As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with implementation of safety measures. The resultant system tells us the extent of reduction of risk by each successive safety measure. It also tells, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study. PMID:11566400
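
The feedback loop described above, quantify the risk, apply a safety measure, then re-quantify, can be illustrated with a toy probabilistic fault tree. The following Python sketch is a minimal, hypothetical illustration (the event names, probabilities, and reduction factors are invented and are not from the SCAP case study); it computes a top-event probability from independent basic events and shows the residual risk left by each candidate safety measure.

    # Minimal probabilistic fault tree with a risk-reduction feedback loop.
    # All event names, probabilities, and reduction factors are hypothetical.

    def or_gate(probs):
        """P(at least one input event occurs), assuming independence."""
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p

    def and_gate(probs):
        """P(all input events occur), assuming independence."""
        p = 1.0
        for q in probs:
            p *= q
        return p

    def top_event(basic):
        # Top = (pump_seal_leak OR pipe_rupture) AND ignition_source
        release = or_gate([basic["pump_seal_leak"], basic["pipe_rupture"]])
        return and_gate([release, basic["ignition_source"]])

    basic = {"pump_seal_leak": 1e-2, "pipe_rupture": 1e-3, "ignition_source": 5e-2}
    print(f"baseline top-event probability: {top_event(basic):.2e}")

    # Feedback step: apply each safety measure (a reduction factor on one
    # basic event) and report the residual top-event probability it leaves.
    measures = {"improved seal maintenance": ("pump_seal_leak", 0.1),
                "ignition source control":   ("ignition_source", 0.2)}
    for name, (event, factor) in measures.items():
        trial = dict(basic)
        trial[event] *= factor
        print(f"{name}: residual risk {top_event(trial):.2e}")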

Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

2001-10-12

4

SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system  

Microsoft Academic Search

As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety

Faisal I Khan; Asad Iqbal; N Ramesh; S. A Abbasi

2001-01-01

5

Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis  

SciTech Connect

A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the DOE complex, (2) development and frequency estimation of the risk-dominant sequences of accidents, and (3) determination of the evolution of and final compositions of radiological or chemically hazardous source terms predicted to be released as a function of the storage inventory or treatment process throughput. The computational framework automates these elements to provide source term input for the second part of the analysis which includes (1) development or integration of existing site-specific demographics and meteorological data and calculation of attendant unit-risk factors and (2) assessment of the radiological or toxicological consequences of accident releases to the general public and to the occupational work force.

Mueller, C.; Roglans-Ribas, J.; Folga, S.; Huttenga, A. [Argonne National Lab., IL (United States)]; Jackson, R.; TenBrook, W.; Russell, J. [Science Applications International Corp., Golden, CO (United States); Science Applications International Corp., Pleasanton, CA (United States)]

1994-02-01

6

Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis  

Microsoft Academic Search

A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the

C. Mueller; J. Roglans-Ribas; S. Folga; A. Huttenga; R. Jackson; W. TenBrook; J. Russell

1994-01-01

7

Severe accident analysis using dynamic accident progression event trees  

NASA Astrophysics Data System (ADS)

At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time consuming, difficult to reproduce, and can be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APET developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. The work also focuses on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of the auxiliary feedwater system for a pressurized water reactor. The specific plant analyzed is the Zion Nuclear Power Plant, a Westinghouse-designed system that has been decommissioned.
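
The branching, probability-tracking, and pruning logic described in this abstract can be sketched in a few lines of Python. The code below is a schematic stand-in, not the ADAPT implementation: the 'simulator' is a stub playing the role a severe accident code such as MELCOR would play, and the branching questions, probabilities, and truncation threshold are invented for illustration.

    # Schematic dynamic event tree: branch whenever the (stub) simulator reports
    # a branching condition, carry the cumulative branch probability, and prune
    # branches that fall below a truncation threshold. All rules are hypothetical.

    PRUNE_THRESHOLD = 1e-6

    def advance_simulator(state):
        """Stub for a severe accident code. Returns the next branching condition
        and its candidate outcomes, or None when the scenario runs to completion."""
        if "creep_rupture" not in state:
            return "creep_rupture", [("ruptured", 0.3), ("intact", 0.7)]
        if "power_recovery" not in state:
            return "power_recovery", [("recovered", 0.6), ("not_recovered", 0.4)]
        return None

    def expand(state, prob, scenarios):
        branch = advance_simulator(state)
        if branch is None:                      # end state reached
            scenarios.append((state, prob))
            return
        question, outcomes = branch
        for outcome, p in outcomes:
            child_prob = prob * p
            if child_prob < PRUNE_THRESHOLD:    # truncation rule
                continue
            child = dict(state)
            child[question] = outcome
            expand(child, child_prob, scenarios)

    scenarios = []
    expand({}, 1.0, scenarios)
    for state, prob in scenarios:
        print(f"{prob:.2f}  {state}")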

Hakobyan, Aram P.

8

Applying STAMP in Accident Analysis  

NASA Technical Reports Server (NTRS)

Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

2003-01-01

9

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.

1931-01-01

10

Deterministic accident analysis for RBMK  

Microsoft Academic Search

Within the framework of a European Commission sponsored activity, an assessment of the deterministic safety technology of the ‘post-Chernobyl modernized’ Reactor Bolshoy Moshchnosty Kipyashiy (RBMK) has been completed. The accident analysis, limited to the area of Design Basis Accidents, constituted the key subject for the study; events not involving the primary circuit were not considered, as well as events originated

F. D’Auria; B. Gabaraev; S. Soloviev; O. Novoselsky; A. Moskalev; E. Uspuras; G. M. Galassi; C. Parisi; A. Petrov; V. Radkevich; L. Parafilo; D. Kryuchkov

2008-01-01

11

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

1929-01-01

12

HANARO thermal hydraulic accident analysis.  

National Technical Information Service (NTIS)

For the safety assessment of HANARO, accident analyses for the anticipated operational transients, accident scenarios, and limiting accident scenarios were conducted. To do this, the commercial nuclear reactor system code RELAP5/MOD2 was modified to RELAP...

B. Y. Lee, C. Park, H. I. Kim, S. Y. Lee

1996-01-01

13

Analysis of accidents during flashing operations  

E-print Network

In this thesis, the relative impacts of flashing signal operation versus regular signal operation were evaluated in several cities and towns in the State of Texas. Analyses were conducted to determine whether an increase in accidents and accident severity...

Obermeyer, Michael Edward

2012-06-07

14

Accident analysis and DOE criteria  

SciTech Connect

In analyzing the radiological consequences of major accidents at DOE facilities one finds that many facilities fall so far below the limits of DOE Order 6430 that compliance is easily demonstrated by simple analysis. For those cases where the amount of radioactive material and the dispersive energy available are enough for accident consequences to approach the limits, the models and assumptions used become critical. In some cases the models themselves are the difference between meeting the criteria or not meeting them. Further, in one case, we found that not only did the selection of models determine compliance but the selection of applicable criteria from different chapters of Order 6430 also made the difference. DOE has recognized the problem of different criteria in different chapters applying to one facility, and has proceeded to make changes for the sake of consistency. We have proposed to outline the specific steps needed in an accident analysis and suggest appropriate models, parameters, and assumptions. As a result, we feel DOE siting and design criteria will be more fairly and consistently applied.

Graf, J.M.; Elder, J.C.

1982-01-01

15

Accident progression event tree analysis for postulated severe accidents at N Reactor  

SciTech Connect

A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
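
Latin Hypercube sampling, the uncertainty propagation technique named here, divides each uncertain input's range into equal-probability strata and samples every stratum exactly once, pairing strata randomly across inputs. A minimal Python sketch (with generic placeholder distributions, not the N Reactor parameters) is:

    # Minimal Latin Hypercube sampler: each of n equal-probability strata of each
    # input is sampled exactly once, and strata are shuffled so the pairing across
    # inputs is random. The input distributions below are illustrative placeholders.
    import random

    def latin_hypercube(n_samples, inverse_cdfs, seed=0):
        """inverse_cdfs maps a parameter name to its inverse CDF (quantile function)."""
        rng = random.Random(seed)
        samples = [dict() for _ in range(n_samples)]
        for name, inv_cdf in inverse_cdfs.items():
            # one uniform draw inside each of the n equal-probability strata
            strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
            rng.shuffle(strata)                 # random pairing across parameters
            for sample, u in zip(samples, strata):
                sample[name] = inv_cdf(u)
        return samples

    # Example with two assumed, uncertain inputs and simple quantile functions.
    inverse_cdfs = {
        "release_fraction": lambda u: 10 ** (-3 + 2 * u),    # log-uniform, 1e-3 to 1e-1
        "failure_pressure": lambda u: 60.0 + 40.0 * u,       # uniform, 60 to 100 (assumed units)
    }
    for s in latin_hypercube(5, inverse_cdfs):
        print(s)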

Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M. (Sandia National Labs., Albuquerque, NM (USA)); Medford, G.T. (Science Applications International Corp., Albuquerque, NM (USA))

1990-06-01

16

MELCOR accident analysis for ARIES-ACT  

E-print Network

MELCOR accident analysis for ARIES-ACT. Paul Humrickhouse, Brad Merrill, INL Fusion Safety Program. Ultimate decay heat removal in an accident is via natural circulation in the water loop, which runs ... MELCOR is a code originally designed to model severe accident

California at San Diego, University of

17

Probabilistic analysis of air carrier accidents  

NASA Technical Reports Server (NTRS)

In order to estimate the potential risks due to carbon fibers (CF) released from aircraft accidents, it was necessary to quantify the probability of an accident or incident at a major hub airport. This probability was contingent upon various conditions surrounding the incident including the phase of operation, aircraft type, and the weather conditions. The type of accident predicted was categorized according to its location relative to the runway and the severity of damage sustained. The methodology utilized to estimate the probability of a specific type of accident is outlined and the various models that were developed in the course of this work are described.
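
The conditioning described here, accident probability contingent on phase of operation, aircraft type, and weather, amounts to a total-probability calculation over those conditions. A toy Python sketch with invented numbers (not the study's data) illustrates the structure:

    # Total probability of an accident per operation, decomposed over operating
    # phase and weather. All conditional rates and weights are hypothetical.

    # P(accident | phase, weather), per operation
    p_accident = {
        ("takeoff", "VMC"): 2e-7, ("takeoff", "IMC"): 6e-7,
        ("landing", "VMC"): 4e-7, ("landing", "IMC"): 1.5e-6,
    }
    # fraction of operations falling in each (phase, weather) cell
    weight = {
        ("takeoff", "VMC"): 0.40, ("takeoff", "IMC"): 0.10,
        ("landing", "VMC"): 0.40, ("landing", "IMC"): 0.10,
    }

    p_total = sum(p_accident[cell] * weight[cell] for cell in p_accident)
    print(f"P(accident per operation) = {p_total:.2e}")

    # Expected accidents per year for an assumed 400,000 annual operations
    print(f"expected accidents/year   = {p_total * 4.0e5:.3f}")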

1979-01-01

18

Analysis of a fungal contamination accident at a public library in Rio de Janeiro  

Microsoft Academic Search

The applicability of a methodology to analyze large industrial accidents using social-technical analysis, developed by the Center of Studies on Worker's Health and Human Ecology (CESTEH/Fiocruz), was tested on a fungal contamination accident that occurred in December 1997 at a public library in Rio de Janeiro. The accident was due to problems in controlling the ambient temperature, which resulted in discomfort,

Maria Cristina Strausz; Jorge Mesquita Huet Machado

19

ACCIDENT ANALYSIS AND HAZARD ANALYSIS FOR HUMAN AND ORGANIZATIONAL FACTORS  

E-print Network

ACCIDENT ANALYSIS AND HAZARD ANALYSIS FOR HUMAN AND ORGANIZATIONAL FACTORS, by MARGARET V. ... play in system safety during the development of these systems, accidents will occur. Safe design

Leveson, Nancy

20

Aircraft Loss-of-Control Accident Analysis  

NASA Technical Reports Server (NTRS)

Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

Belcastro, Christine M.; Foster, John V.

2010-01-01

21

A systems approach to food accident analysis  

E-print Network

Foodborne illnesses lead to 3000 deaths per year in the United States. Some industries, such as aviation, have made great strides in increasing safety through careful accident analysis leading to changes in industry practices. ...

Helferich, John D

2011-01-01

22

A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences  

SciTech Connect

This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs.

Budnitz, R.J.; Lambert, H.E.; Apostolakis, G.; and others

1998-04-01

23

An analysis of aircraft accidents involving fires  

NASA Technical Reports Server (NTRS)

All U. S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

1975-01-01

24

Accident analysis of nuclear power plants  

Microsoft Academic Search

Advanced methods of nuclear power plant accident analysis are aimed at assessing the risk arising from plant operation by use of probability considerations. The recent state of plant safety analysis is dealt with. This analysis includes recording of data gained from nuclear power plant operation and the assessment of malfunctions of components and systems. 27 references.

B. Kunze; H. Eichhorn

1973-01-01

25

HTGR severe accident sequence analysis  

SciTech Connect

Thermal-hydraulic, fission product transport, and atmospheric dispersion calculations are presented for hypothetical severe accident release paths at the Fort St. Vrain (FSV) high temperature gas cooled reactor (HTGR). Off-site radiation exposures are calculated for assumed release of 100% of the 24 hour post-shutdown core xenon and krypton inventory and 5.5% of the iodine inventory. The results show conditions under which dose avoidance measures would be desirable and demonstrate the importance of specific release characteristics such as effective release height. 7 tables.

Harrington, R.M.; Ball, S.J.; Kornegay, F.C.

1982-01-01

26

Anthropotechnological analysis of industrial accidents in Brazil.  

PubMed Central

The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worth while if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

Binder, M. C.; de Almeida, I. M.; Monteau, M.

1999-01-01

27

Improved dose assessment in a nuclear reactor accident using the old and new ICRP methodologies  

E-print Network

IMPROVED DOSE ASSESSMENT IN A NUCLEAR REACTOR ACCIDENT USING THE OLD AND NEW ICRP METHODOLOGIES. A Thesis by SUK-CHUL YOON, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE, May 1987. Major Subject: Nuclear Engineering. Approved as to style and content by: John W. Poston (Chairm...

Yoon, Suk-Chul

2012-06-07

28

Single pilot IFR accident data analysis  

NASA Technical Reports Server (NTRS)

The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in trends and cause-effect relationships reported in the earlier study. The increasing numbers have been tied to measures of activity to produce accident rates which in turn were analyzed in terms of change. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of involved pilots. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions and their unique attributes delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

Harris, D. F.; Morrisete, J. A.

1982-01-01

29

A STAMP ANALYSIS OF THE LEX COMAIR 5191 ACCIDENT  

E-print Network

A STAMP ANALYSIS OF THE LEX COMAIR 5191 ACCIDENT. Thesis submitted in partial fulfilment ... Paul S. Nelson. ... "pressure" (Dekker, 2007, p. 131). A new, holistic systems perspective accident model is used for analysis

Leveson, Nancy

30

The Methodology of Search Log Analysis  

E-print Network

Chapter VI: The Methodology of Search Log Analysis. Bernard J. Jansen, Pennsylvania State University. ... of and foundation for conducting Web search transaction log analysis. A search log analysis methodology is outlined consisting of three stages (i.e., collection, preparation, and analysis). The three stages of the methodology

Jansen, James

31

Analysis of commercial mini-bus accidents  

Microsoft Academic Search

This paper presents a study of mini-bus traffic accidents aimed at gaining insight into the factors affecting accident occurrence and severity. Understanding these factors can help to bring forth realistic strategies to improve the safety of these buses. Two disaggregate models related to the time until accident occurrence and the number of accident injuries were specified and estimated. The models

Mohammad M. Hamed; A. S. Jaradat; Said M. Easa

1998-01-01

32

Canister Storage Building (CSB) Design Basis Accident Analysis Documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, “Canister Storage Building Final Safety Analysis Report.” All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

CROWE, R.D.; PIEPHO, M.G.

2000-03-23

33

Canister storage building design basis accident analysis documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, “Canister Storage Building Final Safety Analysis Report.” All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

KOPELIC, S.D.

1999-02-25

34

Canister Storage Building (CSB) Design Basis Accident Analysis Documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, “Canister Storage Building Final Safety Analysis Report.” All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

CROWE, R.D.

1999-09-09

35

Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, “Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR).” All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

PIEPHO, M.G.

1999-10-20

36

A multiple discriminant analysis of vessel accidents  

Microsoft Academic Search

A large sample of 936 vessel accident cases occurring between 1979 and 1987 on the lower Mississippi River was cluster analyzed to generate four groups relatively unique in their respective attribute values. The attributes used to cluster the accidents included participation in the U.S. Coast Guard's New Orleans Vessel Traffic Service (NOLA-VTS), type of accident, river stage, traffic level, system

Louis A. Le Blanc; Conway T. Rucks

1996-01-01

37

Dynamic analysis of the Chernobyl accident  

Microsoft Academic Search

On April 26, 1986, the worst accident in the nuclear industry occurred at the Chernobyl Unit 4 reactor in the USSR. Initially, many causes were postulated for the accident; ultimately, a reactivity transient was identified as the driver for the power buildup leading to the core disruptive event. After the accident, the Hanford N-Reactor was put under close scrutiny to

H. Toffer; R. W. Twitchell

1987-01-01

38

A POTENTIAL APPLICATION OF UNCERTAINTY ANALYSIS TO DOE-STD-3009-94 ACCIDENT ANALYSIS  

SciTech Connect

The objective of this paper is to assess proposed transuranic waste accident analysis guidance and recent software improvements in a Windows-OS version of MACCS2 that allows input of parameter uncertainty. With this guidance and code capability, there is the potential to perform a quantitative uncertainty assessment of unmitigated accident releases with respect to the 25 rem Evaluation Guideline (EG) of DOE-STD-3009-94 CN3 (STD-3009). Historically, the classification of safety systems in a U.S. Department of Energy (DOE) nuclear facility's safety basis has involved how subject matter experts qualitatively view uncertainty in the STD-3009 Appendix A accident analysis methodology; specifically, whether consequence uncertainty could be larger than previously evaluated, such that the site-specific accident consequences may challenge the EG. This paper assesses whether a potential uncertainty capability for MACCS2 could provide a stronger technical basis as to when the consequences from a design basis accident (DBA) truly challenge the 25 rem EG.
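
The question posed here, whether consequence uncertainty could push an unmitigated design basis accident dose across the 25 rem Evaluation Guideline, can be illustrated with a simple Monte Carlo propagation. The sketch below is not MACCS2; the dose chain and the lognormal input distributions are placeholders chosen only to show how a dose distribution would be compared against the EG:

    # Toy uncertainty propagation for an unmitigated release dose estimate:
    #   dose = source term (Ci) * chi/Q (s/m^3) * breathing rate (m^3/s) * DCF (rem/Ci)
    # All distributions and point values are hypothetical placeholders.
    import random

    rng = random.Random(42)
    N, EG_REM = 10_000, 25.0

    doses = []
    for _ in range(N):
        source_term = rng.lognormvariate(6.9, 0.7)    # Ci released (median ~1000)
        chi_over_q  = rng.lognormvariate(-9.0, 1.0)   # s/m^3 at the receptor
        breathing   = 3.3e-4                          # m^3/s, point value
        dcf         = 5.0e5                           # rem/Ci inhaled, point value
        doses.append(source_term * chi_over_q * breathing * dcf)

    doses.sort()
    print(f"median dose          : {doses[N // 2]:.1f} rem")
    print(f"95th percentile dose : {doses[int(0.95 * N)]:.1f} rem")
    print(f"fraction above EG    : {sum(d > EG_REM for d in doses) / N:.3f}")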

Palmrose, D E; Yang, J M

2007-05-10

39

PERSPECTIVES ON DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS

SciTech Connect

Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.
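
The 'simple dose calculation' this abstract refers to follows the standard inhalation dose chain, so the sensitivity to input choices such as the breathing rate can be shown directly. The numbers in the sketch below are generic illustrations, not the paper's inputs:

    # Inhalation dose = Q * (chi/Q) * BR * DCF, where
    #   Q     : quantity released (Ci)
    #   chi/Q : atmospheric dispersion factor at the receptor (s/m^3)
    #   BR    : breathing rate (m^3/s)
    #   DCF   : inhalation dose conversion factor (rem per Ci inhaled)
    # Values are placeholders used only to show the relative effect of input choices.

    def inhalation_dose(q_ci, chi_over_q, breathing_rate, dcf):
        return q_ci * chi_over_q * breathing_rate * dcf

    Q, CHI_Q, DCF = 100.0, 2.0e-4, 5.0e5

    light_activity = inhalation_dose(Q, CHI_Q, 3.3e-4, DCF)   # lower breathing rate
    heavy_work     = inhalation_dose(Q, CHI_Q, 4.7e-4, DCF)   # higher breathing rate

    print(f"dose with lower breathing rate : {light_activity:.1f} rem")
    print(f"dose with higher breathing rate: {heavy_work:.1f} rem")
    print(f"ratio                          : {heavy_work / light_activity:.2f}")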

K.; Jonathan Lowrie; David Thoman; Austin Keller

2008-07-30

40

Categorizing accident sequences in the external radiotherapy for risk analysis  

PubMed Central

Purpose: This study identifies accident sequences from past accidents in order to support the application of risk analysis to external radiotherapy. Materials and Methods: This study reviews 59 accidental cases in two retrospective safety analyses that have collected incidents in external radiotherapy extensively. Two accident analysis reports that accumulated past incidents are investigated to identify accident sequences, including initiating events, failure of safety measures, and consequences. This study classifies the accidents by treatment stage and source of error for initiating events, type of failure in the safety measures, and type of undesirable consequence and the number of affected patients. Then, the accident sequences are grouped into several categories on the basis of similarity of progression. As a result, these cases can be categorized into 14 groups of accident sequences. Results: The result indicates that risk analysis needs to pay attention not only to the planning stage, but also to the calibration stage that is performed prior to the main treatment process. It also shows that human error is the largest contributor to initiating events as well as to the failure of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in calibration. Conclusion: This study is expected to provide insights into the accident sequences for prospective risk analysis through the review of experiences. PMID:23865005
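
The event tree analysis the paper illustrates for a calibration-initiated sequence can be sketched generically: an initiating-event frequency is multiplied along branches representing success or failure of each safety measure, giving the frequency of each end state. The structure and numbers below are hypothetical and are not taken from the paper:

    # Generic event tree quantification: initiating-event frequency multiplied by
    # the success/failure probabilities of each safety barrier along a branch.
    # Frequencies and failure probabilities are illustrative assumptions only.

    INITIATOR = ("calibration error", 0.1)           # events per year (assumed)
    BARRIERS = [("independent output check", 0.05),  # P(barrier fails), assumed
                ("in-vivo dosimetry",        0.20)]

    def sequences(initiator, barriers):
        name, freq = initiator
        results = []
        for bits in range(2 ** len(barriers)):
            f, path = freq, [name]
            for i, (barrier, p_fail) in enumerate(barriers):
                failed = bool((bits >> i) & 1)
                f *= p_fail if failed else (1.0 - p_fail)
                path.append(f"{barrier} {'fails' if failed else 'works'}")
            results.append((f, " -> ".join(path)))
        return sorted(results, reverse=True)

    for freq, path in sequences(INITIATOR, BARRIERS):
        print(f"{freq:9.2e}/yr  {path}")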

2013-01-01

41

Cognitive failure analysis for aircraft accident investigation  

Microsoft Academic Search

The present studies were undertaken to investigate the applicability of an information processing approach to human failure in the aircraft cockpit. Using data obtained from official aircraft accident investigation reports, a database of accidents and incidents involving New Zealand civil aircraft between 1982 and 1991 was compiled. In the first study, reports were coded into one of three error stages

David O'Hare; Mark Wiggins; Richard Batt; Dianne Morrison

1994-01-01

42

Aircraft accidents : method of analysis

NASA Technical Reports Server (NTRS)

This report is a revision of NACA-TR-357. It was prepared by the Committee on Aircraft Accidents. The purpose of this report is to provide a basis for the classification and comparison of aircraft accidents, both civil and military.

1937-01-01

43

Analysis of accidents during instrument approaches  

NASA Technical Reports Server (NTRS)

General aviation and air taxi approach phase accidents, which occurred under VFR and IFR respectively over the last 25 years, were analyzed. The data suggest that the risk during the approach and landing phase of VFR flights is roughly twice (204 percent of) that during similar IFR operations (14.82 vs 7.27 accidents/100,000 approaches). Alarmingly, the night single pilot IFR (SPIFR) accident rate is almost 8 times the rate of day IFR, 35.43 vs 4.47 accidents/100,000 approaches, and two and a half times that of day VFR approaches, 35.43 vs 14.82 accidents/100,000 approaches. Surprisingly, the overall SPIFR accident rates are not much higher than dual-pilot IFR (DPIFR), 7.27 vs 6.48 accidents/100,000 approaches. The generally static ratio of SPIFR/DPIFR accident rates may be accounted for by little or no change in general aviation cockpit technology during the last 25 years, and because IFR operational flight task management training has not kept pace.
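
The comparisons quoted in this abstract are simple ratios of accidents per 100,000 approaches; the short sketch below reproduces that arithmetic using the figures given above:

    # Rate ratios from the figures quoted in the abstract
    # (all values in accidents per 100,000 approaches).
    vfr_approach, ifr_approach = 14.82, 7.27
    night_spifr, day_ifr = 35.43, 4.47
    spifr_overall, dpifr_overall = 7.27, 6.48

    print(f"VFR vs IFR approaches  : {vfr_approach / ifr_approach:.2f}x  (about double)")
    print(f"night SPIFR vs day IFR : {night_spifr / day_ifr:.2f}x  (almost 8 times)")
    print(f"night SPIFR vs day VFR : {night_spifr / vfr_approach:.2f}x  (about 2.5 times)")
    print(f"SPIFR vs DPIFR overall : {spifr_overall / dpifr_overall:.2f}x")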

Bennett, C. T.; Schwirzke, M.

1992-01-01

44

Methodology for Validating Building Energy Analysis Simulations  

SciTech Connect

The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

2008-04-01

45

Rat sperm motility analysis: methodologic considerations  

EPA Science Inventory

The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

46

Analysis of Credible Accidents for Argonaut Reactors  

SciTech Connect

Five areas of potential accidents have been evaluated for the Argonaut-UTR reactors. They are: • insertion of excess reactivity • catastrophic rearrangement of the core • explosive chemical reaction • graphite fire • fuel-handling accident. A nuclear excursion resulting from the rapid insertion of the maximum available excess reactivity would produce only 12 MWs which is insufficient to cause fuel melting even with conservative assumptions. Although precise structural rearrangement of the core would create a potential hazard, it is simply not credible to assume that such an arrangement would result from the forces of an earthquake or other catastrophic event. Even damage to the fuel from falling debris or other objects is unlikely given the normal reactor structure. An explosion from a metal-water reaction could not occur because there is no credible source of sufficient energy to initiate the reaction. A graphite fire could conceivably create some damage to the reactor but not enough to melt any fuel or initiate a metal-water reaction. The only credible accident involving offsite doses was determined to be a fuel-handling accident which, given highly conservative assumptions, would produce a whole-body dose equivalent of 2 rem from noble gas immersion and a lifetime dose equivalent commitment to the thyroid of 43 rem from radioiodines.

Hawley, S. C.; Kathern, R. L.; Robkin, M. A.

1981-04-01

47

The Analysis of a Friendly Fire Accident using a Systems Model of Accidents

E-print Network

The Analysis of a Friendly Fire Accident using a Systems Model of Accidents. N.G. Leveson, Ph.D., Massachusetts Institute of Technology, Cambridge, Massachusetts; University of Victoria, Victoria, Canada. Keywords: accident analysis, accident models. Abstract: In another ... of the socio-technical control structure in which the accident occurred. In order to evaluate this model, we

Leveson, Nancy

48

Simulation analysis of productivity variation affected by accident risk in underground construction operations  

Microsoft Academic Search

The construction industry has had a disproportionately high rate of accidents for its size. Accident statistics have played an important role as a prime indicator for measuring safety performance as well as a framework for evaluating accident prevention programs. However, the current system of statistics collection is based upon post-accident analysis. These data provide factual information regarding the post accident

Sangyoub Lee

2001-01-01

49

OFFSITE RADIOLOGICAL CONSEQUENCE ANALYSIS FOR THE BOUNDING FLAMMABLE GAS ACCIDENT  

SciTech Connect

This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in an SST. The calculation applies reasonably conservative input parameters in accordance with guidance in DOE-STD-3009, Appendix A. The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, “Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses,” requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, “Evaluation Guideline,” of 25 rem total effective dose equivalent in order to identify and evaluate safety-class structures, systems, and components. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation versus a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. A detonation in an SST versus a double-shell tank (DST) was selected as the bounding accident because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.

KRIPPS, L.J.

2005-02-18

50

The covariance between the number of accidents and the number of victims in multivariate analysis of accident related outcomes.  

PubMed

In this study some statistical issues involved in the simultaneous analysis of accident related outcomes of the road traffic process are investigated. Since accident related outcomes like the number of victims, fatalities or accidents show interdependencies, their simultaneous analysis requires that these interdependencies are taken into account. One particular interdependency is that the number of fatal accidents is always smaller than or equal to the number of fatalities, as at least one fatality results from each fatal accident. More generally, when the number of accidents increases, the number of people injured as a result of these accidents will also increase. Since dependencies between accident related outcomes are reflected in the variance-covariance structure of the outcomes, the main focus of the present study is on establishing this structure. As this study shows, it is possible to derive relatively simple expressions for estimates of the variances and covariances of (logarithms of) accident and victim counts. One example reveals a substantial effect of the inclusion of covariance terms in the estimation of a confidence region for a mortality rate. The accuracy of the estimated variance-covariance structure of the accident related outcomes is evaluated using samples of real life accident data from The Netherlands. Additionally, the effect of small expected counts on the variance estimate of the logarithm of the counts is investigated. PMID:15949448
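
The kind of variance-covariance structure derived in the study can be illustrated empirically: simulate yearly accident counts, give each accident at least one victim, and estimate the covariance between the logarithms of the two counts. The simulation parameters in the sketch are arbitrary illustrations, not the Dutch data:

    # Empirical illustration of the covariance between (logs of) yearly accident
    # and victim counts, where every accident produces at least one victim.
    # All simulation parameters are arbitrary, not taken from the study.
    import math, random

    rng = random.Random(1)
    YEARS, MEAN_ACCIDENTS = 5000, 200.0

    def cov(xs, ys):
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)

    log_acc, log_vic = [], []
    for _ in range(YEARS):
        # yearly accident count: normal approximation to a Poisson, floored at 1
        n_acc = max(1, round(rng.gauss(MEAN_ACCIDENTS, math.sqrt(MEAN_ACCIDENTS))))
        # each accident yields one victim plus, occasionally, extra victims
        n_vic = sum(1 + (rng.random() < 0.3) + (rng.random() < 0.1) for _ in range(n_acc))
        log_acc.append(math.log(n_acc))
        log_vic.append(math.log(n_vic))

    v_a, v_v, c_av = cov(log_acc, log_acc), cov(log_vic, log_vic), cov(log_acc, log_vic)
    print(f"var(log accidents)              = {v_a:.5f}")
    print(f"var(log victims)                = {v_v:.5f}")
    print(f"cov(log accidents, log victims) = {c_av:.5f}")
    print(f"correlation                     = {c_av / math.sqrt(v_a * v_v):.3f}")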

Bijleveld, F D

2005-07-01

51

Recent Methodological Progress in Cadmium Analysis  

Microsoft Academic Search

The state-of-the-art of methodology for cadmium analysis in biological and environmental materials on the basis of recent progress is discussed. There is a remarkable gain in sensitivity and reliability for atomic spectroscopy, mainly for graphite furnace techniques but to some extent also for flame atomic absorption with Zeeman background correction. Further, the introduction of new commercial polarographic analyzers offering square

Markus Stoeppler

1986-01-01

52

Accident Sequence Evaluation Program: Human reliability analysis procedure  

SciTech Connect

This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the “ASEP HRA Procedure,” is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

Swain, A.D.

1987-02-01

53

Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1  

SciTech Connect

NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.
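
The chained structure summarized here, accident frequency, accident progression, source term, and consequence analyses tied together with sampled uncertainties, can be shown schematically. The sketch below only illustrates the integration idea; the component models and distributions are invented placeholders (NUREG-1150 itself used Latin Hypercube sampling and expert-elicited distributions):

    # Schematic risk integration across PRA analysis components, with parameter
    # uncertainty propagated by simple random sampling. All component models and
    # distributions below are hypothetical placeholders.
    import random

    rng = random.Random(7)
    N_SAMPLES = 2000

    def accident_frequency(u):                 # accident frequency analysis (per reactor-year)
        return 10 ** (-5 + 1.5 * u["freq"])

    def containment_failure_probability(u):    # accident progression analysis
        return 0.02 + 0.2 * u["prog"]

    def release_fraction(u):                   # source term analysis
        return 10 ** (-3 + 2 * u["source"])

    def consequence(rel_frac, u):              # consequence analysis (person-rem per event)
        return rel_frac * 5.0e6 * (0.5 + u["conseq"])

    risks = []
    for _ in range(N_SAMPLES):
        u = {k: rng.random() for k in ("freq", "prog", "source", "conseq")}
        rel = release_fraction(u)
        risks.append(accident_frequency(u)
                     * containment_failure_probability(u)
                     * consequence(rel, u))

    risks.sort()
    print(f"mean risk          : {sum(risks) / N_SAMPLES:.3e} person-rem/yr")
    print(f"5th-95th percentile: {risks[int(0.05 * N_SAMPLES)]:.3e} to {risks[int(0.95 * N_SAMPLES)]:.3e}")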

Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Murfin, W.B. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States); Hora, S.C. [Hawaii Univ., Hilo, HI (United States)

1993-12-01

54

MELCOR accident analysis for ARIES-ACT  

SciTech Connect

We model a loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with LiPb as coolant and breeder, a helium cooled steel structural ring and tungsten divertors, a thin-walled, helium cooled vacuum vessel, and a room temperature water-cooled shield outside the vacuum vessel. The water heat transfer system is designed to remove heat by natural circulation during a LOFA. The MELCOR model uses time-dependent decay heats for each component determined by 1-D modeling. The MELCOR model shows that, despite periodic boiling of the water coolant, structures are kept adequately cool by the passive safety system.
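
The balance described, a decaying heat source against passive, natural-circulation heat removal, can be illustrated with a lumped-parameter energy balance. This is only an orientation sketch under invented parameters, not the ARIES-ACT MELCOR model or its 1-D decay heat inputs:

    # Lumped-parameter sketch of one component's temperature during a LOFA:
    # a decaying decay-heat source versus passive heat removal to the water loop.
    # All parameters are invented placeholders, not ARIES-ACT values.

    M_CP   = 5.0e7     # component heat capacity, J/K (assumed)
    UA     = 2.0e4     # passive heat-removal conductance to the water loop, W/K (assumed)
    T_SINK = 40.0      # water loop temperature, deg C (assumed)

    def decay_heat(t_s, p0=6.0e6, half_life_s=8 * 3600.0):
        """Crude single-exponential stand-in for the time-dependent decay heat (W)."""
        return p0 * 0.5 ** (t_s / half_life_s)

    T, dt = 500.0, 60.0                       # initial temperature (deg C), time step (s)
    for step in range(int(72 * 3600 / dt)):   # simulate 72 hours
        t = step * dt
        T += (decay_heat(t) - UA * (T - T_SINK)) / M_CP * dt
        if step % int(12 * 3600 / dt) == 0:
            print(f"t = {t / 3600:5.1f} h   decay heat = {decay_heat(t) / 1e6:5.2f} MW   T = {T:6.1f} C")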

Paul W. Humrickhouse; Brad J. Merrill

2012-08-01

55

Human factors review for Severe Accident Sequence Analysis (SASA)  

SciTech Connect

The paper will discuss work being conducted during this human factors review, including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate contributions to SASA analyses from human factors data and methods. A descriptive model was developed, called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and which is useful for identifying needs/deficiencies in the area of accident management. The assessment of human factors issues related to ATWS required extensive coordination with SASA analysts. The analysis was consolidated primarily to six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitate a knowledge-based emergency response by the operators. The FOAM model provides a functionally-oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid/minimize core degradation. Operators must also respond to potential radiological release beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure.

Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

1984-01-01

56

Analysis of tritium mission FMEF/FAA fuel handling accidents

Microsoft Academic Search

The Fuels Material Examination Facility/Fuel Assembly Area is proposed to be used for fabrication of mixed oxide fuel to support the Fast Flux Test Facility (FFTF) tritium/medical isotope mission. The plutonium isotope mix for the new mission is different than that analyzed in the FMEF safety analysis report. A reanalysis was performed of three representative accidents for the revised plutonium

Van Keuren

1997-01-01

57

Three dimensional effects in analysis of PWR steam line break accident  

E-print Network

A steam line break accident is one of the possible severe abnormal transients in a pressurized water reactor. It is required to present an analysis of a steam line break accident in the Final Safety Analysis Report (FSAR) ...

Tsai, Chon-Kwo

58

INDUSTRIAL/MILITARY ACTIVITY-INITIATED ACCIDENT SCREENING ANALYSIS  

SciTech Connect

Impacts due to nearby installations and operations were determined in the Preliminary MGDS Hazards Analysis (CRWMS M&O 1996) to be potentially applicable to the proposed repository at Yucca Mountain. This determination was conservatively based on limited knowledge of the potential activities ongoing on or off the Nevada Test Site (NTS). It is intended that the Industrial/Military Activity-Initiated Accident Screening Analysis provided herein will meet the requirements of the “Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants” (NRC 1987) in establishing whether this external event can be screened from further consideration or must be included as a design basis event (DBE) in the development of accident scenarios for the Monitored Geologic Repository (MGR). This analysis only considers issues related to preclosure radiological safety. Issues important to waste isolation as related to impact from nearby installations will be covered in the MGR performance assessment.

D.A. Kalinich

1999-09-27

59

Theories of radiation effects and reactor accident analysis  

SciTech Connect

Muckerheide's paper was a public breakthrough on how one might assess the public health effects of low-level radiation. By the organization of a wealth of data, including the consequences of Hiroshima and Nagasaki but not including Chernobyl, he was able to conclude that present radioactive waste disposal and cleanup efforts need to be much less arduous than forecast by the U.S. Department of Energy, which, together with regulators, uses the linear hypothesis of radiation damage to humans. While the linear hypothesis is strongly defended and even recommended for extension to noncarcinogenic pollutants, exploration of a conservative threshold for very low level exposures could save billions of dollars in disposing of radioactive waste, enhance the understanding of reactor accident consequences, and assist in the development of design and operating criteria pertaining to severe accidents. In this context, the authors discuss the major differences between design-basis and severe accidents. The authors propose that what should ultimately be done is to develop a regulatory formula for severe-accident analysis that relates the public health effects to the amount and type of radionuclides released and distributed by the Chernobyl accident. Answers to the following important questions should provide the basis of this study: (1) What should be the criteria for distinguishing between design-basis and severe accidents, and what should be the basis for these criteria? (2) How do, and should, these criteria differ for older plants, newer operating plants, type of plant (i.e., gas cooled, water cooled, and liquid metal), advanced designs, and plants of the former Soviet Union? (3) How safe is safe enough?

Williams, P.M. [Peter M. Williams, Potomac, MD (United States); Ball, S.J. [Oak Ridge National Laboratory, TN (United States)

1996-12-31

60

Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident  

NASA Technical Reports Server (NTRS)

Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined the following: the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep, and that the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as in the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; also, the captain testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data supports the hypothesis that fatigue was a factor that affected crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that the fatigue had an impact on specific actions involved in the occurrence of the accident.

Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

1994-01-01

61

Comparing a multi-linear (STEP) and systemic (FRAM) method for accident analysis  

Microsoft Academic Search

Accident models and analysis methods affect what accident investigators look for, which contributory factors are found, and which recommendations are issued. This paper contrasts the Sequentially Timed Events Plotting (STEP) method and the Functional Resonance Analysis Method (FRAM) for accident analysis and modelling. The main issue addressed in this paper is the comparison of the established multi-linear method STEP with

I. A. Herrera; R. Woltjer

2010-01-01

62

RAMS (Risk Analysis - Modular System) methodology  

SciTech Connect

The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

1996-10-01

63

Propulsion system failure analysis - A methodology  

NASA Astrophysics Data System (ADS)

A methodology is advanced for determining failure modes in propulsion systems while minimizing downtime and providing reliable analyses. Immediate management actions are listed, followed by guidelines detailing the methods for organizing a failure-analysis team, conducting a thorough investigation, and developing short- and long-term corrective actions. Control and visibility are considered two key attributes of the failure investigation, and the initial hardware inspection requires documentation and the prevention of evidence loss. An event time line should be delineated for the analysis, and the materials and data analyses are considered extensions of the time line. Failure scenarios and fabrication histories are also needed for the failure analysis of the propulsion system, followed by recurrence control so that the program can be restored.

Biggs, R. E.

1992-07-01

64

Requirements Analysis in the Value Methodology  

SciTech Connect

The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. The varied backgrounds and experiences the team brings to the study yield different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, whether by regulation or by the customer. This paper provides insight into the level of rigor applied to a requirements analysis step and gives some examples of tools and techniques used to ease the management of the requirements, and of the functions those requirements support, for highly complex problems.

Conner, Alison Marie

2001-05-01

65

Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.  

PubMed

Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence, we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform the tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the use of the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the aforementioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation. PMID:19819365
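
As a rough illustration of the fault-tree half of such an analysis (not taken from the cited study; the basement-flooding gates, basic events, and probabilities below are invented), a small AND/OR tree can be evaluated from assumed basic-event probabilities:

```python
# Minimal fault-tree evaluation sketch (hypothetical events and probabilities,
# not taken from the Doytchev & Szwillus case study).
from typing import Dict, Union

Gate = Dict[str, Union[str, list]]

def probability(node: Union[str, Gate], basic: Dict[str, float]) -> float:
    """Recursively evaluate the probability of a fault-tree node.

    Basic events are referenced by name; gates are dicts like
    {"gate": "AND"/"OR", "inputs": [...]} with independent inputs assumed.
    """
    if isinstance(node, str):
        return basic[node]
    p_inputs = [probability(child, basic) for child in node["inputs"]]
    if node["gate"] == "AND":
        prob = 1.0
        for p in p_inputs:
            prob *= p
        return prob
    if node["gate"] == "OR":
        prob_none = 1.0
        for p in p_inputs:
            prob_none *= (1.0 - p)
        return 1.0 - prob_none
    raise ValueError(f"unknown gate {node['gate']}")

# Hypothetical basic-event probabilities (per demand).
basic_events = {
    "drain_valve_left_open": 0.02,     # human error during maintenance
    "level_alarm_fails": 0.05,         # hardware failure
    "operator_misses_alarm": 0.10,     # human error, shaped by performance shaping factors
    "sump_pump_fails_to_start": 0.03,
}

# Top event: basement flooding = water ingress AND failure to detect/stop it.
top_event = {
    "gate": "AND",
    "inputs": [
        "drain_valve_left_open",
        {"gate": "OR", "inputs": ["level_alarm_fails", "operator_misses_alarm"]},
        "sump_pump_fails_to_start",
    ],
}

if __name__ == "__main__":
    print(f"Top-event probability: {probability(top_event, basic_events):.2e}")
```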

Doytchev, Doytchin E; Szwillus, Gerd

2009-11-01

66

Systems biology data analysis methodology in pharmacogenomics  

PubMed Central

Pharmacogenetics aims to elucidate the genetic factors underlying the individual’s response to pharmacotherapy. Coupled with the recent (and ongoing) progress in high-throughput genotyping, sequencing and other genomic technologies, pharmacogenetics is rapidly transforming into pharmacogenomics, while pursuing the primary goals of identifying and studying the genetic contribution to drug therapy response and adverse effects, and existing drug characterization and new drug discovery. Accomplishment of both of these goals hinges on gaining a better understanding of the underlying biological systems; however, reverse-engineering biological system models from the massive datasets generated by the large-scale genetic epidemiology studies presents a formidable data analysis challenge. In this article, we review the recent progress made in developing such data analysis methodology within the paradigm of systems biology research that broadly aims to gain a ‘holistic’, or ‘mechanistic’ understanding of biological systems by attempting to capture the entirety of interactions between the components (genetic and otherwise) of the system. PMID:21919609

Rodin, Andrei S; Gogoshin, Grigoriy; Boerwinkle, Eric

2012-01-01

67

Analysis of PWR RCS Injection Strategy During Severe Accident  

SciTech Connect

Reactor coolant system (RCS) injection is an important strategy for severe accident management of a pressurized water reactor (PWR) system. Maanshan is a typical Westinghouse PWR nuclear power plant (NPP) with large, dry containment. The severe accident management guideline (SAMG) of Maanshan NPP is developed based on the Westinghouse Owners Group (WOG) SAMG. The purpose of this work is to analyze the RCS injection strategy of a PWR system in an overheated core condition. Power is assumed recovered as the vessel water level drops to the bottom of active fuel. The Modular Accident Analysis Program version 4.0.4 (MAAP4) code is chosen as a tool for analysis. A postulated station blackout sequence for Maanshan NPP is cited as a reference case for this analysis. The hot leg creep rupture occurs during the mitigation action with immediate injection after power recovery according to WOG SAMG, which is not desired. This phenomenon is not considered while developing the WOG SAMG. Two other RCS injection methods are analyzed by using MAAP4. The RCS injection strategy is modified in the Maanshan SAMG. These results can be applied to typical PWR NPPs.

Wang, S.-J. [Institute of Nuclear Energy Research, Taiwan (China); Chiang, K.-S. [Institute of Nuclear Energy Research, Taiwan (China); Chiang, S.-C. [Taiwan Power Company, Taiwan (China)

2004-05-15

68

Calculation Notes for Subsurface Leak Resulting in Pool, TWRS FSAR Accident Analysis  

SciTech Connect

This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Subsurface Leaks Resulting in Pool.

Hall, B.W.

1996-09-25

69

Calculation notes for surface leak resulting in pool, TWRS FSAR accident analysis  

SciTech Connect

This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Surface Leaks Resulting in Pool.

Hall, B.W.

1996-09-25

70

Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs  

NASA Technical Reports Server (NTRS)

A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

Tuomela, C. H.; Brennan, M. F.

1980-01-01

71

NASA Accident Precursor Analysis Handbook, Version 1.0  

NASA Technical Reports Server (NTRS)

Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur". At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators". These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take steps that are necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate accident precursors by evaluating anomaly occurrences for their system safety implications and, through both analytical and deliberative methods used to project to other circumstances, identifying those that portend more serious consequences to come if effective corrective action is not taken. APA builds upon existing safety analysis processes currently in practice within NASA, leveraging their results to provide an improved understanding of overall system risk. As such, APA represents an important dimension of safety evaluation; as operational experience is acquired, precursor information is generated such that it can be fed back into system safety analyses to risk-inform safety improvements. Importantly, APA utilizes anomaly data to predict risk, whereas standard reliability and PRA approaches utilize failure data, which are often limited and rare.

Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

2011-01-01

72

Analysis of offsite Emergency Planning Zones (EPZs) for the Rocky Flats Plant. Phase 3, Sitewide spectrum-of-accidents and bounding EPZ analysis  

SciTech Connect

During Phase 3 of the EPZ project, a sitewide analysis will be performed applying a spectrum-of-accidents approach to both radiological and nonradiological hazardous materials release scenarios. This analysis will include the MCA but will be wider in scope and will produce options for the State of Colorado for establishing a bounding EPZ that is intended to more comprehensively update the interim, preliminary EPZ developed in Phase 2. EG&G will propose use of a hazards assessment methodology that is consistent with the DOE Emergency Management Guide for Hazards Assessments and other methods required by DOE orders. This will include hazards, accident, safety, and risk analyses. Using this methodology, EG&G will develop technical analyses for a spectrum of accidents. The analyses will show the potential effects from the spectrum of accidents on the offsite population together with identification of offsite vulnerable zones and areas of concern. These analyses will incorporate state-of-the-art technology for accident analysis, atmospheric plume dispersion modeling, consequence analysis, and the application of these evaluations to the general public population at risk. The analyses will treat both radiological and nonradiological hazardous materials and mixtures of both released accidentally to the atmosphere. DOE/RFO will submit these results to the State of Colorado for the State's use in determining offsite emergency planning zones for the Rocky Flats Plant. In addition, the results will be used for internal Rocky Flats Plant emergency planning.

Petrocchi, A.J.; Zimmerman, G.A.

1994-03-14

73

Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis  

SciTech Connect

The severe accident at the Fukushima Daiichi nuclear plant illustrates the need for continuous improvements through developing and implementing technologies that contribute to safe, reliable and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy – UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving the fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front-end of the fuel cycle, on the reactor operation and on the back-end of the fuel cycle are succinctly described, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and could be updated as the designs converge on their respective final version.

Gilles Youinou; R. Sonat Sen

2013-09-01

74

A Methodology to Estimate the Probability of Major Dam Accidents: The Bow-Tie Approach (Une méthode d'estimation de la probabilité des accidents majeurs de barrages : la méthode du nœud papillon)

E-print Network

A methodology to estimate the probability of major dam accidents: a bow-tie approach. INERIS assesses the probability of major accidents through a methodology based on the analysis

Paris-Sud XI, Université de

75

Cost analysis methodology: Photovoltaic Manufacturing Technology Project  

NASA Astrophysics Data System (ADS)

This report describes work done under Phase 1 of the Photovoltaic Manufacturing Technology (PVMaT) Project. PVMaT is a five-year project to support the translation of research and development in PV technology into the marketplace. PVMaT, conceived as a DOE/industry partnership, seeks to advance PV manufacturing technologies, reduce PV module production costs, increase module performance, and expand US commercial production capacities. Under PVMaT, manufacturers will propose specific manufacturing process improvements that may contribute to the goals of the project, which is to lessen the cost, thus hastening entry into the larger scale, grid-connected applications. Phase 1 of the PVMaT project is to identify obstacles and problems associated with manufacturing processes. This report describes the cost analysis methodology required under Phase 1 that will allow subcontractors to be ranked and evaluated during Phase 2.

Whisnant, R. A.

1992-09-01

76

Cost analysis methodology: Photovoltaic Manufacturing Technology Project  

SciTech Connect

This report describes work done under Phase 1 of the Photovoltaic Manufacturing Technology (PVMaT) Project. PVMaT is a five-year project to support the translation of research and development in PV technology into the marketplace. PVMaT, conceived as a DOE/industry partnership, seeks to advance PV manufacturing technologies, reduce PV module production costs, increase module performance, and expand US commercial production capacities. Under PVMaT, manufacturers will propose specific manufacturing process improvements that may contribute to the goals of the project, which is to lessen the cost, thus hastening entry into the larger scale, grid-connected applications. Phase 1 of the PVMaT project is to identify obstacles and problems associated with manufacturing processes. This report describes the cost analysis methodology required under Phase 1 that will allow subcontractors to be ranked and evaluated during Phase 2.

Whisnant, R.A. (Research Triangle Inst., Research Triangle Park, NC (United States))

1992-09-01

77

A DISCIPLINED APPROACH TO ACCIDENT ANALYSIS DEVELOPMENT AND CONTROL SELECTION  

SciTech Connect

The development and use of a Safety Input Review Committee (SIRC) process promotes consistent and disciplined Accident Analysis (AA) development to ensure that it accurately reflects facility design and operation; and that the credited controls are effective and implementable. Lessons learned from past efforts were reviewed and factored into the development of this new process. The implementation of the SIRC process has eliminated many of the problems previously encountered during Safety Basis (SB) document development. This process has been subsequently adopted for use by several Savannah River Site (SRS) facilities with similar results and expanded to support other analysis activities.

Ortner, T; Mukesh Gupta, M

2007-04-13

78

Analysis of Three Mile Island-Unit 2 accident  

SciTech Connect

The Nuclear Safety Analysis Center (NSAC) of the Electric Power Research Institute has analyzed the Three Mile Island-2 accident. Early results of this analysis were a brief narrative summary, issued in mid-May 1979 and an initial version of this report issued later in 1979 as noted in the Foreword. The present report is a revised version of the 1979 report, containing summaries, a highly detailed sequence of events, a comparison of that sequence of events with those from other sources, 25 appendices, references and a list of abbreviations and acronyms. A matrix of equipment and system actions is included as a folded insert.

Not Available

1980-03-01

79

Combining task analysis and fault tree analysis for accident and incident analysis: A case study from Bulgaria  

Microsoft Academic Search

Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence, we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter

Doytchin E. Doytchev; Gerd Szwillus

2009-01-01

80

BESAFE II: Accident safety analysis code for MFE reactor designs  

NASA Astrophysics Data System (ADS)

The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from an economic and an environmental and safety standpoint. It is the latter which is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns equate to a design's behavior during a worst-case accident scenario which is the loss of coolant accident (LOCA). In this dissertation, we examine the behavior of MFE devices during a LOCA and how this behavior relates to the safety characteristics of the machine; in particular the acute, whole-body, early dose. In doing so, we have produced an accident safety code, BESAFE II, now available to the fusion reactor design community. The Appendix constitutes the User's Manual for BESAFE II. The theory behind early dose calculations including the mobilization of activation products is presented in Chapter 2. Since mobilization of activation products is a strong function of temperature, it becomes necessary to calculate the thermal response of a design during a LOCA in order to determine the fraction of the activation products which are mobilized and thus become the source for the dose. The code BESAFE II is designed to determine the temperature history of each region of a design and determine the resulting mobilization of activation products at each point in time during the LOCA. The BESAFE II methodology is discussed in Chapter 4, followed by demonstrations of its use for two reference design cases: a PCA-Li tokamak and a SiC-He tokamak. Of these two cases, it is shown that the SiC-He tokamak is a better design from an accident safety standpoint than the PCA-Li tokamak. It is also found that doses derived from temperature-dependent mobilization data are different than those predicted using set mobilization categories such as those that involve Piet fractions. This demonstrates the need for more experimental data on fusion materials. The possibility for future improvements and modifications to BESAFE II is discussed in Chapter 6, for example, by adding additional environmental indices such as a waste disposal index. The biggest improvement to BESAFE II would be an increase in the database of activation product mobilization for a larger spectrum of fusion reactor materials. The ultimate goal we have is for BESAFE II to become part of a systems design program which would include economic factors and allow both safety and the cost of electricity to influence design.
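
The abstract's central coupling, a region's LOCA temperature history driving a temperature-dependent mobilization fraction, can be illustrated with a toy calculation; the temperature curve, Arrhenius-style mobilization fit, and inventory below are assumed for illustration and are not BESAFE II data:

```python
# Toy illustration of temperature-dependent mobilization during a LOCA.
# The temperature history, mobilization fit, and inventory are all assumed
# for illustration; they are not BESAFE II data.
import numpy as np

def temperature_history(t_hours: np.ndarray) -> np.ndarray:
    """Assumed first-wall temperature (C) after loss of coolant: decay heat
    drives a rise toward a peak, followed by slow cooling."""
    return 500.0 + 450.0 * (1.0 - np.exp(-t_hours / 12.0)) * np.exp(-t_hours / 200.0)

def mobilization_rate(temp_c: np.ndarray) -> np.ndarray:
    """Assumed Arrhenius-style fraction of an activation product that becomes
    airborne per hour at a given temperature (illustrative fit only)."""
    temp_k = temp_c + 273.15
    return 1e3 * np.exp(-15000.0 / temp_k)   # 1/hour

t = np.linspace(0.0, 240.0, 2401)            # 10-day transient, hours
rate = mobilization_rate(temperature_history(t))

# Cumulative mobilized fraction, capped at 1 (cannot release more than exists).
cumulative = np.minimum(np.cumsum(rate) * (t[1] - t[0]), 1.0)

inventory_bq = 5e14                          # assumed activation-product inventory
print(f"Peak temperature: {temperature_history(t).max():.0f} C")
print(f"Mobilized fraction after 10 days: {cumulative[-1]:.3f}")
print(f"Mobilized activity: {cumulative[-1] * inventory_bq:.2e} Bq")
```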

Sevigny, Lawrence Michael

81

Human factors review for nuclear power plant severe accident sequence analysis  

Microsoft Academic Search

The paper discusses work conducted to: (1) support the severe accident sequence analysis of a nuclear power plant transient based on an assessment of operator actions, and (2) develop a descriptive model of operator severe accident management. Operator actions during the transient are assessed using qualitative and quantitative methods. A function-oriented accident management model provides a structure for developing technical

P. A. Krois; P. M. Haas

1985-01-01

82

ANALYSIS OF INDUSTRIAL ACCIDENTS OCCURRED IN SMALL AND MEDIUM-SIZED FURNITURE MANUFACTURING FIRMS IN TURKEY  

Microsoft Academic Search

In this study, industrial accidents that occurred in small-sized furniture manufacturing firms were analyzed. We conducted interviews with 311 workers in 175 workplaces. Statistical analysis of the collected data identified the factors affecting these industrial accidents. The necessary steps to prevent and minimize these accidents are stated in relation to

Burhanettin UYSAL

2005-01-01

83

Human errors reliability analysis in coal mine accidents based on Gray Relational Theory  

Microsoft Academic Search

Human error is one of the main causes of safety accidents in coal mines, so these accidents can be prevented and reduced by analyzing the factors that contribute to human error. This paper presents a detailed analysis of the contributing factors behind human errors by applying Gray Relational Theory to coal mine accidents. Based upon this
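
Gray relational analysis itself follows a standard recipe: normalize each series, compute its deviation from a reference series, convert deviations to relational coefficients, and average them into a grade. The sketch below applies that recipe to invented coal-mine factor data, not the paper's dataset:

```python
# Gray relational analysis (GRA) sketch with made-up factor data.
import numpy as np

def gray_relational_grades(reference: np.ndarray,
                           factors: np.ndarray,
                           rho: float = 0.5) -> np.ndarray:
    """Return one gray relational grade per factor series.

    reference : shape (n,)   -- e.g. accidents per period
    factors   : shape (m, n) -- m candidate influencing-factor series
    rho       : distinguishing coefficient, conventionally 0.5
    """
    series = np.vstack([reference, factors]).astype(float)
    # Min-max normalize each series so scales are comparable.
    mins = series.min(axis=1, keepdims=True)
    maxs = series.max(axis=1, keepdims=True)
    norm = (series - mins) / (maxs - mins)
    ref, fac = norm[0], norm[1:]

    delta = np.abs(fac - ref)                       # deviation sequences
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean(axis=1)                       # grade = mean coefficient

# Hypothetical quarterly data: accidents vs. candidate human-error factors.
accidents = np.array([12, 15, 9, 14, 11, 8])
factors = np.array([
    [30, 38, 22, 36, 28, 20],    # violations of operating rules
    [5, 6, 5, 6, 5, 4],          # training-hours shortfall
    [70, 72, 71, 69, 73, 70],    # ambient noise level (dB)
])

grades = gray_relational_grades(accidents, factors)
for name, g in zip(["rule violations", "training shortfall", "noise"], grades):
    print(f"{name:>20s}: grade = {g:.3f}")
```

A higher grade indicates a factor series that tracks the accident series more closely and is therefore a stronger candidate cause under this method.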

Jianyi Lan; Meiying Qiao

2010-01-01

84

Accident Analysis and Prevention, 2012 (49), pp 73-77 www.elsevier.com/locate/aap  

E-print Network

Accident Analysis and Prevention, 2012 (49), pp 73-77, www.elsevier.com/locate/aap, doi:10.1016/j.aap.2011.07.013. Motorcyclists' speed and "looked-but-failed-to-see" accidents. ... accidents in which a non-priority road user failed to give way to an approaching motorcyclist without seeing

Paris-Sud XI, Université de

85

Aircraft Accident Prevention: Loss-of-Control Analysis Harry G. Kwatny  

E-print Network

NASA Langley Research Center, MS 161, Hampton, VA, 23681. During the ten-year period 1997-2006, 59% of fatal aircraft accidents were associated with Loss

Kwatny, Harry G.

86

MELCOR ACCIDENT ANALYSIS FOR ARIES-ACT Paul W. Humrickhouse, Brad J. Merrill  

E-print Network

... loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with Li ... We consider here the implications of a loss of flow accident (LOFA) resulting from a loss of offsite power

87

AIR TRAFFIC CONTROL (ATC) RELATED ACCIDENTS AND INCIDENTS: A HUMAN FACTORS ANALYSIS  

Microsoft Academic Search

To date, the nature and role of ATC personnel in aviation accidents and incidents have yet to be fully examined. To remedy this situation, a comprehensive review of ATC-related accidents and incidents that occurred between January 1985 and December 1997 was conducted using records maintained by the NTSB. Results of the analysis revealed that ATC-related accidents and incidents are infrequent

Anthony M. Pape; Douglas A. Wiegmann; Scott Shappell

2001-01-01

88

Predicting System Accidents with Model Analysis During Hybrid Simulation  

NASA Technical Reports Server (NTRS)

Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures , faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

Malin, Jane T.; Fleming, Land D.; Throop, David R.

2002-01-01

89

Extension of ship accident analysis to multiple-package shipments  

SciTech Connect

Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO containers may be placed in one of several holds of a cargo ship. In such cases, the size of a radioactive release resulting from a severe collision with another ship is determined not by the likelihood of compromising a single, robust package but by the probability that a certain fraction of tens or hundreds of individual packagings is compromised. The previous analysis involved a statistical estimation of the frequency of accidents which would result in damage to a cask located in one of seven cargo holds in a collision with another ship. The results were obtained in the form of probabilities (frequencies) of accidents of increasing severity and of release fractions for each level of severity. This paper describes an extension of the same general method in which the multiple packages are assumed to be compacted by an intruding ship's bow until there is no free space in the hold. At such a point, the remaining energy of the colliding ship is assumed to be dissipated by progressively crushing the RAM packagings and the probability of a particular fraction of package failures is estimated by adaptation of the statistical method used previously. The parameters of a common, well characterized packaging, the 6M with 2R inner containment vessel, were employed as an illustrative example of this analysis method. However, the method is readily applicable to other packagings for which crush strengths have been measured or can be estimated with satisfactory confidence.
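
The final step described above can be illustrated crudely with a binomial model: once an accident-severity level implies some per-packaging failure probability, the chance that at least a given fraction of the packagings is compromised follows directly (independent failures assumed; the package count and probabilities below are illustrative, not the paper's values):

```python
# Binomial sketch: probability that a fraction of packagings is compromised.
# Per-package failure probabilities and package count are illustrative only.
from scipy.stats import binom

n_packages = 100                      # assumed ISO-container load of small packagings
severity_levels = {                   # assumed per-packaging failure probability,
    "moderate collision": 0.01,       # conditional on that severity level
    "severe collision": 0.10,
    "extreme collision": 0.40,
}

for label, p_fail in severity_levels.items():
    # Probability that at least 10% of the packagings are breached.
    p_ge_10pct = binom.sf(9, n_packages, p_fail)   # P(X >= 10) = P(X > 9)
    mean_failed = binom.mean(n_packages, p_fail)
    print(f"{label:>20s}: E[failed] = {mean_failed:4.1f}, "
          f"P(>=10% failed) = {p_ge_10pct:.3e}")
```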

Mills, G.S.; Neuhauser, K.S.

1997-11-01

90

Analysis of physical data from the Chernobyl unit 4 accident  

Microsoft Academic Search

Physical data about the Chernobyl accident has been collected inside the Shelter, near the destroyed unit, and from fallout. This paper summarizes accident analyses conducted in 1986 (a short time after the April 26, 1986 accident), analyses conducted in 1995 and 1996 with the Ukrainian Academy of Sciences, analyses by Russia's MinAtom in 1996, and results of continuing analyses. “TECHOCENTRE”

Edward E. Purvis; Vladimir V. Tokarevsky; Y. V. Veryuzsky

1997-01-01

91

N. Bourdet, C. Deck, T. Serre, C. Perrin, M. Llari, R. Willinger, Methodology for a global bicycle real world accidents reconstruction, International Crashworthiness Conference, July 18-20, 2012, Politecnico Milano,  

E-print Network

Methodology for a global bicycle real-world accidents reconstruction. ... are available concerning the head impact loading in case of real accidents. Therefore, the objective

Boyer, Edmond

92

Estimating the causes of traffic accidents using logistic regression and discriminant analysis.  

PubMed

Factors that affect traffic accidents have been analysed in various ways. In this study, we use the methods of logistic regression and discriminant analysis to determine the damages due to injury and non-injury accidents in the Eskisehir Province. Data were obtained from the accident reports of the General Directorate of Security in Eskisehir; 2552 traffic accidents between January and December 2009 were investigated regarding whether they resulted in injury. According to the results, the effects of traffic accidents were reflected in the variables. These results provide a wealth of information that may aid future measures toward the prevention of undesired results. PMID:23837801
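
A generic sketch of this kind of analysis on synthetic accident records (the features, coefficients, and data are invented; the Eskisehir dataset is not reproduced) fits both a logistic regression and a linear discriminant model to an injury/non-injury label:

```python
# Injury vs. non-injury classification sketch on synthetic accident records.
# Features and data are invented for illustration, not the Eskisehir dataset.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2552  # same order of magnitude as the study's accident count

# Synthetic predictors: speed (km/h), driver age, night indicator, wet-road indicator.
speed = rng.normal(55, 15, n)
age = rng.normal(38, 12, n)
night = rng.integers(0, 2, n)
wet = rng.integers(0, 2, n)
X = np.column_stack([speed, age, night, wet])

# Synthetic injury outcome: higher speed, night-time and wet roads raise the odds.
logit = -6.0 + 0.08 * speed + 0.6 * night + 0.5 * wet
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("logistic regression", LogisticRegression(max_iter=1000)),
                    ("linear discriminant", LinearDiscriminantAnalysis())]:
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name:>22s}: holdout accuracy = {acc:.3f}")
```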

Karacasu, Murat; Ergül, Barış; Altin Yavuz, Arzu

2014-12-01

93

LOSS OF COOLANT ACCIDENT AND LOSS OF FLOW ACCIDENT ANALYSIS OF THE ARIES-AT DESIGN E. A. Mogahed, L. El-Guebaly, A. Abdou, P. Wilson, D. Henderson and the ARIES Team  

E-print Network

A loss of coolant accident (LOCA) and loss of flow accident (LOFA) analysis is performed for ARIES-AT, an advanced fusion ... of steel in the reactor is about 600 °C - 700 °C after about 4 days from the onset of the accident

California at San Diego, University of

94

An analysis of evacuation options for nuclear accidents  

SciTech Connect

In this report we consider the threat posed by the accidental release of radionuclides from a nuclear power plant. The objective is to establish relationships between radiation dose and the cost of evacuation under a wide variety of conditions. The dose can almost always be reduced by evacuating the population from a larger area. However, extending the evacuation zone outward will cause evacuation costs to increase. The purpose of this analysis was to provide the Environmental Protection Agency (EPA) a data base for evaluating whether implementation costs and risks averted could be used to justify evacuation at lower doses. The procedures used and results of these analyses are being made available as background information for use by others. We develop cost/dose relationships for 54 scenarios that are based upon the severity of the reactor accident, meteorological conditions during the release of radionuclides into the environment, and the angular width of the evacuation zone. The 54 scenarios are derived from combinations of three accident severity levels, six meteorological conditions and evacuation zone widths of 70°, 90°, and 180°.
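
The 54 scenarios are simply the cross-product of the three study dimensions; a trivial enumeration of that grid, with placeholder cost and averted-dose functions rather than the report's models, looks like this:

```python
# Enumerate the 3 x 6 x 3 = 54 evacuation scenarios described above.
# The cost and dose functions are placeholders, not the report's models.
from itertools import product

severities = ["low", "medium", "high"]
weather = ["A", "B", "C", "D", "E", "F"]            # assumed stability-class labels
widths_deg = [70, 90, 180]

def evacuation_cost(width_deg: int, radius_km: float = 16.0) -> float:
    """Placeholder: cost grows with the evacuated sector area."""
    return 2.0e4 * (width_deg / 360.0) * radius_km ** 2     # dollars

def averted_dose(severity: str, weather_class: str, width_deg: int) -> float:
    """Placeholder person-rem averted; wider sectors capture more of the plume."""
    sev = {"low": 1.0, "medium": 5.0, "high": 25.0}[severity]
    met = 1.0 + 0.2 * weather.index(weather_class)
    return 1.0e3 * sev * met * min(width_deg / 90.0, 1.5)

scenarios = list(product(severities, weather, widths_deg))
assert len(scenarios) == 54

for sev, met, width in scenarios[:5]:               # print a few as a sample
    c, d = evacuation_cost(width), averted_dose(sev, met, width)
    print(f"{sev:>6s}/{met}/{width:3d} deg: cost ${c:9.0f}, averted {d:8.0f} person-rem")
```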

Tawil, J.J.; Strenge, D.L.; Schultz, R.W. [Battelle Memorial Inst., Richland, WA (United States)

1987-11-01

95

An Accident Precursor Analysis Process Tailored for NASA Space Systems  

NASA Technical Reports Server (NTRS)

Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

2010-01-01

96

Analysis of Web Workloads Using the Bootstrap Methodology  

E-print Network

... by extracting the essential characteristics of actual Web traffic. We have used the Bootstrap methodology ... workload generated by Web Polygraph. The analysis of response size included averages and percentiles
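
A minimal sketch of the kind of bootstrap statistic the abstract mentions (averages and percentiles of response size), run on synthetic data rather than Web Polygraph output:

```python
# Bootstrap confidence intervals for the mean and 95th percentile of response size.
# Synthetic log-normal "response sizes" stand in for Web Polygraph output.
import numpy as np

rng = np.random.default_rng(42)
responses = rng.lognormal(mean=8.5, sigma=1.2, size=5000)   # bytes, synthetic

def bootstrap_ci(data, statistic, n_boot=2000, alpha=0.05):
    """Percentile-bootstrap confidence interval for an arbitrary statistic."""
    n = len(data)
    stats = np.array([statistic(rng.choice(data, size=n, replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return statistic(data), lo, hi

for label, stat in [("mean", np.mean),
                    ("95th percentile", lambda x: np.percentile(x, 95))]:
    point, lo, hi = bootstrap_ci(responses, stat)
    print(f"{label:>16s}: {point:10.0f} bytes (95% CI {lo:.0f} - {hi:.0f})")
```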

97

Systematic Analysis Methodology for Mobile Phone's Electrostatic Discharge Soft Failures  

Microsoft Academic Search

A systematic analysis methodology for mobile phone's electrostatic discharge (ESD) soft failures is proposed. The proposed analysis methodology consists of two parallel processes: one is the ESD simulation and the other is the ESD characterization of the mobile phone. The ESD simulation models that consist of the ESD generator, the ESD testing setup, and a mobile phone are also

Ki Hyuk Kim; Yongsup Kim

2011-01-01

98

Potential Threats from a Likely Nuclear Power Plant Accident: a Climatological Trajectory Analysis and Tracer Study  

Microsoft Academic Search

The legacy of Chernobyl is not the only nuclear accident likely to confront Turkish territory, which is not far from other insecure power plants, especially the Metsamor. The main purpose of this study was to examine the possible impacts to Turkish territory of a hypothetical accident at the Metsamor Nuclear Plant. The research was performed based on two different methodologies:

Tayfun Kindap; Ufuk Utku Turuncoglu; Shu-Hua Chen; Alper Unal; Mehmet Karaca

2009-01-01

99

Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K  

SciTech Connect

Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun [Korea Electric Power Research Institute (Korea, Republic of)

2004-10-15

100

Hypothetical accident conditions thermal analysis of the 5320 package  

SciTech Connect

An axisymmetric model of the 5320 package was created to perform hypothetical accident conditions (HAC) thermal calculations. The analyses assume the 5320 package contains 359 grams of plutonium-238 (203 Watts) in the form of an oxide powder at a minimum density of 2.4 g/cc or at a maximum density of 11.2 g/cc. The solution from a non-solar 100 °F ambient steady-state analysis was used as the initial conditions for the fire transient. A 30-minute 1,475 °F fire transient followed by cooling via natural convection and thermal radiation to a 100 °F non-solar environment was analyzed to determine peak component temperatures and vessel pressures. The 5320 package was considered to be horizontally suspended within the fire during the entire transient.
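
A lumped-capacitance caricature of that HAC transient (30-minute fire at 1,475 °F, then cooldown toward a 100 °F environment) is sketched below; the mass, area, and heat-transfer parameters are assumed round numbers, not the 5320 analysis inputs:

```python
# Lumped-capacitance sketch of a hypothetical-accident-condition fire transient:
# 30 min exposure to a 1,475 F fire, then cooling to a 100 F environment.
# Mass, area, and heat-transfer parameters are assumed, not 5320 package values.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2-K^4

def to_kelvin(temp_f: float) -> float:
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

mass, cp = 250.0, 500.0   # kg, J/kg-K   (assumed steel-like package shell)
area = 1.5                # m^2 exposed surface (assumed)
h = 10.0                  # W/m^2-K convective coefficient (assumed)
eps = 0.8                 # surface emissivity (assumed)
decay_heat = 203.0        # W, the plutonium-238 heat load given in the abstract

T = to_kelvin(100.0)      # start from the 100 F steady-state condition
dt, t_end = 1.0, 4.0 * 3600.0
history = []
for step in range(int(t_end / dt)):
    t = step * dt
    T_env = to_kelvin(1475.0) if t < 1800.0 else to_kelvin(100.0)
    q = (h * area * (T_env - T)
         + eps * SIGMA * area * (T_env**4 - T**4)
         + decay_heat)
    T += q * dt / (mass * cp)
    history.append(T)

peak_c = max(history) - 273.15
final_c = history[-1] - 273.15
print(f"Peak shell temperature: {peak_c:.0f} C, after 4 h: {final_c:.0f} C")
```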

Hensel, S.J.; Gromada, R.J.

1995-12-31

101

PRESENTATION OF WORK PERFORMED BY ESReDA "ACCIDENT ANALYSIS" WORKING  

E-print Network

J.P. Pineau, INERIS, France. Summary: The Accident Analysis working group was initiated in January 1993. Three meetings were organised with active participation of representatives of the European Joint

Paris-Sud XI, Université de

102

Analysis of Waste Leak and Toxic Chemical Release Accidents from Waste Feed Delivery (WFD) Diluent System  

SciTech Connect

Radiological and toxicological consequences are calculated for 4 postulated accidents involving the Waste Feed Delivery (WFD) diluent addition systems. Consequences for the onsite and offsite receptor are calculated. This analysis contains technical information used to determine the accident consequences for the River Protection Project (RPP) Final Safety Analysis Report (FSAR).

WILLIAMS, J.C.

2000-09-15

103

Improved Methodology Application for 12-Rad Analysis in a Shielded Facility at SRS  

SciTech Connect

DOE Order 420.1 requires establishing 12-rad evacuation zone boundaries and installing a Criticality Accident Alarm System (CAAS) per the ANS-8.3 standard for facilities having a probability of criticality greater than 10⁻⁶ per year. The H-Canyon at the Savannah River Site (SRS) is one of the reprocessing facilities where SRS reactor fuels, research reactor fuels, and other fissile materials are processed and purified using a modified Purex process called H-Modified or HM Process. This paper discusses an improved methodology for 12-rad zone analysis and its implementation within this large shielded facility that has a large variety of criticality sources and scenarios.
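
One ingredient of any 12-rad zone analysis is solving a dose-versus-distance relation for the radius at which the dose drops to 12 rad; the point-kernel sketch below does only that, with an assumed 1-metre dose and attenuation coefficient standing in for the facility-specific shielded dose model:

```python
# Point-kernel sketch for locating a 12-rad boundary from a postulated criticality.
# The 1-metre dose and attenuation coefficient are assumed placeholders, not the
# H-Canyon analysis values (which account for the facility shielding in detail).
import numpy as np
from scipy.optimize import brentq

dose_1m_rad = 5.0e4      # assumed unshielded dose at 1 m from the fission burst
mu = 0.01                # assumed effective attenuation coefficient, 1/m

def dose_at(r_m: float) -> float:
    """Inverse-square falloff with simple exponential attenuation."""
    return dose_1m_rad * np.exp(-mu * r_m) / r_m ** 2

# Solve dose_at(r) = 12 rad for the evacuation-zone radius.
r_12rad = brentq(lambda r: dose_at(r) - 12.0, 1.0, 1.0e4)
print(f"12-rad boundary at roughly {r_12rad:.0f} m (placeholder parameters)")
```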

Paul, P.

2003-01-31

104

Otorhinolaryngologic disorders and diving accidents: an analysis of 306 divers.  

PubMed

Diving is a very popular leisure activity with an increasing number of participants. As more than 80% of the diving related problems involve the head and neck region, every otorhinolaryngologist should be familiar with diving medical standards. We here present an analysis of more than 300 patients we have treated in the past four years. Between January 2002 and October 2005, 306 patients presented in our department with otorhinological disorders after diving, or after diving accidents. We collected the following data: name, sex, age, date of treatment, date of accident, diagnosis, special aspects of the diagnosis, number of dives, diving certification, whether and which surgery had been performed, history of acute diving accidents or follow up treatment, assessment of fitness to dive and special remarks. The study setting was a retrospective cohort study. The distribution of the disorders was as follows: 24 divers (8%) with external ear disorders, 140 divers (46%) with middle ear disorders, 56 divers (18%) with inner ear disorders, 53 divers (17%) with disorders of the nose and sinuses, 24 divers (8%) with decompression illness (DCI) and 9 divers (3%) who complained of various symptoms. Only 18% of the divers presented with acute disorders. The most common disorder (24%) was Eustachian tube dysfunction. Female divers were significantly more often affected. Chronic sinusitis was found to be associated with a significantly higher number of performed dives. Conservative treatment failed in 30% of the patients but sinus surgery relieved symptoms in all patients of this group. The middle ear is the main problem area for divers. Middle ear ventilation problems due to Eustachian tube dysfunction can be treated conservatively with excellent results whereas pathology of the tympanic membrane and ossicular chain often require surgery. More than four out of five patients visited our department to re-establish their fitness to dive. Although the treatment of acute diving-related disorders is an important field for the treatment of divers, the main need of divers seems to be assessment and recovery of their fitness to dive. PMID:17639445

Klingmann, Christoph; Praetorius, Mark; Baumann, Ingo; Plinkert, Peter K

2007-10-01

105

Threat to Norway from potential accidents at the Kola nuclear power plant. Climatological trajectory analysis and episode studies  

NASA Astrophysics Data System (ADS)

Following the experiences after the Chernobyl accident in 1986, Norwegian Authorities regard the effects from accidental releases at nuclear installations in neighboring countries to be among the greatest environmental threats in the coming years. One of these nuclear installations is the Kola Nuclear Power Plant (Kola NPP). The unsatisfactory safety at the Kola NPP has been of major concern and a 'Norwegian Plan of Action for Nuclear Safety' has been worked out (Ministry of Foreign Affairs, 1995. Plan of action for follow-up activities to Report no. 34 to Norwegian parliament (1993-1994)). As a response to this plan, DNMI has been involved in a project called: 'Consequence Analysis of Potential Accidents at the Kola Nuclear Power Plant'. DNMI's part of the project consisted of analyzing the atmospheric transport and deposition pattern resulting from potential accidents at the Kola NPP. Results based on two different methodologies are presented in this paper. (1) Trajectory analysis as a tool for describing the air pollution transport pattern and screening of a large set of meteorological data for the selection of weather situations suitable for episode studies. (2) Episode studies using DNMI's dispersion model 'Severe Nuclear Accident Program' (SNAP) for the selected episodes.

Saltbones, Jørgen; Foss, Anstein; Bartnicki, Jerzy

106

MACCS usage at Rocky Flats Plant for consequence analysis of postulated accidents  

SciTech Connect

The MELCOR Accident Consequence Code System (MACCS) has been applied to the radiological consequence assessment of potential accidents from a non-reactor nuclear facility. MACCS has been used in a variety of applications to evaluate radiological dose and health effects to the public from postulated plutonium releases and from postulated criticalities. These applications were conducted to support deterministic and probabilistic accident analyses for safety analysis reports, radiological sabotage studies, and other regulatory requests.

Foppe, T.L.; Peterson, V.L.

1993-10-01

107

Social Network Analysis in Human Resource Development: A New Methodology  

Microsoft Academic Search

Through an exhaustive review of the literature, this article looks at the applicability of social network analysis (SNA) in the field of human resource development. The literature review revealed that a number of disciplines have adopted this unique methodology, which has assisted in the development of theory. SNA is a methodology for examining the structure among actors, groups, and organizations and

John-Paul Hatala

2006-01-01

108

300-Area accident analysis for Emergency Planning Zones  

SciTech Connect

The Department of Energy has requested SRL assistance in developing offsite Emergency Planning Zones (EPZs) for the Savannah River Plant, based on projected dose consequences of atmospheric releases of radioactivity from potential credible accidents in the SRP operating areas. This memorandum presents the assessment of the offsite doses via the plume exposure pathway from the 300-Area potential accidents. 8 refs., 3 tabs.

Pillinger, W.L.

1983-06-27

109

Tobit analysis of vehicle accident rates on interstate highways  

Microsoft Academic Search

There has been an abundance of research that has used Poisson models and its variants (negative binomial and zero-inflated models) to improve our understanding of the factors that affect accident frequencies on roadway segments. This study explores the application of an alternate method, tobit regression, by viewing vehicle accident rates directly (instead of frequencies) as a continuous variable that is
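
The core of a tobit model is a censored-normal likelihood; a compact maximum-likelihood sketch fit to synthetic accident-rate data (not the study's interstate data) is:

```python
# Tobit (left-censored at zero) regression sketch via maximum likelihood,
# fit to synthetic accident-rate data rather than the study's interstate data.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
n = 500
aadt = rng.normal(0.0, 1.0, n)          # standardized traffic volume (assumed feature)
curves = rng.normal(0.0, 1.0, n)        # standardized curve density (assumed feature)
X = np.column_stack([np.ones(n), aadt, curves])

beta_true, sigma_true = np.array([0.5, 0.8, 0.4]), 1.0
latent = X @ beta_true + rng.normal(0.0, sigma_true, n)
y = np.maximum(latent, 0.0)             # observed rate; many segments record zero

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)           # keep sigma positive
    xb = X @ beta
    censored = y <= 0.0
    ll = np.where(
        censored,
        stats.norm.logcdf(-xb / sigma),                      # P(latent <= 0)
        stats.norm.logpdf((y - xb) / sigma) - np.log(sigma)  # density of observed y
    )
    return -ll.sum()

start = np.zeros(X.shape[1] + 1)
res = optimize.minimize(neg_loglik, start, method="BFGS")
beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
print("estimated beta :", np.round(beta_hat, 3))
print("estimated sigma:", round(sigma_hat, 3))
```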

Panagiotis Ch. Anastasopoulos; Andrew P. Tarko; Fred L. Mannering

2008-01-01

110

ANALYSIS OF JCO CRITICALITY ACCIDENT FROM VIEWPOINT OF RISK MANAGEMENT  

Microsoft Academic Search

The uranium criticality incident at a JCO plant in 1999 was Japan's worst-ever nuclear accident (INES Level 4). Sixty-nine persons were exposed to radiation, two of whom died. SMM, JCO's holding company, paid out about 11 million dollars in compensation to people and companies in the affected area. The direct cause of the accident was very clear: use of

Satoshi Kurita

111

Probabilistic analysis of accident precursors in the nuclear industry.  

PubMed

Feedback of operating experience has always been an important issue in the nuclear industry. A probabilistic safety analysis (PSA) can be used as a tool to analyse how an operational event might have developed adversely in order to obtain a quantitative assessment of the safety significance of the event. This process is called PSA-based event analysis (PSAEA). A comprehensive set of PSAEA guidelines was developed by an international project. The main characteristics of this methodology are summarised. This approach to analyse incidents can be used to meet different objectives of utilities or nuclear regulators. The paper describes the main objectives and the experiences of the Belgian nuclear regulatory organisation AVN with the application of PSA-based event analysis. Some interesting aspects of the process of PSAEA are further developed and underlined. Several case studies are discussed and an overview of the obtained results is given. Finally, the interest of a broad and interactive forum on PSAEA is highlighted. PMID:15231351

Hulsmans, M; De Gelder, P

2004-07-26

112

DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)  

SciTech Connect

This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work, primarily for enhanced geothermal systems (EGS), sponsored by the U.S. Department of Energy Geothermal Technologies Program.

Young, K. R.; Augustine, C.; Anderson, A.

2010-02-01

113

Protein MAS NMR methodology and structural analysis of protein assemblies  

E-print Network

Methodological developments and applications of solid-state magic-angle spinning nuclear magnetic resonance (MAS NMR) spectroscopy, with particular emphasis on the analysis of protein structure, are described in this thesis. ...

Bayro, Marvin J

2010-01-01

114

Risk analysis using a hybrid Bayesian-approximate reasoning methodology.  

SciTech Connect

Analysts are sometimes asked to make frequency estimates for specific accidents in which the accident frequency is determined primarily by safety controls. Under these conditions, frequency estimates use considerable expert belief in determining how the controls affect the accident frequency. To evaluate and document beliefs about control effectiveness, we have modified a traditional Bayesian approach by using approximate reasoning (AR) to develop prior distributions. Our method produces accident frequency estimates that separately express the probabilistic results produced in Bayesian analysis and possibilistic results that reflect uncertainty about the prior estimates. Based on our experience using traditional methods, we feel that the AR approach better documents beliefs about the effectiveness of controls than if the beliefs are buried in Bayesian prior distributions. We have performed numerous expert elicitations in which probabilistic information was sought from subject matter experts not trained in probability. We find it much easier to elicit the linguistic variables and fuzzy set membership values used in AR than to obtain the probability distributions used in prior distributions directly from these experts because it better captures their beliefs and better expresses their uncertainties.
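
A toy version of the idea described above (elicited linguistic beliefs about control effectiveness shape a prior on accident frequency, which is then updated with operating experience) might look like the following; the fuzzy sets, membership values, mapping to gamma priors, and data are all invented:

```python
# Toy hybrid approximate-reasoning / Bayesian sketch: elicited linguistic beliefs
# about control effectiveness shape a gamma prior on accident frequency, which is
# then updated with (synthetic) operating experience. All numbers are invented.
from scipy import stats

# Elicited fuzzy membership of "control effectiveness" in three linguistic sets.
membership = {"weak": 0.1, "moderate": 0.6, "strong": 0.3}

# Assumed mapping from each linguistic set to a gamma prior (shape, rate in 1/yr)
# on the frequency of the controlled accident.
set_priors = {"weak": (2.0, 4.0), "moderate": (2.0, 20.0), "strong": (2.0, 100.0)}

# Blend the candidate priors using the memberships as weights (one simple way to
# defuzzify; other aggregation rules are equally defensible).
total = sum(membership.values())
alpha0 = sum(m * set_priors[s][0] for s, m in membership.items()) / total
beta0 = sum(m * set_priors[s][1] for s, m in membership.items()) / total

# Operating experience: observed events over exposure time (synthetic).
events, years = 1, 12.0

# Poisson likelihood + gamma prior -> gamma posterior (conjugate update).
alpha_post, beta_post = alpha0 + events, beta0 + years
print(f"prior mean frequency     : {alpha0 / beta0:.3f} per year")
print(f"posterior mean frequency : {alpha_post / beta_post:.3f} per year")
print(f"posterior 95th percentile: "
      f"{stats.gamma.ppf(0.95, alpha_post, scale=1.0 / beta_post):.3f} per year")
```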

Bott, T. F. (Terrence F.); Eisenhawer, S. W. (Stephen W.)

2001-01-01

115

Analysis of Construction Accidents in Turkey and Responsible Parties  

PubMed Central

Construction is one of the world's biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972–2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at time of the accident and party responsible for the accident. Falls (54.1%), struck by thrown/falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) occupy the first four places. The accidents were most likely between the hours 15:00 and 17:00 (22.6%), 10:00–12:00 (18.7%) and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and what acts of negligence typically lead to accidents. Nearly two thirds of the faulty and negligent acts are carried out by employers, and employees are responsible for almost one third of all cases. PMID:24077446

GURCANLI, G. Emre; MUNGEN, Ugur

2013-01-01

116

Shuttle TPS thermal performance and analysis methodology  

NASA Technical Reports Server (NTRS)

Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Δp gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Δp gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Δp gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

1983-01-01

117

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

M. P. Little; C. R. Muirhead; L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; F. T. Harper; S. C. Hora

1997-01-01

118

Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

F. E. Haskin; F. T. Harper; L. H. J. Goossens; B. C. P. Kraan

1997-01-01

119

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; J. Boardman; J. A. Jones; F. T. Harper; M. L. Young; S. C. Hora

1997-01-01

120

Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

F. E. Haskin; F. T. Harper; L. H. J. Goossens; B. C. P. Kraan; J. B. Grupa

1997-01-01

121

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; J. D. Harrison; F. T. Harper; S. C. Hora

1998-01-01

122

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertain assessment. Volume 2: Appendices  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

M. P. Little; C. R. Muirhead; L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; F. T. Harper; S. C. Hora

1997-01-01

123

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; J. D. Harrison; F. T. Harper; S. C. Hora

1998-01-01

124

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The

F. T. Harper; M. L. Young; L. A. Miller; S. C. Hora; C. H. Lui; L. H. J. Goossens; R. M. Cooke; J. Paesler-Sauer; J. C. Helton

1995-01-01

125

A formal and structured approach to the use of task analysis in accident modelling  

Microsoft Academic Search

Recent work (Telford & Johnson, 1996; Johnson, 1997) involving the application of formal notations to analyse accident reports has shown that the quality of these accident reports is poor, so much so that their conclusions can be misleading. The proposed solution has been to use formal notations in combination with traditional analysis to produce a report, the conclusions of which

R. M. Botting; Chris W. Johnson

1998-01-01

126

Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design  

Microsoft Academic Search

Previous studies of the safety and environmental aspects of the HYLIFE-II inertial fusion energy power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity

S. Reyes; J. F. Latkowski; J. Gomez del Rio; J. Sanz

2001-01-01

127

Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA  

SciTech Connect

A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: Analyze existing margins of safety provided by the heavy-water reactor (HWR) design challenged by postulated severe accidents; Compare measures of risk to the general public and onsite workers to guideline values, as well as to those posed by commercial reactor operation; and Develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: A variable baseline plant configuration and power level; and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained.

O'Kula, K.R.; Sharp, D.A. (Westinghouse Savannah River Co., Aiken, SC (United States)); Amos, C.N.; Wagner, K.C.; Bradley, D.R. (Science Applications International Corp., Albuquerque, NM (United States))

1992-01-01

128

BNL severe accident sequence experiments and analysis program  

SciTech Connect

A major source of containment pressurization during severe accidents is the transfer of stored energy from the hot core material to available cooling water. One mode of thermal interaction involves the quench of superheated beds of debris which could be present in the reactor cavity following melt-through or failure of the reactor vessel. This work supports development of models of superheated bed quench phenomena which are to be incorporated into containment analysis computer codes such as MARCH, CONTAIN, and MEDICI. A program directed towards characterization of the behavior of superheated debris beds has been completed. This work addressed the quench of superheated debris which is postulated to exist in the reactor cavity of a PWR following melt ejection from the primary system. The debris is assumed to be cooled by a pool of water overlying the bed of hot debris. This work has led to the development of models to predict rate of steam generation during the quench process and, in addition, the ability to assess the coolability of the debris during the transient quench process. A final report on this work has been completed. This report presents a brief description of some relevant results and conclusions. 15 refs.

Greene, G.A.; Ginsberg, T.; Tutu, N.K.

1985-01-01

129

Accident investigation: Analysis of aircraft motions from ATC radar recordings  

NASA Technical Reports Server (NTRS)

A technique was developed for deriving time histories of an aircraft's motion from air traffic control (ATC) radar records. This technique uses the radar range and azimuth data, along with the downlinked altitude data (from an onboard Mode-C transponder), to derive an expanded set of data which includes airspeed, lift, thrust-drag, attitude angles (pitch, roll, and heading), etc. This method of analyzing aircraft motions was evaluated through flight experiments which used the CV-990 research aircraft and recordings from both the enroute and terminal ATC radar systems. The results indicate that the values derived from the ATC radar records are for the most part in good agreement with the corresponding values obtained from airborne measurements. In an actual accident, this analysis of ATC radar records can complement the flight-data recorders, now onboard airliners, and provide a source of recorded information for other types of aircraft that are equipped with Mode-C transponders but not with onboard recorders.

Wingrove, R. C.

1976-01-01
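
The record above describes recovering an aircraft's motion from radar range, azimuth, and downlinked Mode-C altitude. As a rough illustration of the general idea (not the NASA technique itself), the sketch below converts a few hypothetical radar returns to Cartesian positions and differentiates the track to estimate groundspeed, heading, and climb rate; the sample values and variable names are invented for illustration.

```python
import numpy as np

# Hypothetical ATC radar samples: slant range (m), azimuth (deg), Mode-C altitude (m).
t = np.array([0.0, 4.0, 8.0, 12.0])            # radar sweep times, s
rng = np.array([20000.0, 20350.0, 20710.0, 21080.0])
azi = np.deg2rad(np.array([45.0, 45.6, 46.2, 46.8]))
alt = np.array([3000.0, 3010.0, 3025.0, 3040.0])

# Convert slant range to ground range, then to east/north positions about the radar site.
ground_rng = np.sqrt(np.maximum(rng**2 - alt**2, 0.0))
x = ground_rng * np.sin(azi)    # east, m
y = ground_rng * np.cos(azi)    # north, m

# Finite-difference the track to get velocity components.
vx, vy, vz = np.gradient(x, t), np.gradient(y, t), np.gradient(alt, t)

groundspeed = np.hypot(vx, vy)                   # m/s
heading = np.rad2deg(np.arctan2(vx, vy)) % 360   # degrees from north

for k in range(len(t)):
    print(f"t={t[k]:5.1f} s  GS={groundspeed[k]:6.1f} m/s  "
          f"hdg={heading[k]:5.1f} deg  climb={vz[k]:5.2f} m/s")
```

In practice the raw returns would first be smoothed, and winds and aerodynamic data folded in to recover airspeed and attitude, as the abstract notes.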

130

Analysis of a hypothetical criticality accident in a waste supercompactor  

Microsoft Academic Search

A hypothetical nuclear criticality accident in a waste supercompactor is evaluated. The waste consists of a homogeneous mixture of plutonium 49, beryllium, and air contained in a 35 gallon carbon steel drum. Possible consequences are investigated.

M. J. Plaster; B. Basoglu; C. L. Bentley; M. E. Dunn; A. E. Ruggles; A. Wilkinson; T. Yamamoto; H. L. Dodds

1994-01-01

131

Modeling control room crews for accident sequence analysis  

E-print Network

This report describes a systems-based operating crew model designed to simulate the behavior of a nuclear power plant control room crew during an accident scenario. This model can lead to an improved treatment of potential ...

Huang, Y. (Yuhao)

1991-01-01

132

Rail transportation risk and accident severity: A statistical analysis of variables in FRA's accident/incident data base  

SciTech Connect

The Federal Railroad Administration (US DOT) maintains a file of carrier-reported railroad accidents and incidents that meet stipulated threshold criteria for damage cost and/or casualties. A thoroughly cleaned five-year time series of this data base was subjected to unbiased statistical procedures to discover (a) important causative variables in severe (high damage cost) accidents and (b) other key relationships between objective accident conditions and frequencies. Just under 6000 records, each representing a single event involving rail freight shipments moving on mainline track, were subjected to statistical frequency analysis, then included in the construction of classification and regression trees as described by Breiman et al. (1984). Variables related to damage cost defined the initial "splits," or branchings of the tree. An interesting implication of the results of this analysis with respect to transportation of hazardous wastes by rail is that movements should be avoided when ambient temperatures are extreme (significantly < 20° or > 80°F), but that there should be no a priori bias against shipping wastes in longer train consists. 2 refs., 2 figs., 12 tabs.

Saricks, C.L. (Argonne National Lab., IL (USA). Energy Systems Div.); Janssen, I. (Argonne National Lab., IL (USA). Biological and Medical Research Div.)

1991-01-01
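
The abstract above names classification and regression trees (Breiman et al., 1984) as the tool used to find which variables drive high-damage-cost accidents. A minimal, hypothetical sketch of that style of analysis with scikit-learn is shown below; the synthetic data and feature names are placeholders, not the FRA variables.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-ins for FRA-style records: ambient temperature (F),
# train speed (mph), and consist length (cars).
temp = rng.uniform(-10, 100, n)
speed = rng.uniform(5, 60, n)
cars = rng.integers(20, 150, n)

# Toy damage-cost model: extreme temperatures and higher speeds drive cost up.
cost = (5000
        + 400 * np.maximum(20 - temp, 0)
        + 300 * np.maximum(temp - 80, 0)
        + 150 * speed
        + rng.normal(0, 2000, n))

X = np.column_stack([temp, speed, cars])
tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=50).fit(X, cost)

# The top of the fitted tree plays the role of the "initial splits" in the
# abstract: the variables most strongly associated with damage cost.
print(export_text(tree, feature_names=["temp_F", "speed_mph", "consist_cars"]))
```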

133

GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES.  

SciTech Connect

The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i.e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. [13], which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.

Kamm, J. R. (James R.); Rider, William; Rightley, P. M. (Paul M.); Prestridge, K. P. (Katherine P.); Benjamin, R. F. (Robert F.); Vorobieff, P. V. (Peter V.)

2001-01-01
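
Of the four spectral measures listed in the abstract above, the Fourier power spectrum is the simplest to illustrate. The generic sketch below (not the authors' code) estimates the spectrum of a noisy, disordered one-dimensional signal, the kind of statistical comparison suggested when pointwise comparison of simulation and experiment is not meaningful.

```python
import numpy as np

# Synthetic "disordered" signal standing in for a mixing-layer profile.
rng = np.random.default_rng(1)
n = 1024
x = np.linspace(0.0, 1.0, n, endpoint=False)
signal = (np.sin(2 * np.pi * 8 * x)
          + 0.5 * np.sin(2 * np.pi * 21 * x)
          + 0.8 * rng.standard_normal(n))

# One-sided power spectrum via the FFT.
spec = np.fft.rfft(signal - signal.mean())
freqs = np.fft.rfftfreq(n, d=x[1] - x[0])
power = np.abs(spec) ** 2 / n

# Report dominant wavenumbers; statistics like these, rather than pointwise
# values, are what one would compare between experiment and simulation.
for k in np.argsort(power)[-3:][::-1]:
    print(f"frequency {freqs[k]:6.1f}   power {power[k]:10.2f}")
```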

134

GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES  

SciTech Connect

The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i.e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. (13), which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.

J. R. KAMM; ET AL

2001-01-01

135

Offsite Radiological Consequence Analysis for the Bounding Flammable Gas Accident  

SciTech Connect

This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank. The calculation applies reasonably conservative input parameters in accordance with DOE-STD-3009, Appendix A, guidance. Revision 1 incorporates comments received from the Office of River Protection.

CARRO, C.A.

2003-07-30

136

Traffic Accident Analysis Using Decision Trees and Neural Networks  

Microsoft Academic Search

The costs of fatalities and injuries due to traffic accidents have a great impact on society. This paper presents our research to model the severity of injury resulting from traffic accidents using artificial neural networks and decision trees. We have applied them to an actual data set obtained from the National Automotive Sampling System (NASS) General Estimates System (GES). Experiment

Miao M. Chong; Ajith Abraham; Marcin Paprzycki

2004-01-01
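
Injury-severity modeling of the kind described above is typically posed as supervised classification. Below is a small, hypothetical scikit-learn sketch in that spirit, with synthetic features rather than the NASS/GES variables, fitting both a decision tree and a small neural network so the two model families named in the title can be compared.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(2)
n = 3000

# Synthetic stand-ins for crash features: impact speed, belt use, airbag, occupant age.
speed = rng.uniform(5, 90, n)
belt = rng.integers(0, 2, n)
airbag = rng.integers(0, 2, n)
age = rng.uniform(16, 90, n)

# Toy severity rule: faster, unbelted, older occupants tend toward severe injury.
score = 0.05 * speed - 1.2 * belt - 0.6 * airbag + 0.02 * age
severe = (score + rng.normal(0, 1, n) > 2.0).astype(int)

X = np.column_stack([speed, belt, airbag, age])
X_tr, X_te, y_tr, y_te = train_test_split(X, severe, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(max_depth=4).fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

print("decision tree accuracy :", accuracy_score(y_te, tree.predict(X_te)))
print("neural network accuracy:", accuracy_score(y_te, mlp.predict(X_te)))
```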

137

RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS  

EPA Science Inventory

The objective of these studies was to optimize conditions for computer assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...

138

Radiochemical Analysis Methodology for uranium Depletion Measurements  

SciTech Connect

This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

Scatena-Wachel DE

2007-01-09

139

Improving Network Reliability: Analysis, Methodology, and Algorithms  

E-print Network

multicast network along with a technique that enables wireless clients to efficiently recover lost data sent by their source through collaborative information exchange. A network's reliability during a natural disaster can be assessed...

Booker, Graham B.

2010-07-14

140

Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company  

PubMed Central

Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents even if cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio

2014-01-01

141

Risks due to beyond design base accidents of nuclear power plants in Europe—the methodology of riskmap  

Microsoft Academic Search

International treaties on liability in the case of nuclear accidents set a limit on the repair payments to be made by the operators of nuclear power plants to countries adversely affected by nuclear fall-out, a limit that is independent of the actual risk incurred by the individual countries. A map of the risk due to beyond design base accidents of nuclear power

Iouli Andreev; Markus Hittenberger; Peter Hofer; Helga Kromp-Kolb; Wolfgang Kromp; Petra Seibert; Gerhard Wotawa

1998-01-01

142

Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives  

SciTech Connect

This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

Kohout, E.F.; Folga, S.; Mueller, C.; Nabelssi, B.

1996-03-01

143

METHODOLOGICAL NOTES: Multifractal analysis of complex signals  

NASA Astrophysics Data System (ADS)

This paper presents the foundations of the continuous wavelet-transform-based multifractal analysis theory and the information necessary for its practical application. It explains generalizations of a multifractal concept to irregular functions, better known as the method of wavelet transform modulus maxima; it investigates the benefits and limitations of this technique in the analysis of complex signals; and it discusses the efficiency of the multifractal formalism in the investigation of nonstationary processes and short signals. The paper also considers the effects of the loss of multifractality in the dynamics of various systems.

Pavlov, Aleksei N.; Anishchenko, Vadim S.

2007-08-01

144

Audience analysis solution using soft computing methodologies  

Microsoft Academic Search

In this paper, we propose a pedestrian analysis solution helpful for adaptive content delivery and interest measurement for outdoor advertisement displays. The proposed system has a built-in camera on the top panel of such displays which captures real-time frames of viewers. The captured frames have been analyzed for detection of faces using the Viola-Jones algorithm. The detected faces have been processed

Rajen Bhatt; Ghulam Mohiuddin Khan; Sujith Kumar; Durga Ganesh Grandhi

2010-01-01
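
The audience-analysis record above detects viewers' faces with the Viola-Jones algorithm. A minimal OpenCV sketch of that detection step is given below; the camera index, single-frame handling, and use of the face count as an audience measure are illustrative assumptions, not the authors' system.

```python
import cv2

# The Haar cascade shipped with OpenCV implements Viola-Jones face detection.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)       # display-mounted camera assumed at index 0
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Each detection is (x, y, w, h); the count is a crude measure of viewers.
    print(f"viewers detected in this frame: {len(faces)}")
else:
    print("no frame captured")
```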

145

A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)  

E-print Network

Accident reports provide important insights into the causal factors behind major maritime accidents. This paper presents an analysis that extends across the findings presented over ten years of investigations into maritime accidents in the USA and Canada (1996-2006).

Williamson, John

146

Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis  

NASA Technical Reports Server (NTRS)

NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer s design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most-likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the VTP from the aircraft. Additionally, analysis results indicated that failure initiates at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads that were at minimum 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appeared to perform in a manner consistent with its design and certification, and failure is attributed to VTP loads greater than expected.

Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

2005-01-01

147

Action Plan for updated Chapter 15 Accident Analysis in the SRS Production Reactor SAR  

SciTech Connect

This report describes the Action Plan for the upgrade of the Chapter 15 Accident Analysis in the SRS Production Reactor SAR required for K-Restart. This Action Plan will be updated periodically to reflect task accomplishments and issue resolutions.

Hightower, N.T. III; Burnett, T.W.

1989-11-15

148

A methodology for probabilistic fault displacement hazard analysis (PFDHA)  

USGS Publications Warehouse

We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.

Youngs, R. R.; Arabasz, W. J.; Anderson, R. E.; Ramelli, A. R.; Ake, J. P.; Slemmons, D. B.; McCalpin, J. P.; Doser, D. I.; Fridrich, C. J.; Swan, III, F. H.; Rogers, A. M.; Yount, J. C.; Anderson, L. W.; Smith, K. D.; Bruhn, R. L.; Knuepfer, P. L. K.; Smith, R. B.; DePolo, C. M.; O'Leary, D. W.; Coppersmith, K. J.; Pezzopane, S. K.; Schwartz, D. P.; Whitney, J. W.; Olig, S. S.; Toro, G. R.

2003-01-01
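
The first PFDHA approach described above mirrors a standard PSHA hazard integral, with a fault displacement attenuation function in place of a ground-motion one. The sketch below evaluates that kind of hazard sum numerically for a single hypothetical fault; the recurrence rate, magnitude distribution, and lognormal displacement model are invented placeholders, not the distributions developed in the paper.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical fault: annual rate of surface-rupturing earthquakes and a
# truncated exponential (Gutenberg-Richter) magnitude density on [6.0, 7.5].
lam = 1.0e-3
b = 1.0
m_min, m_max = 6.0, 7.5
m = np.linspace(m_min, m_max, 200)
dm = m[1] - m[0]
beta = b * np.log(10.0)
f_m = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

# Placeholder "displacement attenuation": lognormal displacement whose median
# grows with magnitude (coefficients invented for illustration).
ln_median_d = -4.0 + 0.7 * m     # ln of median displacement, meters
sigma_ln_d = 0.8

def annual_rate_exceeding(d):
    """Annual frequency that fault displacement at the site exceeds d meters."""
    p_exceed = 1.0 - norm.cdf((np.log(d) - ln_median_d) / sigma_ln_d)
    return lam * float(np.sum(p_exceed * f_m) * dm)

for d in (0.01, 0.1, 0.5, 1.0):
    print(f"d = {d:4.2f} m   annual exceedance rate = {annual_rate_exceeding(d):.3e}")
```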

149

Astrolabe: A Collaborative Multiperspective Goal-Oriented Risk Analysis Methodology  

Microsoft Academic Search

The intention of this paper is to introduce a risk analysis methodology called Astrolabe. Astrolabe is based on causal analysis of system risks. It allows the analysts both to align the current standpoint of the system with its intentions and to identify any vulnerabilities or hazards that threaten the system's stability. Astrolabe adopts concepts from organizational theory and software requirements engineering.

Ebrahim Bagheri; Ali A. Ghorbani

2009-01-01

150

A space-time processing and spectral analysis methodology  

E-print Network

Chapter 2, "A space-time processing and spectral analysis methodology" (overview): the analysis is achieved by studying the space-time properties of tracer fields on geographically localised grids. This chapter introduces the space-time processing and spectral analysis techniques used extensively in later chapters.

Finlay, Christopher

151

On the application of syntactic methodologies in automatic text analysis  

Microsoft Academic Search

This study summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Included are standard syntactic methods to generate complex content identifiers, and the use of semantic know-how obtained from machine-readable dictionaries and from specially constructed knowledge bases. A particular syntactic analysis methodology is also outlined and its usefulness for the automatic construction of book indexes is examined.

Gerard Salton; Maria Smith

1989-01-01

152

Human factors data traceability and analysis in the European Community's Major Accident Reporting System  

Microsoft Academic Search

This paper reports on human factors data traceability and analysis of the European Community's Major Accident Reporting System (MARS). This is the main EU instrument for major accident data collection, analysis and dissemination for the process industry according to the provisions of the Seveso II Directive. To date, the MARS database counts approximately 700 Seveso-type major events (November 2008). The MARS

D. Baranzini; M. D. Christou

2010-01-01

153

Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence  

NASA Technical Reports Server (NTRS)

Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

2004-01-01

154

NMR methodologies in the analysis of blueberries.  

PubMed

An NMR analytical protocol based on complementary high and low field measurements is proposed for blueberry characterization. Untargeted NMR metabolite profiling of blueberries aqueous and organic extracts as well as targeted NMR analysis focused on anthocyanins and other phenols are reported. Bligh-Dyer and microwave-assisted extractions were carried out and compared showing a better recovery of lipidic fraction in the case of microwave procedure. Water-soluble metabolites belonging to different classes such as sugars, amino acids, organic acids, and phenolic compounds, as well as metabolites soluble in organic solvent such as triglycerides, sterols, and fatty acids, were identified. Five anthocyanins (malvidin-3-glucoside, malvidin-3-galactoside, delphinidin-3-glucoside, delphinidin-3-galactoside, and petunidin-3-glucoside) and 3-O-?-l-rhamnopyranosyl quercetin were identified in solid phase extract. The water status of fresh and withered blueberries was monitored by portable NMR and fast-field cycling NMR. ¹H depth profiles, T2 transverse relaxation times and dispersion profiles were found to be sensitive to the withering. PMID:24668393

Capitani, Donatella; Sobolev, Anatoly P; Delfini, Maurizio; Vista, Silvia; Antiochia, Riccarda; Proietti, Noemi; Bubici, Salvatore; Ferrante, Gianni; Carradori, Simone; De Salvador, Flavio Roberto; Mannina, Luisa

2014-06-01

155

ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS  

SciTech Connect

This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, ''K Basins Safety Analysis Report'' and Revision 4 of HNF-SD-SNF-TSR-001, ''Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins''. These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

WILLIAMS, J.C.

2003-11-15

156

CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL  

SciTech Connect

Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and at cask-loading-specific conditions could be performed to demonstrate that release is within the allowable leak rates of the cask.

Vinson, D.

2010-07-11
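
The containment methodology referenced above (ANSI N14.5, as adopted in NUREG/CR-6487) compares a cask's leakage rate against allowable release rates tied to the A2 value of the contents. The arithmetic sketch below shows the general shape of such a check under stated assumptions: the 10 CFR 71.51 release limits of A2 x 1e-6 per hour (normal transport) and A2 per week (accident) are used, and the A2 value, releasable activity, and cask free volume are invented example numbers, not Al-SNF data.

```python
# Hedged illustration of an ANSI N14.5-style allowable leak rate calculation.
# All numerical inputs are invented examples, not Al-SNF or cask-specific values.

A2 = 0.3                        # A2 value of the limiting nuclide mix, TBq
releasable_activity = 2.0e-3    # activity available for release, TBq
free_volume_cm3 = 2.0e5         # cask cavity free volume, cm^3

# Regulatory release-rate limits (10 CFR 71.51 as reflected in ANSI N14.5).
R_normal = A2 * 1.0e-6 / 3600.0        # TBq/s, A2 x 1e-6 per hour
R_accident = A2 / (7 * 24 * 3600.0)    # TBq/s, A2 per week

# Concentration of releasable activity in the cavity gas.
C = releasable_activity / free_volume_cm3      # TBq/cm^3

# Allowable volumetric leakage rates, L = R / C, in cm^3/s of cavity gas.
print(f"allowable leak rate, normal conditions  : {R_normal / C:.3e} cm^3/s")
print(f"allowable leak rate, accident conditions: {R_accident / C:.3e} cm^3/s")
```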

157

Advanced Power Plant Development and Analysis Methodologies  

SciTech Connect

Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

2006-06-30

158

Extension of ship accident analysis to multiple-package shipments  

Microsoft Academic Search

Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO

G. S. Mills; K. S. Neuhauser

1997-01-01

159

Accidents Analysis of Rail Transportation Industry in Iran  

Microsoft Academic Search

The main idea of this research is to understand and evaluate the working conditions of rolling stock workers of the Islamic Republic of Iran Railways in order to reduce accidents and expenses and to increase the productivity of the Iranian railway industry. Problems related to productivity, job satisfaction and work safety have a direct relation with work design.

Iraj Koohi

160

Analysis of electrical accidents in UK domestic properties  

Microsoft Academic Search

Electricity is one of the most convenient forms of energy that is used in every building today. However, it causes a number of fatalities every year and has the potential to cause harm to anyone exposed to it. This article investigates the causes and effects of electrical accidents in domestic properties over a 3-year period (2000-2002) in the UK based

M. Barrett; K. OConnell; Cma Sung; G. Stokes

2010-01-01

161

INVITED EDITORIAL: Uncertainties in probabilistic nuclear accident consequence analysis  

Microsoft Academic Search

For all nuclear installations there is a small probability of an accident occurring which could lead to a release of radionuclides into the environment, despite the design intent to build the nuclear plant in such a way as to reduce that possibility to a low level. It is therefore important

M. P. Little

1998-01-01

162

Weighted Correlation Network Analysis: Recent Methodological Advances and Applications  

Microsoft Academic Search

I will review recent methodological advances, data analysis strategies, and applications by my collaborators regarding the use of weighted correlation network analysis (WGCNA), also known as Weighted Gene Co-expression Network Analysis. WGCNA is a network-based, systems biologic data reduction method for high dimensional data. It has mainly been applied to gene expression data but alternative applications (e.g. to brain imaging data, proteomics data) are

Steve Horvath
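
The core step of WGCNA noted above is turning a correlation matrix into a weighted network adjacency by soft thresholding, a_ij = |cor(x_i, x_j)|^beta. The short sketch below computes that adjacency and a simple connectivity measure on synthetic data; the soft-threshold power of 6 is a commonly used default assumed here, not a value taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_genes = 50, 200

# Synthetic expression matrix with one correlated block of 40 "genes".
base = rng.standard_normal((n_samples, 1))
module = base + 0.5 * rng.standard_normal((n_samples, 40))
rest = rng.standard_normal((n_samples, n_genes - 40))
expr = np.hstack([module, rest])

# Weighted adjacency by soft thresholding the gene-gene correlation matrix.
beta = 6
corr = np.corrcoef(expr, rowvar=False)
adjacency = np.abs(corr) ** beta
np.fill_diagonal(adjacency, 0.0)

# Connectivity (row sums of the adjacency) highlights the correlated module.
connectivity = adjacency.sum(axis=0)
print("mean connectivity inside the module :", round(connectivity[:40].mean(), 3))
print("mean connectivity outside the module:", round(connectivity[40:].mean(), 3))
```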

163

Analysis of traffic accidents on rural highways using Latent Class Clustering and Bayesian Networks.  

PubMed

One of the principal objectives of traffic accident analyses is to identify key factors that affect the severity of an accident. However, with the presence of heterogeneity in the raw data used, the analysis of traffic accidents becomes difficult. In this paper, Latent Class Cluster (LCC) is used as a preliminary tool for segmentation of 3229 accidents on rural highways in Granada (Spain) between 2005 and 2008. Next, Bayesian Networks (BNs) are used to identify the main factors involved in accident severity for both the entire database (EDB) and the clusters previously obtained by LCC. The results of these cluster-based analyses are compared with the results of a full-data analysis. The results show that the combined use of both techniques is very interesting, as it reveals further information that would not have been obtained without prior segmentation of the data. BN inference is used to obtain the variables that best identify accidents involving killed or seriously injured people. Accident type and sight distance have been identified in all the cases analysed; other variables such as time, occupants involved or age are identified in the EDB and in only one cluster; whereas the variables vehicles involved, number of injuries, atmospheric factors, pavement markings and pavement width are identified in only one cluster. PMID:23182777

de Oña, Juan; López, Griselda; Mujalli, Randa; Calvo, Francisco J

2013-03-01
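
The two-stage idea above (segment the accidents first, then analyze each segment) can be illustrated without the full latent class and Bayesian network machinery. In the hedged sketch below, a Gaussian mixture model stands in for latent class clustering on numerically coded features, and a simple within-cluster severity rate stands in for the BN inference step; the variables and data are invented, not the Granada records.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
n = 3000

# Numerically coded stand-ins for accident attributes (invented):
# accident type, sight-distance class, night-time indicator.
acc_type = rng.integers(0, 4, n)
sight = rng.integers(0, 3, n)
night = rng.integers(0, 2, n)
severe = ((acc_type == 0) & (sight == 0) & (rng.random(n) < 0.4)).astype(int)

X = np.column_stack([acc_type, sight, night]).astype(float)

# Segment the records first (mixture model as a rough stand-in for LCC)...
labels = GaussianMixture(n_components=3, random_state=0).fit_predict(X)

# ...then examine severity within each segment, mirroring the idea that
# cluster-wise analysis reveals structure hidden in the full database.
print("overall severe-injury rate:", round(severe.mean(), 3))
for k in range(3):
    mask = labels == k
    print(f"cluster {k}: n={mask.sum():5d}  severe rate={severe[mask].mean():.3f}")
```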

164

The Influence of Seasonal Characteristics on the Accident Consequences Analysis  

SciTech Connect

In order to examine the influence of seasonal characteristics on accident consequences, we defined a limited number of basic spectra based on the relative importance of source term release parameters and meteorological conditions on offsite health effects and economic impacts. We then investigated the variation in numbers and frequency of early health effects and economic impacts resulting from the severe accidents of the YGN 3 and 4 nuclear power plants from spectrum to spectrum by using the MACCS code. These investigations were for meteorological conditions defined as typical on an annual basis. Also, we investigated the variation in numbers and frequency of early health effects and economic impacts for the same standard spectra for meteorological conditions defined as typical on a seasonal basis, recognizing that there are four seasons with distinct meteorological characteristics. Results show that there are large differences in consequences from spectrum to spectrum, although an equal amount and mix of radioactive material is released to the atmosphere in each case. Therefore, release parameters and meteorological data have to be well characterized in order to estimate the consequences of an accident accurately. Also, there are large differences in the estimated number of health effects and economic impacts from season to season due to distinct seasonal variations in meteorological conditions in Korea. In fall, the early fatalities and early fatality risk show minimum values due to enhanced dispersion arising from increased atmospheric instability, and the early fatalities show maximum values in summer due to a large rainfall rate. In contrast, the economic cost shows maximum values in fall and minimum in summer due to differences in atmospheric dispersion and rainfall rate. Therefore, it is necessary to consider seasonal characteristics in developing emergency response strategies for reducing offsite early health risks in the event of a severe accident. (authors)

Jongtae Jeong; Wondea Jung [Korea Atomic Energy Research Institute, 150, Dukjin-Dong, Yusong-Gu, Taejon 305-353 (Korea, Republic of)

2002-07-01

165

A Global Sensitivity Analysis Methodology for Multi-physics Applications  

SciTech Connect

Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.

Tong, C H; Graziani, F R

2007-02-02
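
Step 2 of the methodology above is a parameter screening study. One common screening tool is the Morris elementary-effects method; the self-contained sketch below implements a simplified version for a toy three-parameter model, as an illustration of the screening idea rather than of PSUADE itself.

```python
import numpy as np

def model(x):
    """Toy simulation response with one dominant input and one interaction."""
    return 3.0 * x[0] + 0.2 * x[1] + 2.0 * x[0] * x[2]

rng = np.random.default_rng(5)
n_params, n_repeats, delta = 3, 50, 0.1

effects = [[] for _ in range(n_params)]
for _ in range(n_repeats):
    base = rng.random(n_params)              # random base point in the unit cube
    y0 = model(base)
    for i in range(n_params):
        step = base.copy()
        step[i] = min(step[i] + delta, 1.0)  # one-at-a-time perturbation
        effects[i].append((model(step) - y0) / (step[i] - base[i]))

# mu* (mean absolute effect) ranks importance; sigma flags nonlinearity/interaction.
for i in range(n_params):
    ee = np.array(effects[i])
    print(f"x{i}: mu* = {np.abs(ee).mean():6.3f}   sigma = {ee.std():6.3f}")
```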

166

Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion  

SciTech Connect

Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

2012-09-30

167

Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology  

ERIC Educational Resources Information Center

The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

2004-01-01

168

A User Friendly Phase Detection Methodology for HPC Systems' Analysis  

E-print Network

A wide array of today's high performance computing (HPC) systems ... a methodology for detecting phases in the behaviour of an HPC system and determining execution points that correspond

Paris-Sud XI, Université de

169

A methodological analysis of canopy disturbance reconstructions using Quercus alba  

E-print Network

This study analyzed paired tree-ring series from 884 Quercus alba L. individuals to quantify discrepant patterns

Hart, Justin

170

Analysis of general-aviation accidents using ATC radar records  

NASA Technical Reports Server (NTRS)

It is pointed out that general aviation aircraft usually do not carry flight recorders, and in accident investigations the only available data may come from the Air Traffic Control (ATC) records. A description is presented of a technique for deriving time-histories of aircraft motions from ATC radar records. The employed procedure involves a smoothing of the raw radar data. The smoothed results, in combination with other available information (meteorological data and aircraft aerodynamic data) are used to derive the expanded set of motion time-histories. Applications of the considered analytical methods are related to different types of aircraft, such as light piston-props, executive jets, and commuter turboprops, as well as different accident situations, such as takeoff, climb-out, icing, and deep stall.

Wingrove, R. C.; Bach, R. E., Jr.

1982-01-01

171

A methodology to assess possible effects of enhanced surveillance on the risk estimate from ecologic studies of thyroid cancer after the Chernobyl accident  

Microsoft Academic Search

For two reasons, quantitative post-Chernobyl assessment of the thyroid cancer risk from 131I exposure in children under age 18 at the time of the accident has been carried out with spatially aggregated data. Firstly, individual data have not been available up to now. Secondly, a large number of individuals can be included in the analysis for a better statistical power.

J. Christian Kaiser; Peter Jacob; Sergey Vavilov

172

Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model  

Microsoft Academic Search

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work

J. C. Helton; J. D. Johnson; J. A. Rollstin; A. W. Shiver; J. L. Sprung

1995-01-01
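
The MACCS studies above combine Latin hypercube sampling with partial correlation and stepwise regression to find which uncertain inputs drive the output. A compact illustration of that workflow on an invented three-parameter model, not MACCS itself, is sketched below using SciPy's Latin hypercube sampler and rank correlations as a simple importance screen.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# Latin hypercube sample of three uncertain inputs over invented ranges.
sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=200)
lower, upper = np.array([0.1, 1.0, 0.0]), np.array([2.0, 5.0, 10.0])
X = qmc.scale(unit, lower, upper)

# Invented consequence model standing in for the code being studied.
rng = np.random.default_rng(6)
y = 4.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2] + rng.normal(0, 0.5, 200)

# Rank (Spearman) correlations as a simple screen for influential inputs; the
# reports refine the same idea with partial rank correlation and stepwise regression.
for i, name in enumerate(["x1", "x2", "x3"]):
    rho, p = spearmanr(X[:, i], y)
    print(f"{name}: rank correlation with output = {rho:+.3f} (p = {p:.1e})")
```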

173

Uncertainty and sensitivity analysis of early exposure results with the MACCS reactor accident consequence model  

Microsoft Academic Search

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review

J. C. Helton; J. D. Johnson; A. W. Shiver; J. L. Sprung

1995-01-01

174

Uncertainty and sensitivity analysis of food pathway results with the MACCS reactor accident consequence model  

Microsoft Academic Search

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work

J. C. Helton; J. D. Johnson; J. A. Rollstin; A. W. Shiver; J. L. Sprung

1995-01-01

175

Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model  

Microsoft Academic Search

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review

J. C. Helton; J. D. Johnson; M. D. McKay; A. W. Shiver; J. L. Sprung

1995-01-01

176

Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model  

Microsoft Academic Search

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review

J. C. Helton; J. D. Johnson; J. A. Rollstin; A. W. Shiver; J. L. Sprung

1995-01-01

177

Hoof kick injuries in unmounted equestrians. Improving accident analysis and prevention by introducing an accident and emergency based relational database  

PubMed Central

Methods: Data analysis using a new kind of full electronic medical record. Results: Seventeen kicked equestrians were unmounted at the time of injury. Eight of seventeen patients sustained contusions of the extremities, the back, and the trunk. In nine patients an isolated facial injury was diagnosed. Five of nine patients needed referrals to the department of plastic surgery because of the complexity of the facial soft tissue wounds. Three underwent maxillofacial surgery. Conclusion: Clinical: the equestrian community may underestimate the risk of severe injuries attributable to hoof kicks, especially while handling the horse. Educational lectures and the distribution of educational literature should be promoted. The introduction of additional face shields may be protective. Software-related issue: the handling of an increasing amount of medical data makes a development in computerisation of emergency units necessary. Thus the increasing utilisation of new computer technology could have a significant influence on accident analysis and prevention and the quality of research in the future. PMID:12421795

Exadaktylos, A; Eggli, S; Inden, P; Zimmermann, H

2002-01-01
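
The study above motivates an accident-and-emergency relational database for injury surveillance. A minimal, hypothetical schema in SQLite (via Python's standard library) is sketched below to show the kind of structure implied; the table and column names are invented for illustration, not those of the system described.

```python
import sqlite3

# In-memory example of a minimal A&E accident database (hypothetical schema).
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE patient (
    patient_id INTEGER PRIMARY KEY,
    age        INTEGER,
    sex        TEXT
);
CREATE TABLE accident (
    accident_id INTEGER PRIMARY KEY,
    patient_id  INTEGER REFERENCES patient(patient_id),
    mechanism   TEXT,   -- e.g. 'hoof kick, unmounted'
    body_region TEXT,   -- e.g. 'facial', 'extremity'
    referred_to TEXT    -- e.g. 'plastic surgery'
);
""")

cur.execute("INSERT INTO patient VALUES (1, 34, 'F')")
cur.execute("INSERT INTO accident VALUES (1, 1, 'hoof kick, unmounted', 'facial', 'plastic surgery')")
con.commit()

# Queries of this kind support the accident analysis and prevention work
# described above (e.g. how often facial injuries need specialist referral).
for row in cur.execute("SELECT body_region, COUNT(*) FROM accident GROUP BY body_region"):
    print(row)
con.close()
```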

178

THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT  

SciTech Connect

Surplus plutonium bearing materials in the U.S. Department of Energy (DOE) complex are stored in the 3013 containers that are designed to meet the requirements of the DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting the Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for long terms of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.

Gupta, N.

2011-02-14

179

Analysis of containment performance and radiological consequences under severe accident conditions for the Advanced Neutron Source Reactor at the Oak Ridge National Laboratory  

SciTech Connect

A severe accident study was conducted to evaluate conservatively scoped source terms and radiological consequences to support the Advanced Neutron Source (ANS) Conceptual Safety Analysis Report (CSAR). Three different types of severe accident scenarios were postulated with a view to evaluating conservatively scoped source terms. The first scenario evaluates maximum possible steaming loads and associated radionuclide transport, whereas the next scenario is geared towards evaluating conservative containment loads from releases of radionuclide vapors and aerosols with associated generation of combustible gases. The third scenario follows the prescriptions given by the 10 CFR 100 guidelines. It was included in the CSAR for demonstrating site-suitability characteristics of the ANS. Various containment configurations are considered for the study of thermal-hydraulic and radiological behaviors of the ANS containment. Severe accident mitigative design features such as the use of rupture disks were accounted for. This report describes the postulated severe accident scenarios, the methodology for analysis, modeling assumptions, modeling of several severe accident phenomena, and evaluation of the resulting source term and radiological consequences.

Kim, S.H.; Taleyarkhan, R.P.

1994-01-01

180

Integrated Analysis of Mechanical and Thermal Hydraulic Behavior of Graphite Stack in Channel-Type Reactors in Case of a Fuel Channel Rupture Accident  

SciTech Connect

The paper discusses the methodology and a computational exercise analyzing the processes taking place in the graphite stack of an RBMK reactor in case of a pressure tube rupture caused by overheating. The methodology of the computational analysis is implemented in the integrated code U_STACK, which models thermal-hydraulic and mechanical processes in the stack with a varying geometry, coupled with the processes going on in the circulation loop and accident localization (confinement) system. Coolant parameters, cladding and pressure tube temperatures, pressure tube ballooning and rupture, coolant outflow are calculated for a given accident scenario. Fluid parameters, movement of graphite blocks and adjacent pressure tubes bending after the tube rupture are calculated for the whole volume of the core. Calculations also cover additional loads on adjacent fuel channels in the rupture zone, reactor shell, upper and lower plates. Impossibility of an induced pressure tube rupture is confirmed. (authors)

Soloviev, Sergei L. [MINATOM, Moscow (Russian Federation); Gabaraev, Boris A.; Novoselsky, Oleg Yu.; Filinov, Vladimir N. [Research and Development Institute of Power Engineering, M. Krasnoselskaya ul., build. 2/8, 107140 Moscow (Russian Federation); Parafilo, Leonid M.; Kruchkov, Dmitry V. [Institute of Physics and Power Engineering, 1 Bondarenko sq., RU-249020 Obninsk Kaluga Region (Russian Federation); Melikhov, Oleg I. [Electrogorsk Research and Engineering Center, Saint Constantine st., 6, Electrogorsk, Moscow Region, 142530 (Russian Federation)

2002-07-01

181

Methodological Variability Using Electronic Nose Technology For Headspace Analysis  

SciTech Connect

Since the idea of electronic noses was first published, numerous electronic nose (e-nose) developments and applications have been reported for analyzing solid, liquid and gaseous samples in the food and automotive industries or for medical purposes. However, little is known about the methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, the use of different filters, and changes in mass flow rates is described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

Knobloch, Henri; Turner, Claire; Spooner, Andrew [Cranfield University, Cranfield Health, Silsoe (United Kingdom); Chambers, Mark [Veterinary Laboratories Agency (VLA Weybridge) (United Kingdom)

2009-05-23

182

Time-independent neutronic analysis of the Chernobyl accident  

SciTech Connect

Estimates are made of the positive reactivity introduced through the growth of the coolant void fraction in the Chernobyl reactor at both the average burnup value given by the Soviets and the maximum value. Using Monte Carlo models, various possible axial burnup distributions, displacer models, conditions in the control channels, and control rod positions are considered in calculating the insertion of positive reactivity by the manual and emergency control rods, that is, the positive scram. Two possible scenarios are examined for a second reactivity peak: (a) creation of a mixture of fuel, water, and cladding in a number of central fuel channels, resulting in the explosion of these channels, and (b) uniform vaporization throughout the entire reactor, resulting in reactor depressurization. From the data presented in this paper, it can be concluded that vaporization of the cooling water in the fuel channel gave the highest reactivity contribution to the Chernobyl accident. The positive reactivity due to insertion of the manual and emergency control rods played only a minor role in the reactivity balance of the accident.

Landeyro, P.A. (ENEA, CRE Casaccia, Via Anguillarese No. 301, 00100 Rome (IT)); Buccafurni, A. (ENEA, Nuclear Safety and Health Protection, Via Brancati 48, 00144 Rome (IT))

1991-06-01

183

Injury patterns of seniors in traffic accidents: A technical and medical analysis  

PubMed Central

AIM: To investigate the actual injury situation of seniors in traffic accidents and to evaluate the different injury patterns. METHODS: Injury data, environmental circumstances and crash circumstances were collected shortly after the accident event at the scene. With these data, a technical and medical analysis was performed, including the Injury Severity Score, the Abbreviated Injury Scale and the Maximum Abbreviated Injury Scale. Data were collected within the German In-Depth Accident Study, which can be regarded as representative. RESULTS: A total of 4430 seniors injured in traffic accidents were evaluated. The incidence of sustaining severe injuries to the extremities, head and maxillofacial region was significantly higher in elderly people than in younger age groups (P < 0.05). The number of accident-related injuries was also higher in the group of seniors than in other groups. CONCLUSION: Seniors are more likely to be involved in traffic accidents and to sustain serious to severe injuries than other groups. PMID:23173111

Brand, Stephan; Otte, Dietmar; Mueller, Christian Walter; Petri, Maximilian; Haas, Philipp; Stuebig, Timo; Krettek, Christian; Haasper, Carl

2012-01-01

184

[Comparative analysis of the radionuclide composition in fallout after the Chernobyl and the Fukushima accidents].  

PubMed

The nuclear accident that occurred at the Fukushima Dai-ichi Nuclear Power Plant (NPP) on March 11, 2011, like the accident at the Chernobyl NPP on April 26, 1986, is rated at level 7 of the INES. It is therefore of interest to analyze the radionuclide composition of the fallout following both accidents. The results of spectrometric measurements were used in this comparative analysis. Two areas were considered for the Chernobyl accident: (1) the near zone of the fallout, the Belarusian part of the central spot extending up to 60 km around the Chernobyl NPP, and (2) the far zone of the fallout, the "Gomel-Mogilev" spot centered 200 km to the north-northeast of the damaged reactor. In the case of the Fukushima accident, the near zone up to about 60 km was considered. The comparative analysis was done with respect to the refractory radionuclides (95Zr, 95Nb, 141Ce, 144Ce), as well as the intermediate and volatile radionuclides 103Ru, 106Ru, 131I, 134Cs, 137Cs, 140La and 140Ba, and the results of the comparison are discussed. With respect to exposure of the public, the most important radionuclides are 131I and 137Cs. For both accidents the 131I/137Cs ratios in the considered soil samples lie in similar ranges: 3-50 for the Chernobyl samples and 5-70 for the Fukushima samples. As for the Chernobyl accident, a clear tendency was identified for the Fukushima accident: the 131I/137Cs ratio in the fallout decreases as the 137Cs ground deposition density increases along the trace of the radioactive cloud. This appears to be a universal tendency for the 131I/137Cs ratio versus the 137Cs ground deposition density in fallout along the trace of a radioactive cloud following a severe NPP accident with releases of radionuclides into the environment. This tendency is important for an objective reconstruction of the 131I fallout from 137Cs measurements of soil samples carried out at late dates after the Fukushima accident. PMID:23210176
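The reconstruction idea mentioned above can be sketched as follows; the ratio-versus-deposition function below is a made-up placeholder, whereas the study derives the actual tendency from measured soil samples.

```python
# A minimal sketch of reconstructing the initial 131I ground deposition at a
# location from a late 137Cs soil measurement, using an assumed decreasing
# 131I/137Cs ratio versus 137Cs deposition density. The ratio function and the
# example deposition value are illustrative placeholders only.
def iodine_to_cesium_ratio(cs137_kbq_m2: float) -> float:
    """Hypothetical decreasing 131I/137Cs ratio with 137Cs deposition density."""
    return max(3.0, 50.0 / (1.0 + cs137_kbq_m2 / 100.0))

def initial_i131_deposition(cs137_kbq_m2: float) -> float:
    """Estimated 131I deposition (at deposition time) from measured 137Cs."""
    return iodine_to_cesium_ratio(cs137_kbq_m2) * cs137_kbq_m2

print(f"{initial_i131_deposition(500.0):.0f} kBq/m2 of 131I (illustrative estimate)")
```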

Kotenko, K V; Shinkarev, S M; Abramov, Iu V; Granovskaia, E O; Iatsenko, V N; Gavrilin, Iu I; Margulis, U Ia; Garetskaia, O S; Imanaka, T; Khoshi, M

2012-01-01

185

Application of Latin hypercube sampling to RADTRAN 4 truck accident risk sensitivity analysis  

SciTech Connect

The sensitivity of calculated dose estimates to various RADTRAN 4 inputs is an available output for incident-free analysis because the defining equations are linear and sensitivity to each variable can be calculated in closed mathematical form. However, the necessary linearity is not characteristic of the equations used in calculation of accident dose risk, making a similar tabulation of sensitivity for RADTRAN 4 accident analysis impossible. Therefore, a study of sensitivity of accident risk results to variation of input parameters was performed using representative routes, isotopic inventories, and packagings. It was determined that, of the approximately two dozen RADTRAN 4 input parameters pertinent to accident analysis, only a subset of five or six has significant influence on typical analyses or is subject to random uncertainties. These five or six variables were selected as candidates for Latin Hypercube Sampling applications. To make the effect of input uncertainties on calculated accident risk more explicit, distributions and limits were determined for two variables which had approximately proportional effects on calculated doses: Pasquill Category probability (PSPROB) and link population density (LPOPD). These distributions and limits were used as input parameters to Sandia's Latin Hypercube Sampling code to generate 50 sets of RADTRAN 4 input parameters used together with point estimates of other necessary inputs to calculate 50 observations of estimated accident dose risk. Tabulations of the RADTRAN 4 accident risk input variables and their influence on output plus illustrative examples of the LHS calculations, for truck transport situations that are typical of past experience, will be presented.
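The sampling step described above can be illustrated with a short sketch; the distribution shapes and ranges assumed for PSPROB and LPOPD below are placeholders, not the distributions determined in the study.

```python
# Minimal Latin hypercube sampling sketch for two uncertain RADTRAN-style inputs.
# Ranges and distribution shapes are illustrative assumptions only.
import numpy as np

def latin_hypercube(n_samples, n_vars, rng):
    """Stratified uniform(0,1) samples: one sample per stratum per variable."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])  # random pairing of strata across variables
    return u

rng = np.random.default_rng(2024)
u = latin_hypercube(50, 2, rng)  # 50 parameter sets, as in the study

# Map the strata onto assumed marginals: uniform Pasquill-category probability,
# log-uniform link population density (both hypothetical).
pasquill_prob = 0.05 + (0.35 - 0.05) * u[:, 0]
pop_density = np.exp(np.log(10.0) + (np.log(1000.0) - np.log(10.0)) * u[:, 1])

for p, d in list(zip(pasquill_prob, pop_density))[:3]:
    print(f"PSPROB = {p:.3f}, LPOPD = {d:7.1f} persons/km^2")
```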

Mills, G.S.; Neuhauser, K.S.; Kanipe, F.L.

1994-12-31

186

Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations  

SciTech Connect

This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.
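Radiological source terms of the kind aggregated here are often built up with the five-factor formula from DOE airborne-release handbooks; assuming that structure for illustration (the abstract does not state the exact formulation used), the estimate has the form:

```latex
% Five-factor airborne source-term structure (DOE-HDBK-3010 style); its use for
% every WM PEIS scenario is assumed here for illustration.
\[
  \mathrm{ST} = \mathrm{MAR} \times \mathrm{DR} \times \mathrm{ARF} \times \mathrm{RF} \times \mathrm{LPF}
\]
% MAR: material at risk; DR: damage ratio; ARF: airborne release fraction;
% RF: respirable fraction; LPF: leak path factor.
```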

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

1996-12-01

187

An Efficient Analysis Methodology for Fluted-Core Composite Structures  

NASA Technical Reports Server (NTRS)

The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes and uses shell elements to capture the detailed mechanical response that would normally require solid elements. In this technique, the shell thicknesses and offsets are parameterized, and the parameters are adjusted through a heuristic procedure until the model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The analysis methodology, though discussed here only in the context of fluted-core composites, is widely applicable to other concepts.
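The heuristic matching procedure can be sketched in miniature as below, where a single effective shell parameter is tuned until a scalar response matches a reference value; both "models" are toy stand-ins for the finite-element analyses, and real applications adjust several thicknesses and offsets at once.

```python
# Toy sketch of heuristic parameter matching: adjust an effective shell
# thickness until the shell-only response matches a detailed reference model.
def detailed_model_stiffness() -> float:
    return 1.25e6          # reference axial stiffness from a shell-and-solid model (placeholder)

def shell_model_stiffness(effective_thickness: float) -> float:
    return 8.0e8 * effective_thickness   # toy shell-only model response

def match_thickness(lo=1e-4, hi=5e-3, tol=1e-9):
    target = detailed_model_stiffness()
    for _ in range(100):                 # simple bisection on the mismatch
        mid = 0.5 * (lo + hi)
        if shell_model_stiffness(mid) < target:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

print(f"matched effective thickness = {match_thickness() * 1e3:.3f} mm")
```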

Oremont, Leonard; Schultz, Marc R.

2012-01-01

188

RELAP5 Application to Accident Analysis of the NIST Research Reactor  

SciTech Connect

Detailed safety analyses have been performed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent behavior of the primary system is determined with a RELAP5 transient analysis model that includes the reactor vessel, the pump, the heat exchanger, the fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results was conducted to evaluate the minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and no damage to the fuel will occur, because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.
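The CHFR post-processing step can be sketched as follows; the critical-heat-flux function is a placeholder stand-in, since the coefficients of the Sudo-Kaminaga correlation used in the study are not reproduced here.

```python
# Minimal sketch of scanning axial heat-flux results for the minimum critical
# heat flux ratio (CHFR). The CHF model below is a hypothetical placeholder;
# substitute the Sudo-Kaminaga correlation for an actual evaluation.
from dataclasses import dataclass

@dataclass
class AxialNode:
    z_m: float              # axial position [m]
    heat_flux_w_m2: float   # local surface heat flux from the transient code [W/m^2]
    mass_flux_kg_m2s: float # channel mass flux [kg/m^2/s]
    subcooling_k: float     # local subcooling [K]

def critical_heat_flux(node: AxialNode) -> float:
    """Placeholder CHF model (hypothetical linear fit), not Sudo-Kaminaga."""
    return 1.0e6 + 800.0 * node.mass_flux_kg_m2s + 5.0e3 * node.subcooling_k

def minimum_chfr(nodes):
    return min(critical_heat_flux(n) / n.heat_flux_w_m2 for n in nodes)

channel = [AxialNode(0.1 * i, 1.2e6 + 2.0e5 * i, 3000.0, 40.0 - 2.0 * i) for i in range(6)]
print(f"minimum CHFR = {minimum_chfr(channel):.2f}")
```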

Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.; Diamond, D.

2012-03-18

189

Accident analysis for transuranic waste management alternatives in the U.S. Department of Energy waste management program  

SciTech Connect

Preliminary accident analyses and radiological source term evaluations have been conducted for transuranic waste (TRUW) as part of the US Department of Energy (DOE) effort to manage storage, treatment, and disposal of radioactive wastes at its various sites. The approach to assessing radiological releases from facility accidents was developed in support of the Office of Environmental Management Programmatic Environmental Impact Statement (EM PEIS). The methodology developed in this work is in accordance with the latest DOE guidelines, which consider the spectrum of possible accident scenarios in the implementation of various actions evaluated in an EIS. The radiological releases from potential risk-dominant accidents in storage and treatment facilities considered in the EM PEIS TRUW alternatives are described in this paper. The results show that significant releases can be predicted for only the most severe and extremely improbable accident sequences.

Nabelssi, B.; Mueller, C.; Roglans-Ribas, J.; Folga, S.; Tompkins, M. [Argonne National Lab., IL (United States); Jackson, R. [Scientific Applications International Corp., Golden, CO (United States)

1995-03-01

190

Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components  

NASA Technical Reports Server (NTRS)

The NASA Aircraft Structural Integrity Program (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack-tip-opening angle (CTOA) fracture criterion to characterize the fracture behavior and a material and geometric nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thicknesses of 0.04 to 0.09 inches. The critical CTOA and the corresponding plane-strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane-stress analysis were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle-crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch-diameter pressurized cylinders, and full-scale curved stiffened panels subjected to internal pressure and mechanical loads.
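For reference, the CTOA criterion used in this methodology can be stated as below; the convention of measuring the opening a fixed small distance behind the crack tip (often about 1 mm) is a common choice and not necessarily the exact convention used in the study.

```latex
% Critical crack-tip-opening-angle (CTOA) criterion: crack growth occurs when the
% opening angle measured a fixed distance d behind the crack tip reaches the
% critical value determined from laboratory specimens.
\[
  \psi = 2\,\arctan\!\left(\frac{\delta_d}{2\,d}\right),
  \qquad \text{crack growth when } \psi \ge \psi_c ,
\]
% \delta_d: crack-opening displacement at distance d behind the tip;
% \psi_c: critical CTOA.
```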

Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.

2000-01-01

191

Towards a Methodology for Identifying Program Constraints During Requirements Analysis  

NASA Technical Reports Server (NTRS)

Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

1997-01-01

192

Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the /open quotes/Maximum Credible Accident/close quotes/ concept  

SciTech Connect

The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactivities of uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.
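The consequence measure used in such an MCA calculation, foreseeable premature cancer deaths, typically follows from the collective dose and a nominal risk coefficient; the coefficient and exposure numbers below are illustrative assumptions, not values from the report.

```python
# Illustrative conversion of a collective dose to expected latent cancer
# fatalities (LCF) using a nominal risk coefficient of about 5e-2 per person-Sv.
# Population size and individual doses are hypothetical placeholders.
RISK_COEFF_PER_PERSON_SV = 5.0e-2     # nominal whole-population coefficient (assumption)

exposed_population = 100_000          # hypothetical downstream water users
mean_individual_dose_sv = 2.0e-5      # hypothetical committed dose per person [Sv]

collective_dose = exposed_population * mean_individual_dose_sv   # person-Sv
expected_lcf = collective_dose * RISK_COEFF_PER_PERSON_SV
print(f"collective dose = {collective_dose:.1f} person-Sv -> expected LCF = {expected_lcf:.2f}")
```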

Ricci, E.; McLean, R.B.

1988-09-01

193

Vapor explosions: A review of experiments for accident analysis  

SciTech Connect

A vapor explosion is a physical event in which a hot liquid (fuel) transfers its internal energy to a colder, more volatile liquid (coolant); the coolant thus vaporizes at high pressures and expands, doing work on its surroundings. In postulated severe accidents in current fission reactors, vapor explosions are considered if this molten "fuel" contacts residual water in-vessel or ex-vessel, because these physical explosions have the potential to contribute to reactor vessel failure and possibly containment failure and release of radioactive fission products. Current safety analyses and probabilistic studies consider this process with the use of explosion models. Eventually these models must be compared with available experimental data to determine their validity. This study provides a comprehensive review of vapor explosion experiments for eventual use in such comparisons. Also, when there are insufficient data, experiments are suggested that can provide the needed information for future comparisons. This review may be useful for light-water-reactor as well as noncommercial reactor safety studies. 115 refs., 6 figs., 3 tabs.

Corradini, M.L.; Taleyarkhan, R.P.

1991-07-01

194

LOCA and Air Ingress Accident Analysis of a Pebble Bed Reactor  

E-print Network

The study found that approximately 6 m/s of air flow would be required. A model of the buoyancy and resistance to flow in the pebble bed was used to predict the peak fuel temperatures and the air ingress. (Thesis by Tieliang Zhai.)

195

Natural Language Processing (NLP) tools for the analysis of incident and accident reports  

E-print Network

In this project, we use NLP methods to facilitate experience feedback in the field of civil aviation safety. In this paper, we present how NLP methods based on the extraction of textual information from the Air France ASRs are applied to the analysis of incident and accident reports.

Paris-Sud XI, Université de

196

High-leverage changes to improve safety culture: A systemic analysis of major organizational accidents  

Microsoft Academic Search

The purpose of this paper is to clarify high-leverage changes indispensable for improving safety culture through organizational learning. Although the concept of safety culture appears to have become increasingly important, there is no established way to improve it. Through systemic analysis and model building of the process of deterioration of safety culture in three recent major organizational accidents, we identified

Shigehisa Tsuchiya; K. Ito; M. Sato

197

Analysis of dental materials as an aid to identification in aircraft accidents  

SciTech Connect

The failure to achieve positive identification of aircrew following an aircraft accident need not prevent a full autopsy and toxicological examination to ascertain possible medical factors involved in the accident. Energy-dispersive electron microprobe analysis provides morphological, qualitative, and accurate quantitative analysis of the composition of dental amalgam. Wet chemical analysis can be used to determine the elemental composition of crowns, bridges and partial dentures. Unfilled resin can be analyzed by infrared spectroscopy. Detailed analysis of filled composite restorative resins has not yet been achieved in the as-set condition to permit discrimination between manufacturers' products. Future work will involve filler studies and pyrolysis of the composite resins by thermogravimetric analysis to determine percentage weight loss when the sample examined is subjected to a controlled heating regime. With these available techniques, corroborative evidence achieved from the scientific study of materials can augment standard forensic dental results to obtain a positive identification.

Wilson, G.S.; Cruickshanks-Boyd, D.W.

1982-04-01

198

Analysis of reactivity-insertion accidents in the TREAT Upgrade reactor  

SciTech Connect

The expansion of the experimental capabilities of the TREAT Upgrade (TU) reactor also tends to increase the potential risks associated with off-normal reactivity insertion incidents compared to the TREAT reactor. To provide adequate protection for the public and the facility, while meeting experimenters' requirements, a specialized Reactor Trip System (RTS) with energy-dependent scram trips on reactor power and period has been developed. With this protection strategy, the consequences of reactivity insertion accidents in the TU reactor have been analyzed using a general methodology developed earlier. Results of these analyses are presented.

Rudolph, R.R.; Bhattacharyya, S.K.

1983-01-01

199

A Posteriori Analysis for Hydrodynamic Simulations Using Adjoint Methodologies  

SciTech Connect

This report contains results of analysis done during an FY08 feasibility study investigating the use of adjoint methodologies for a posteriori error estimation for hydrodynamics simulations. We developed an approach to adjoint analysis for these systems through use of modified equations and viscosity solutions. Targeting first the 1D Burgers equation, we include a verification of the adjoint operator for the modified equation for the Lax-Friedrichs scheme, then derivations of an a posteriori error analysis for a finite difference scheme and a discontinuous Galerkin scheme applied to this problem. We include some numerical results showing the use of the error estimate. Lastly, we develop a computable a posteriori error estimate for the MAC scheme applied to stationary Navier-Stokes.
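As a point of reference, the generic adjoint-weighted-residual structure underlying such a posteriori estimates can be written as below for the 1D viscous Burgers problem; this is an illustrative textbook form, not the report's specific modified-equation or viscosity-solution derivation.

```latex
% Illustrative adjoint-weighted-residual error representation for the 1D viscous
% Burgers equation; not the report's specific derivation.
\[
  u_t + \tfrac{1}{2}\,(u^2)_x = \nu\, u_{xx} \qquad \text{(forward problem)}
\]
\[
  -\phi_t - u\,\phi_x = \nu\, \phi_{xx} + j(x,t) \qquad \text{(adjoint problem)}
\]
\[
  J(u) - J(u_h) \;\approx\; \int_0^T \!\! \int_\Omega R(u_h)\, \phi \, dx\, dt
  \qquad \text{(a posteriori error estimate)}
\]
% J: output functional with density j; u_h: numerical solution;
% R(u_h): residual of u_h in the forward equation; \phi: adjoint solution.
```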

Woodward, C S; Estep, D; Sandelin, J; Wang, H

2009-02-26

200

Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing  

NASA Technical Reports Server (NTRS)

Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.
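A sketch of the combination-counting step described above is shown below; the accident records and factor labels are fabricated placeholders used only to illustrate how worst-case precursor pairs could be tallied.

```python
# Minimal sketch of tallying the most frequent combinations of loss-of-control
# precursors across a set of accident records. Records and labels are
# fabricated placeholders, not data from the study.
from collections import Counter
from itertools import combinations

accidents = [
    {"system failure", "crew action", "vehicle upset"},
    {"inclement weather", "crew action", "vehicle upset"},
    {"system failure", "vehicle impairment", "vehicle upset"},
    {"crew action", "vehicle upset"},
]

pair_counts = Counter()
for factors in accidents:
    pair_counts.update(combinations(sorted(factors), 2))

for pair, count in pair_counts.most_common(3):
    print(f"{pair}: {count}")
```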

Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

2014-01-01

201

Uncertainty analysis of preclosure accident doses for the Yucca Mountain repository  

SciTech Connect

This study presents a generic methodology that can be used to evaluate the uncertainty in the calculated accidental offsite doses at the Yucca Mountain repository during the preclosure period. For demonstration purposes, this methodology is applied to two specific accident scenarios: the first involves a crane dropping an open container with consolidated fuel rods; the second involves container failure during emplacement or removal operations. The uncertainties of thirteen parameters are quantified by various types of probability distributions. The Latin Hypercube Sampling method is used to evaluate the uncertainty of the offsite dose. For the crane-drop scenario with concurrent filter failure, the doses due to the release of airborne fuel particles are calculated to be 0.019, 0.32, and 2.8 rem at confidence levels of 10%, 50%, and 90%, respectively. For the container-failure scenario with concurrent filter failure, the 90% confidence-level dose is 0.21 rem. 20 refs., 4 figs., 3 tabs.
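The confidence-level doses quoted above are essentially percentiles of the sampled dose distribution; a minimal sketch of that summary step, using synthetic dose samples rather than the study's results, is:

```python
# Minimal sketch of summarizing sampled offsite doses as confidence-level
# estimates. The sampled dose values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
doses_rem = rng.lognormal(mean=np.log(0.3), sigma=1.5, size=200)  # hypothetical sample

for level in (10, 50, 90):
    print(f"{level}% confidence-level dose: {np.percentile(doses_rem, level):.3f} rem")
```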

Ma, C.W.; Miller, D.D.; Zavoshy, S.J. [Bechtel National, Inc., San Francisco, CA (USA); Jardine, L.J. [Jardine (L.J.) and Associates, Livermore, CA (USA)

1990-12-31

202

Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 1: Sections 1-9  

SciTech Connect

This report documents the methodology, computational framework, and results of facility accident analyses performed for the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. The methodology is in compliance with the most recent guidance from DOE. It considers the spectrum of accident sequences that could occur in activities covered by the WM PEIS and uses a graded approach emphasizing the risk-dominant scenarios to facilitate discrimination among the various WM PEIS alternatives. Although it allows reasonable estimates of the risk impacts associated with each alternative, the main goal of the accident analysis methodology is to allow reliable estimates of the relative risks among the alternatives. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J. [and others

1995-04-01

203

Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines.  

PubMed

Investigation and analysis of accidents are critical elements of safety management. The overriding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as to seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (one fatal injury and one serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

Baka, Aikaterini D; Uzunoglu, Nikolaos K

2014-09-01

204

Analysis of Severe Accident Scenarios in APR-1400 Using the MAAP4 Code  

SciTech Connect

An analysis of containment environments during a postulated severe accident was performed for a pressurized water reactor (PWR) type nuclear power plant (NPP) with an electric power of 1400 MWe. Four initiating events were examined: large-break loss-of-coolant accident (LBLOCA), small-break loss-of-coolant accident (SBLOCA), total loss of feedwater (TLOFW) and station blackout (SBO). These events were selected based on their risk dominance. The accident progression is divided into four phases in accordance with the phenomena occurring in the reactor and containment. Several scenarios were established in order to obtain the most severe conditions in each phase. More than a dozen scenarios were analyzed, and 10 parameters were closely examined: maximum core temperature, gas temperatures at the core exit and hot leg, pressurizer (PZR) pressure and temperature, pressure, temperature and hydrogen concentration in each compartment of the containment building, in-containment refueling water storage tank (IRWST) level, and gas temperature in the reactor cavity. The maximum temperature and hydrogen concentration were found to vary with the initiating events and compartment locations. (authors)

Jeong, Ji Hwan [Dept of Environmental System, Cheonan College of Foreign Studies, Anseo-dong, Cheonan, Choongnam, 330-705 (Korea, Republic of); Na, M.G.; Kim, S.P. [Dept. of Nuclear Engineering, Chosun University, Susuk-dong, Dong-gu, Gwangju, 501-825 (Korea, Republic of); Park, Jong Woon [Korea Electric Power Research Institute, Moonji-dong, Yusong-gu, Taejon, 305-380 (Korea, Republic of)

2002-07-01

205

Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
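A minimal sketch of this sampling-based sensitivity approach is given below: sampled inputs are regressed against a consequence measure and ranked by standardized regression coefficients. The variable names and the toy response function are assumptions for illustration, not MACCS inputs or results.

```python
# Minimal sketch of sampling-based sensitivity analysis: regress sampled inputs
# against a predicted consequence and rank inputs by standardized regression
# coefficients (SRC). Inputs, names, and the response are placeholders.
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = rng.random((n, 3))  # three imprecisely known inputs scaled to [0, 1]
names = ["cs_retention_fraction", "soil_to_pasture_transfer", "root_zone_depletion_rate"]

# Hypothetical response standing in for a MACCS consequence measure.
y = 5.0 * X[:, 0] + 1.5 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.3, n)

Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, c in sorted(zip(names, coef), key=lambda t: -abs(t[1])):
    print(f"{name:28s} SRC = {c:+.2f}")
```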

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D.; Rollstin, J.A. [GRAM, Inc., Albuquerque, NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

206

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to

F. T. Harper; M. L. Young; L. A. Miller

1995-01-01

207

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report  

Microsoft Academic Search

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate

L. H. J. Goossens; B. C. P. Kraan; R. M. Cooke; J. Boardman; J. A. Jones; F. T. Harper; M. L. Young; S. C. Hora

1997-01-01

208

A Longitudinal Analysis of the Causal Factors in Major Aviation Accidents in the USA from 1976 to 2006  

E-print Network

This longitudinal analysis examines the roles that technical failures, environmental factors and other causal factors play in adverse events, drawing on accident and incident reports, and considers whether these causal factors have changed over the period from 1976 to 2006.

Johnson, Chris

209

Overview of Sandia National Laboratories and Khlopin Radium Institute collaborative radiological accident consequence analysis efforts  

SciTech Connect

In January, 1995 a collaborative effort to improve radiological consequence analysis methods and tools was initiated between the V.G. Khlopin Institute (KRI) and Sandia National Laboratories (SNL). The purpose of the collaborative effort was to transfer SNL's consequence analysis methods to KRI and identify opportunities for collaborative efforts to solve mutual problems relating to the safety of radiochemical facilities. A second purpose was to improve SNL's consequence analysis methods by incorporating the radiological accident field experience of KRI scientists (e.g. the Chernobyl and Kyshtym accidents). The initial collaborative effort focused on the identification of: safety criteria that radiochemical facilities in Russia must meet; analyses/measures required to demonstrate that safety criteria have been met; and data required to complete the analyses/measures identified to demonstrate the safety basis of a facility.

Young, M.L.; Carlson, D.D. [Sandia National Labs., Albuquerque, NM (United States); Lazarev, L.N.; Petrov, B.F.; Romanovskiy, V.N. [V.G. Khlopin Radium Inst., St. Petersburg (Russian Federation)

1997-05-01

210

Nutrient analysis methodology: a review of the DINE developmental literature.  

PubMed

In 1986, a collaborative effort among professional associations resulted in the publication of Worksite Nutrition: A Decision Maker's Guide (The American Dietetic Association, 1986). The booklet describes nutrient analysis methodology as a good "promotional gimmick". The development of DINE was an effort to move nutrient analysis from the gimmick level to the level of a viable educational component. A few examples of the innovative effects of this methodology are: (1) individuals, using their own data, can learn energy balance by monitoring their food intake and physical activity; (2) individuals can learn the Dietary Goals for the United States (U.S. Senate Select Subcommittee on Nutrition and Human Needs, 1977) and can graphically compare how their diet approximates or differs from these goals; and (3) individuals can also learn, from verifications of their own food records, which of their food selections were high in calories, total fat, saturated fat, and cholesterol, and low in complex carbohydrates and dietary fiber. Alternative healthful food choices are identified, and the effects of reducing or increasing portion sizes are described. The DINE development team has been working for the past eight years to decrease nutrient analysis variability so that the procedure can be used as an effective independent measure to improve nutritional behavior. Research has been conducted related to database validity and reliability. Formative and process evaluations have been conducted to improve interactive aspects of the software and related manuals and books. DINE procedures have been modified for ease of use, in general, and specifically for elementary students and university students. PMID:2516071

Dennison, D; Dennison, K F

1989-12-01

211

Methodology assessment and recommendations for the Mars science laboratory launch safety analysis.  

SciTech Connect

The Department of Energy has assigned to Sandia National Laboratories the responsibility of producing a Safety Analysis Report (SAR) for the plutonium-dioxide fueled Multi-Mission Radioisotope Thermoelectric Generator (MMRTG) proposed to be used in the Mars Science Laboratory (MSL) mission. The National Aeronautics and Space Administration (NASA) is anticipating a launch in fall of 2009, and the SAR will play a critical role in the launch approval process. As in past safety evaluations of MMRTG missions, a wide range of potential accident conditions differing widely in probability and severity must be considered, and the resulting risk to the public will be presented in the form of probability distribution functions of health effects in terms of latent cancer fatalities. The basic descriptions of accident cases will be provided by NASA in the MSL SAR Databook for the mission, and on the basis of these descriptions, Sandia will apply a variety of sophisticated computational simulation tools to evaluate the potential release of plutonium dioxide, its transport to human populations, and the consequent health effects. The first step in carrying out this project is to evaluate the existing computational analysis tools (computer codes) for suitability to the analysis and, when appropriate, to identify areas where modifications or improvements are warranted. The overall calculation of health risks can be divided into three levels of analysis. Level A involves detailed simulations of the interactions of the MMRTG or its components with the broad range of insults (e.g., shrapnel, blast waves, fires) posed by the various accident environments. There are a number of candidate codes for this level; they are typically high-resolution computational simulation tools that capture details of each type of interaction and that can predict damage and plutonium dioxide release for a range of choices of controlling parameters. Level B utilizes these detailed results to study many thousands of possible event sequences and to build up a statistical representation of the releases for each accident case. A code to carry out this process will have to be developed or adapted from previous MMRTG missions. Finally, Level C translates the release (or "source term") information from Level B into public risk by applying models for atmospheric transport and the health consequences of exposure to the released plutonium dioxide. A number of candidate codes for this level of analysis are available. This report surveys the range of available codes and tools for each of these levels and makes recommendations for which choices are best for the MSL mission. It also identifies areas where improvements to the codes are needed. In some cases a second tier of codes may be identified to provide supporting or clarifying insight about particular issues. The main focus of the methodology assessment is to identify a suite of computational tools that can produce a high-quality SAR that can be successfully reviewed by external bodies (such as the Interagency Nuclear Safety Review Panel) on the schedule established by NASA and DOE.
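The "Level B" step described above is essentially a Monte Carlo aggregation over event sequences; a minimal sketch, with toy damage and release models standing in for the mission-specific analyses, is:

```python
# Minimal sketch of aggregating many randomized event sequences into a
# statistical representation of releases for one accident case. The impact and
# release models, threshold, and inventory are toy placeholders.
import numpy as np

rng = np.random.default_rng(7)
n_sequences = 10_000

impact_energy = rng.lognormal(mean=2.0, sigma=0.8, size=n_sequences)        # arbitrary units
release_fraction = np.where(impact_energy > 10.0,                           # release only above a threshold
                            rng.beta(2, 50, size=n_sequences), 0.0)

inventory_g = 4_800.0                      # hypothetical PuO2 inventory [g]
releases_g = release_fraction * inventory_g

print(f"P(any release)  = {np.mean(releases_g > 0):.3f}")
print(f"mean release    = {releases_g.mean():.2f} g")
print(f"95th percentile = {np.percentile(releases_g, 95):.2f} g")
```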

Sturgis, Beverly Rainwater; Metzinger, Kurt Evan; Powers, Dana Auburn; Atcitty, Christopher B.; Robinson, David B; Hewson, John C.; Bixler, Nathan E.; Dodson, Brian W.; Potter, Donald L.; Kelly, John E.; MacLean, Heather J.; Bergeron, Kenneth Donald (Sala & Associates); Bessette, Gregory Carl; Lipinski, Ronald J.

2006-09-01

212

Accident Analysis and Prevention 40 (2008) 1244-1248 Short communication  

E-print Network

This short communication concerns intervention analysis (Box and Tiao, 1976; Cook and Campbell, 1979, Chapter 6; Hipel and McLeod, 1994).

McLeod, Ian

213

[Analysis of radiation-hygienic and medical consequences of the Chernobyl accident].  

PubMed

More than 25 years have passed since the Chernobyl accident of 1986. Fourteen subjects of the Russian Federation, with a total contaminated area of more than 50 thousand km2 where 1.5 million people now reside, were exposed to radioactive contamination. A system for comprehensive evaluation of the radiation doses of the population affected by the Chernobyl accident, including 11 guidance documents, has now been created. Methodological support is provided for assessing the average annual, accumulated and predicted radiation doses of the population and its critical groups, as well as doses to the thyroid gland. The continued relevance of analyzing the consequences of the Chernobyl accident is demonstrated by the events in Japan at the Fukushima-1 nuclear power plant. In 2011-2012, comprehensive maritime expeditions were carried out under the auspices of the Russian Geographical Society with the participation of relevant ministries and agencies and leading academic institutions of Russia. In 2012, work was carried out on radiation protection of the population from the potential transboundary impact of the accident at the Japanese nuclear power plant Fukushima-1. The results provide a basis for a favorable outlook for the radiation environment in the Russian Far East and on the Pacific coast of Russia. PMID:24340594

Onishchenko, G G

2013-01-01

214

Development and application of proton NMR methodology to lipoprotein analysis  

NASA Astrophysics Data System (ADS)

The present thesis describes the development of 1H NMR spectroscopy and its applications to lipoprotein analysis in vitro, utilizing biochemical prior knowledge and advanced lineshape fitting analysis in the frequency domain. A method for absolute quantification of lipoprotein lipids and proteins directly from the terminal methyl-CH3 resonance region of 1H NMR spectra of human blood plasma is described. Then the use of NMR methodology in time course studies of the oxidation process of LDL particles is presented. The function of the cholesteryl ester transfer protein (CETP) in lipoprotein mixtures was also assessed by 1H NMR, which allows for dynamic follow-up of the lipid transfer reactions between VLDL, LDL, and HDL particles. The results corroborated the suggestion that neutral lipid mass transfer among lipoproteins is not an equimolar heteroexchange. A novel method for studying lipoprotein particle fusion is also demonstrated. It is shown that the progression of proteolytically (α-chymotrypsin) induced fusion of LDL particles can be followed by 1H NMR spectroscopy and, moreover, that fusion can be distinguished from aggregation. In addition, NMR methodology was used to study the changes in HDL3 particles induced by phospholipid transfer protein (PLTP) in HDL3 + PLTP mixtures. The 1H NMR study revealed a gradual production of enlarged HDL particles, which demonstrated that PLTP-mediated remodeling of HDL involves fusion of the HDL particles. These applications demonstrated that the 1H NMR approach offers several advantages both in quantification and in time course studies of lipoprotein-lipoprotein interactions and of enzyme/lipid transfer protein function.

Korhonen, Ari Juhani

1998-11-01

215

Human and organisational factors in maritime accidents: analysis of collisions at sea using the HFACS.  

PubMed

Over the last decade, the shipping industry has implemented a number of measures aimed at improving its safety level (such as new regulations or new forms of team training). Despite this evolution, shipping accidents, and particularly collisions, remain a major concern. This paper presents a modified version of the Human Factors Analysis and Classification System, which has been adapted to the maritime context and used to analyse human and organisational factors in collisions reported by the Marine Accident and Investigation Branch (UK) and the Transportation Safety Board (Canada). The analysis shows that most collisions are due to decision errors. At the precondition level, it highlights the importance of the following factors: poor visibility and misuse of instruments (environmental factors), loss of situation awareness or deficit of attention (conditions of operators), deficits in inter-ship communications or Bridge Resource Management (personnel factors). At the leadership level, the analysis reveals the frequent planning of inappropriate operations and non-compliance with the Safety Management System (SMS). The Multiple Accident Analysis provides an important finding concerning three classes of accidents. Inter-ship communications problems and Bridge Resource Management deficiencies are closely linked to collisions occurring in restricted waters and involving pilot-carrying vessels. Another class of collisions is associated with situations of poor visibility, in open sea, and shows deficiencies at every level of the socio-technical system (technical environment, condition of operators, leadership level, and organisational level). The third class is characterised by non-compliance with the SMS. This study shows the importance of Bridge Resource Management for situations of navigation with a pilot on board in restricted waters. It also points out the necessity to investigate, for situations of navigation in open sea, the masters' decisions in critical conditions as well as the causes of non-compliance with SMS. PMID:23764875

Chauvin, Christine; Lardjane, Salim; Morel, Gaël; Clostermann, Jean-Pierre; Langard, Benoît

2013-10-01

216

Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D. [GRAM, Inc., Albuquerque, NM (United States); McKay, M.D. [Los Alamos National Lab., NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

217

Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H  

SciTech Connect

This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J. [and others

1995-04-01

218

Space Shuttle Columbia Post-Accident Analysis and Investigation  

NASA Technical Reports Server (NTRS)

Although the loss of the Space Shuttle Columbia and its crew was tragic, the circumstances offered a unique opportunity to examine a multitude of components which had experienced one of the harshest environments ever encountered by engineered materials: a breakup at a velocity in excess of Mach 18 and an altitude exceeding 200,000 feet (63 km), resulting in a debris field 645 miles (1,038 km) long and 10 miles (16 km) wide. Various analytical tools were employed to ascertain the sequence of events leading to the disintegration of the Orbiter and to characterize the features of the debris. The testing and analyses all indicated that a breach in a left-wing reinforced carbon/carbon composite leading edge panel was the access point for hot gases generated during re-entry to penetrate the structure of the vehicle and compromise the integrity of the materials and components in that area of the Shuttle. The analytical and elemental testing utilized such techniques as X-Ray Diffraction (XRD), Energy Dispersive X-Ray (EDX) dot mapping, Electron Micro Probe Analysis (EMPA), and X-Ray Photoelectron Spectroscopy (XPS) to characterize the deposition of intermetallics adjacent to the suspected location of the plasma breach in the leading edge of the left wing, Fig. 1.

McDanels, Steven J.

2006-01-01

219

Risk analysis and risk management for offshore platforms: Lessons from the Piper Alpha accident  

SciTech Connect

A probabilistic risk analysis (PRA) framework is used to identify the accident sequence of the 1988 Piper Alpha accident. This framework is extended to include the human decisions and actions that have influenced the occurrences of these basic events, and their organizational roots. The results of this preliminary analysis allow identification of a wide spectrum of possible risk reduction measures, ranging from classical technical solutions such as addition of redundancies, to organizational improvements such as a change in the maintenance procedures. An explicit PRA model is then developed to assess the benefits of some of these safety measures based, first, on the original contribution to the overall risk of the failure modes that these measures are designed to avert, and second, on the degree to which they can reduce the probabilities of these failure modes. PRA can then be used as a management tool, allowing optimization of risk management strategies based both on the qualitative information about causalities provided by the accident, and on the quantitative information about failure probabilities updated in the light of new events. It is shown how PRA can be used to assess, for example, the cost-effectiveness of safety measures designed to decrease the probability of severe fire damage onboard platforms similar to Piper Alpha.
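The cost-effectiveness comparison described above can be sketched as follows; the failure-mode frequencies, reduction factors, losses and costs are illustrative assumptions, not values from the Piper Alpha analysis.

```python
# Minimal sketch of ranking candidate safety measures by risk averted per unit
# cost: each measure reduces the probability of a failure mode, and the averted
# expected loss is compared with its annual cost. All numbers are placeholders.
measures = [
    # (name, annual frequency of failure mode, fractional reduction, expected loss [$], cost [$/yr])
    ("Fire-water system redundancy", 1.0e-3, 0.60, 2.0e9, 4.0e5),
    ("Revised permit-to-work procedure", 2.0e-3, 0.40, 2.0e9, 1.0e5),
]

for name, freq, reduction, loss, cost in measures:
    risk_averted = freq * reduction * loss        # expected loss averted per year [$]
    print(f"{name:35s} averted risk / cost = {risk_averted / cost:5.1f}")
```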

Pate-Cornell, M.E. (Stanford Univ, CA (United States). Dept. of Industrial Engineering and Engineering Management)

1993-08-01

220

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA internal dosimetry models.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1998-04-01

221

Preliminary analysis of graphite dust releasing behavior in accident for HTR  

SciTech Connect

The behavior of graphite dust is important to the safety of High Temperature Gas-cooled Reactors. This study investigated the flow of graphite dust in the helium mainstream. Analysis of the stresses acting on the graphite dust indicated that gas drag plays the dominant role. Based on this understanding of the importance of gas drag, an experimental system was set up to study dust release behavior under accident conditions. Air driven by a centrifugal fan is used as the working fluid instead of helium, because helium is expensive and prone to leakage, which makes the loop difficult to seal. Graphite particles with the same size distribution as in an HTR are added to the experimental loop. The graphite dust release behavior during a loss-of-coolant accident will be investigated using a sonic nozzle. (authors)

Peng, W.; Yang, X. Y.; Yu, S. Y.; Wang, J. [Inst. of Nuclear and New Energy Technology, Tsinghua Univ., Beijing 100084 (China)

2012-07-01

222

Safety culture and accident analysis--a socio-management approach based on organizational safety social capital.  

PubMed

One of the biggest challenges for organizations in today's competitive business environment is to create and preserve a self-sustaining safety culture. Typically, the key drivers of safety culture in many organizations are regulation, audits, safety training, various types of employee exhortations to comply with safety norms, etc. However, less evident factors, such as networking relationships and social trust amongst employees, as well as the extended networking relationships and social trust of organizations with external stakeholders like government, suppliers, and regulators, which together constitute the organization's safety social capital, also seem to influence the sustenance of organizational safety culture. Can erosion in safety social capital cause deterioration in safety culture and contribute to accidents? If so, how does it contribute? As existing accident analysis models do not provide answers to these questions, CAMSoC (Curtailing Accidents by Managing Social Capital), an accident analysis model, is proposed. As an illustration, five accidents, Bhopal (India), Hyatt Regency (USA), Tenerife (Canary Islands), Westray (Canada), and Exxon Valdez (USA), have been analyzed using CAMSoC. This limited cross-industry analysis provides two key socio-management insights: the biggest source of motivation that causes deviant behavior leading to accidents is 'Faulty Value Systems'; the second biggest source is 'Enforceable Trust'. From a management control perspective, deterioration in safety culture and the resultant accidents are due more to 'action controls' than to explicit 'cultural controls'. Future research directions to enhance the model's utility through layering are addressed briefly. PMID:16911855

Rao, Suman

2007-04-11

223

A comparative analysis of methodologies for database schema integration  

Microsoft Academic Search

One of the fundamental principles of the database approach is that a database allows a nonredundant, unified representation of all data managed in an organization. This is achieved only when methodologies are available to support integration across organizational and application boundaries. Methodologies for database design usually perform the design activity by separately producing several schemas, representing parts of the application,

Carlo Batini; Maurizio Lenzerini; Shamkant B. Navathe

1986-01-01

224

APPLICATION OF RESPONSE SURFACE METHODOLOGY IN NUMERICAL GEOTECHNICAL ANALYSIS  

Microsoft Academic Search

Concepts and techniques of Response Surface Methodology (RSM) have been extensively applied in many branches of engineering, especially in the chemical and manufacturing areas. This paper presents an application of the methodology in Geotechnical Engineering, in which evaluating the significance of a large number of factors or parameters and their interactions, as well as obtaining a simple relationship that defines

Neda Zangeneh; Alireza Azizian; Leonard Lye; Radu Popescu

2002-01-01

225

Robustness of an uncertainty and sensitivity analysis of early exposure results with the MACCS reactor accident consequence model  

Microsoft Academic Search

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis were used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The following results were obtained in tests to check the robustness of the analysis techniques: two independent Latin hypercube

J. C. Helton; J. D. Johnson; M. D. McKay; A. W. Shiver; J. L. Sprung

1995-01-01
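
A minimal sketch of the sampling-based workflow named in this record, Latin hypercube sampling followed by rank correlation, is given below. The three input variables and the response function are illustrative stand-ins, not MACCS variables; stepwise regression is omitted for brevity, and numpy and scipy are assumed to be available.

# Minimal sketch of the sampling-based sensitivity workflow the abstract describes:
# Latin hypercube sampling of uncertain inputs, a surrogate "consequence" model,
# and rank (Spearman) correlations between inputs and output. The inputs and the
# response function are illustrative stand-ins, not MACCS variables.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n, names = 100, ["source_term", "dispersion", "dose_factor"]

def latin_hypercube(n_samples, n_vars):
    """One stratified sample per interval, independently permuted per variable."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])
    return u

u = latin_hypercube(n, len(names))
x = 10.0 ** (u * 2.0 - 1.0)                    # map uniform strata to log-uniform [0.1, 10]
y = x[:, 0] * x[:, 1] ** 0.5 + 0.1 * x[:, 2]   # toy consequence model

for j, name in enumerate(names):
    rho, _ = spearmanr(x[:, j], y)
    print(f"{name:12s} rank correlation with output: {rho:+.2f}")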

226

Fire risk analysis for nuclear power plants: Methodological developments and applications  

Microsoft Academic Search

A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis

Mardyros Kazarians; G. Apostolakis; N. O. Siv

1985-01-01

227

Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)  

SciTech Connect

The purposes of this volume of the LWRHU SAR, the Accident Model Document (AMD), are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; Provide estimates of occurrence probabilities associated with these various accidents; Evaluate the response of the LWRHU (or its components) to the resultant accident environments; and Associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

Johnson, E.W.

1988-10-01

228

Routes to failure: Analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system  

Microsoft Academic Search

The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents, however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring

Wen-Chin Li; Don Harris; Chung-San Yu

2008-01-01

229

ADHD and relative risk of accidents in road traffic: a meta-analysis.  

PubMed

The present meta-analysis is based on 16 studies comprising 32 results. These studies provide sufficient data to estimate relative accident risks of drivers with ADHD. The overall estimate of relative risk for drivers with ADHD is 1.36 (95% CI: 1.18; 1.57) without control for exposure, 1.29 (1.12; 1.49) when correcting for publication bias, and 1.23 (1.04; 1.46) when controlling for exposure. A relative risk (RR) of 1.23 is exactly the same as found for drivers with cardiovascular diseases. The long-lasting assertion that "ADHD-drivers have an almost fourfold risk of accident compared to non-ADHD-drivers", which originated from Barkley et al.'s study of 1993, is rebutted. That estimate was associated with comorbid Oppositional Defiant Disorder (ODD) and/or Conduct Disorder (CD), not with ADHD, but the assertion has incorrectly been maintained for two decades. The present study provides some support for the hypothesis that the relative accident risk of ADHD-drivers with comorbid ODD, CD and/or other conduct problems, is higher than that of ADHD-drivers without these comorbidities. The estimated RRs were 1.86 (1.27; 2.75) in a sample of ADHD-drivers in which a majority had comorbid ODD and/or CD compared to 1.31 (0.96; 1.81) in a sample of ADHD-drivers with no comorbidity. Given that ADHD-drivers most often seem to drive more than controls, and the fact that a majority of the present studies lack information about exposure, it seems more probable that the true RR is lower rather than higher than 1.23. Also the assertion that ADHD-drivers violate traffic laws more often than other drivers should be modified: ADHD-drivers do have more speeding violations, but no more drunk or reckless driving citations than drivers without ADHD. All accident studies included in the meta-analysis fail to acknowledge the distinction between deliberate violations and driving errors. The former are known to be associated with accidents, the latter are not. A hypothesis that ADHD-drivers speed more frequently than controls because it stimulates attention and reaction time is suggested. PMID:24238842

Vaa, Truls

2014-01-01
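
The pooled relative-risk figures quoted in this record are the kind of estimate produced by inverse-variance weighting of study results on the log scale. The sketch below shows that standard fixed-effect calculation with three hypothetical study results; it is not a reconstruction of the author's 16-study data set.

# Minimal sketch of inverse-variance pooling of relative risks on the log scale,
# the standard fixed-effect calculation behind pooled estimates such as
# RR = 1.36 (95% CI 1.18-1.57). The three study results below are hypothetical
# illustrations, not the studies analysed in the meta-analysis.
import math

studies = [        # (relative risk, 95% CI lower, 95% CI upper)
    (1.50, 1.10, 2.05),
    (1.20, 0.95, 1.52),
    (1.40, 1.05, 1.87),
]

weights, weighted_logs = [], []
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE recovered from the CI width
    w = 1.0 / se ** 2                                  # inverse-variance weight
    weights.append(w)
    weighted_logs.append(w * log_rr)

pooled_log = sum(weighted_logs) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se), math.exp(pooled_log + 1.96 * pooled_se))
print(f"Pooled RR = {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")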

230

Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11--14)  

SciTech Connect

This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

Whitehead, D. [Sandia National Labs., Albuquerque, NM (United States); Darby, J. [Science and Engineering Associates, Inc., Albuquerque, NM (United States); Yakle, J. [Science Applications International Corp., Albuquerque, NM (United States)] [and others

1994-06-01

231

Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident  

SciTech Connect

An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.

Aldrich, D.C.; Blond, R.M.

1980-01-01
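
The cost-benefit ratio reported in this record (dollars per thyroid nodule prevented) is, in its simplest form, the expected cost of the protective measure divided by the expected number of nodules it averts. The sketch below shows that arithmetic with placeholder values; none of the numbers are taken from the WASH-1400 consequence model.

# Minimal sketch of the cost-benefit ratio ($ per thyroid nodule prevented) the
# abstract reports. Every number below is a hypothetical placeholder, not a value
# from the WASH-1400 consequence model.

population           = 100_000     # people within the distribution radius
cost_per_person      = 5.0         # $ per person per year to stock and distribute KI
accident_probability = 1.0e-4      # annual probability of a release requiring KI
nodules_no_ki        = 50.0        # expected thyroid nodules in this population if the release occurs and no KI is taken
ki_effectiveness     = 0.90        # fraction of thyroid dose (and nodules) averted by KI

annual_cost = population * cost_per_person
expected_nodules_averted = accident_probability * nodules_no_ki * ki_effectiveness

ratio = annual_cost / expected_nodules_averted
print(f"Cost-benefit ratio: ${ratio:,.0f} per thyroid nodule prevented (per year)")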

232

The Analysis of PWR SBO Accident with RELAP5 Based on Linux  

NASA Astrophysics Data System (ADS)

RELAP5 is a relatively advanced transient thermal-hydraulic analysis code for light water reactors, and safety analysis and operating simulation performed with RELAP5 are significant for the safe operation of a nuclear reactor system. This paper presents a mode of operating RELAP5 on the Linux operating system, using Linux's powerful text-processing capabilities to extract the valid data directly from the RELAP5 output file, and taking advantage of the system's scripting capabilities to improve the plotting functions of RELAP5. When run under Linux, the precision of the calculated results is preserved and the computing time is shortened. In this work, a PWR Station Blackout (SBO) accident was calculated with RELAP5 under both Linux and Windows. Comparison and analysis of the accident response curves of the main parameters, such as reactor power and the average temperature and pressure of the primary loop, show that operating analysis of the nuclear reactor system with RELAP5 under Linux is safe and reliable.

Xia, Zhimin; Zhang, Dafa
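
The post-processing idea in this record, using the operating system's text-processing facilities to pull valid data out of the code's output file for plotting, can be sketched as below. The assumed file layout (a header line naming the variables followed by whitespace-separated numeric rows) is a simplification; a real RELAP5 output file would need a parser matched to its actual edit format, and the variable name passed on the command line is hypothetical.

# Minimal sketch of the post-processing idea in the abstract: pull a time series for
# one parameter out of a large plain-text code output and hand it to a plotting tool.
# The column layout assumed here (whitespace-separated numeric rows following a header
# that names the variables) is a simplification of a real RELAP5 output file.
import re
import sys

def extract_series(path, variable):
    times, values = [], []
    column = None
    with open(path) as f:
        for line in f:
            tokens = line.split()
            if variable in tokens:              # header row naming the variables
                column = tokens.index(variable)
                continue
            if column is not None and tokens and re.match(r"^[-+0-9.eE]+$", tokens[0]):
                try:
                    times.append(float(tokens[0]))
                    values.append(float(tokens[column]))
                except (ValueError, IndexError):
                    column = None               # left the numeric block
    return times, values

if __name__ == "__main__":
    # e.g. python extract.py output.txt pressure   (file name and variable are hypothetical)
    t, p = extract_series(sys.argv[1], sys.argv[2])
    for ti, pi in zip(t, p):
        print(ti, pi)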

233

Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document  

SciTech Connect

Purposes of this volume (AMD) are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

Johnson, E.W.

1985-10-01

234

Worldwide Husbanding Process Improvement: Comparative Analysis of Contracting Methodologies.  

National Technical Information Service (NTIS)

This study is designed to support one of three major focus areas in the Naval Supply Systems Command (NAVSUP) Worldwide Husbanding Improvement Process initiative. Existing contracting methodologies were analyzed using the following methods: characteristic...

J. Pitel, M. Gundemir, P. Metzger, R. Manalang

2007-01-01

235

A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?  

NASA Technical Reports Server (NTRS)

In the early years of powered flight, the National Advisory Committee on Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports, and compares the method of analysis proposed therein to the methods used today.

Holloway, C. M.; Johnson, C. W.

2007-01-01

236

Hypothetical accident condition thermal analysis and testing of a Type B drum package  

SciTech Connect

A thermophysical property model developed to analytically determine the thermal response of cane fiberboard when exposed to temperatures and heat fluxes associated with the 10 CFR 71 hypothetical accident condition (HAC) has been benchmarked against two Type B drum package fire test results. The model 9973 package was fire tested after a 30 ft. top down drop and puncture, and an undamaged model 9975 package containing a heater (21W) was fire tested to determine content heat source effects. Analysis results using a refined version of a previously developed HAC fiberboard model compared well against the test data from both the 9973 and 9975 packages.

Hensel, S.J.; Alstine, M.N. Van; Gromada, R.J.

1995-07-01
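
A minimal one-dimensional explicit finite-difference sketch of the kind of transient conduction calculation described in this record is shown below, using the 10 CFR 71 fire temperature of 800 degrees C for 30 minutes. The fiberboard properties, layer thickness, and fixed-temperature hot-face boundary are illustrative assumptions; the benchmarked model in the report also accounts for charring and temperature-dependent properties.

# Minimal sketch of a 1-D explicit finite-difference conduction model of a fiberboard
# overpack layer during the 10 CFR 71 hypothetical accident fire (800 degC for 30 min).
# Material properties, thickness, and the fixed hot-face temperature boundary are
# illustrative assumptions, not the benchmarked thermophysical model in the report.
import numpy as np

k, rho, cp = 0.05, 240.0, 1300.0         # W/m-K, kg/m3, J/kg-K (assumed constants)
alpha = k / (rho * cp)                   # thermal diffusivity, m2/s
thickness, n = 0.05, 26                  # 5 cm layer, 26 nodes
dx = thickness / (n - 1)
dt = 1.0                                 # s; satisfies dt < dx^2 / (2*alpha)

T = np.full(n, 38.0)                     # initial temperature, degC
T_fire, t_fire = 800.0, 30 * 60.0        # regulatory fire temperature and duration

t = 0.0
while t < t_fire:
    T[0] = T_fire                        # simplification: hot face held at fire temperature
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[-1] = T[-2]                        # adiabatic cold face (containment side)
    t += dt

print(f"Cold-face temperature after 30 min fire: {T[-1]:.1f} degC")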

237

A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test  

NASA Astrophysics Data System (ADS)

According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Yet the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method of ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and the NPT conducted by its crew is discussed. The risk analysis methodology in this dissertation consists of three different approaches, and their integration constitutes the big picture of the whole methodology. The first approach is the comparative analysis of a "standard" NPT, which is proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the conceptual framework developed in the previous step and to analyze the impact of different decision making biases on negative pressure test results. Along with the corroborating findings of previous studies, the analysis of the developed conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. It is noteworthy that the organizational factors captured in the conceptual framework are not specific only to the scope of the NPT. Most of these organizational factors have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas related operations as well as high-risk operations in other industries. In addition, the proposed rational decision making model introduces a quantitative structure for analysis of the results of a conducted NPT. This model provides a structure and some parametrically derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test.
Moreover, it enables analysts to assess different decision making biases involved in the process of interpreting a conducted negative pressure test as well as the root organizational factors of those biases. In general, although the proposed integrated research methodology in this dissertation is developed for the risk assessment of human and organizational factors contributions in negative pressure test misinterpretation, it can be generalized and be potentially useful for other well control situations, both offshore and onshore; e.g. fracking. In addition, this methodology can be applied for the analysis

Tabibzadeh, Maryam

238

NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug  

NASA Technical Reports Server (NTRS)

A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure for the 1985-certification test, the 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a cleavage-type failure. For the accident case, the predicted failure load for the right rear lug from the PFA is greater than 1.98 times the limit load of the lugs.

Raju, Ivatury S.; Glaessgen, Edward H.; Mason, Brian H.; Krishnamurthy, Thiagarajan; Davila, Carlos G.

2005-01-01

239

PerfCenter: A Methodology and Tool for Performance Analysis of Application Hosting  

E-print Network

PerfCenter: a methodology and tool for performance analysis of application hosting centers. While tools and methodologies for such analysis have been proposed earlier, our approach models network links between LANs and the contention at those links due to messages exchanged between servers

Apte, Varsha

240

The grande syntagmatique: A methodology for analysis of the montage structure of television narratives  

Microsoft Academic Search

The article presents a methodology for analyzing the montage structure of television programs. The methodology was developed by Christian Metz and originally used in the analysis of films. The results of an analysis of three episodes of Lou Grant are reported, revealing similarities among the montage structures of the episodes. These data are then compared to an episode of an

Michael J. Porter

1982-01-01

241

A methodological procedure for the analysis of the Wenxian covenant texts  

E-print Network

This article introduces a systematic methodological procedure for the analysis of Chinese palaeographic materials, constructed in this instance for the analysis of the Wenxian covenant texts (Wēnxiàn méngshū). The ...

Williams, Crispin

2005-01-01

242

Review Integrating multiple `omics' analysis for microbial biology: application and methodologies  

E-print Network

within a cell over time. However, no single `omics' analysis can fully unravel the complexities of fundamental microbial biology. Therefore, integration of multiple layers of information, the multi-`omics

Economou, Tassos

243

Hospital multifactor productivity: a presentation and analysis of two methodologies.  

PubMed

In response to recent discussions regarding the ability of hospitals to achieve gains in productivity, we present two methodologies that attempt to measure multifactor productivity (MFP) in the hospital sector. We analyze each method and conclude that the inconsistencies in their outcomes make it difficult to estimate a precise level of MFP that hospitals have historically achieved. Our goal in developing two methodologies is to inform the debate surrounding the ability of hospitals to achieve gains in MFP, as well as to highlight some of the challenges that exist in measuring hospital MFP. PMID:18435223

Cylus, Jonathan D; Dickensheets, Bridget A

244

Hospital Multifactor Productivity: A Presentation and Analysis of Two Methodologies  

PubMed Central

In response to recent discussions regarding the ability of hospitals to achieve gains in productivity, we present two methodologies that attempt to measure multifactor productivity (MFP) in the hospital sector. We analyze each method and conclude that the inconsistencies in their outcomes make it difficult to estimate a precise level of MFP that hospitals have historically achieved. Our goal in developing two methodologies is to inform the debate surrounding the ability of hospitals to achieve gains in MFP, as well as to highlight some of the challenges that exist in measuring hospital MFP. PMID:18435223

Cylus, Jonathan D.; Dickensheets, Bridget A.

2007-01-01

245

Natural phenomena risk analysis - an approach for the tritium facilities 5480.23 SAR natural phenomena hazards accident analysis  

SciTech Connect

A Tritium Facilities (TF) Safety Analysis Report (SAR) has been developed which is compliant with DOE Order 5480.23. The 5480.23 SAR upgrades and integrates the safety documentation for the TF into a single SAR for all of the tritium processing buildings. As part of the TF SAR effort, natural phenomena hazards (NPH) were analyzed. A cost effective strategy was developed using a team approach to take advantage of limited resources and budgets. During development of the Hazard and Accident Analysis for the 5480.23 SAR, a strategy was required to allow maximum use of existing analysis and to develop a cost effective graded approach for any new analysis in identifying and analyzing the bounding accidents for the TF. This approach was used to effectively identify and analyze NPH for the TF. The first part of the strategy consisted of evaluating the current SAR for the RTF to determine what NPH analysis could be used in the new combined 5480.23 SAR. The second part was to develop a method for identifying and analyzing NPH events for the older facilities which took advantage of engineering judgment, was cost effective, and followed a graded approach. The second part was especially challenging because of the lack of documented existing analysis considered adequate for the 5480.23 SAR and a limited budget for SAR development and preparation. This paper addresses the strategy for the older facilities.

Cappucci, A.J. Jr.; Joshi, J.R.; Long, T.A.; Taylor, R.P.

1997-07-01

246

Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks  

SciTech Connect

Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist, and are easily accessible to anyone with an Internet connection, minimal technical skills, and a greatly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation, defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, both black hat and white hat. Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.

Bri Rolston

2005-06-01

247

Development of integrated core disruptive accident analysis code for FBR - ASTERIA-FBR  

SciTech Connect

The evaluation of consequences of a severe accident is the most important safety licensing issue for the core of a liquid metal cooled fast breeder reactor (LMFBR), since the LMFBR core is not in its most reactive configuration. This characteristic means that core geometry changes during a core disruptive accident (CDA) could induce super-prompt criticality. Previous CDA analysis codes modeled the accident in separate phases according to the mechanism driving super-prompt criticality, with the subsequent events calculated by connecting different codes. This scheme, however, introduces uncertainty and/or arbitrariness into the calculation results. To resolve these issues and obtain consistent calculation results without such arbitrariness, JNES is developing the ASTERIA-FBR code to provide a cross-check analysis capability, which is required in the safety licensing procedure to confirm the validity of the evaluation results prepared by applicants for the planned high-performance core of Monju. ASTERIA-FBR consists of three major calculation modules: CONCORD, dynamic-GMVP, and FEMAXI-FBR. CONCORD is a three-dimensional thermal-hydraulics calculation module with a multi-phase, multi-component, multi-velocity-field model. Dynamic-GMVP is a space-time neutronics calculation module. FEMAXI-FBR calculates fuel pellet deformation behavior and fuel pin failure behavior. This paper describes the need for ASTERIA-FBR development, outlines the major modules, and summarizes the model validation status. (authors)

Ishizu, T.; Endo, H.; Tatewaki, I.; Yamamoto, T. [Japan Nuclear Energy Safety Organization JNES, Toranomon Towers Office, 4-1-28, Toranomon, Minato-ku, Tokyo (Japan); Shirakawa, N. [Inst. of Applied Energy IAE, Shimbashi SY Bldg., 14-2 Nishi-Shimbashi 1-Chome, Minato-ku, Tokyo (Japan)

2012-07-01

248

Fire accident analysis modeling in support of non-reactor nuclear facilities at Sandia National Laboratories  

SciTech Connect

The Department of Energy (DOE) requires that fire hazard analyses (FHAs) be conducted for all nuclear and new facilities, with results for the latter incorporated into Title I design. For those facilities requiring safety analysis documentation, the FHA shall be documented in the Safety Analysis Reports (SARs). This paper provides an overview of the methodologies and codes being used to support FHAs at Sandia facilities, with emphasis on SARs.

Restrepo, L.F.

1993-06-01

249

Process hazards analysis (PrHA) program, bridging accident analyses and operational safety  

SciTech Connect

Recently the Final Safety Analysis Report (FSAR) for the Plutonium Facility at Los Alamos National Laboratory, Technical Area 55 (TA-55), was revised and submitted to the U.S. Department of Energy (DOE). As a part of this effort, over seventy Process Hazards Analyses (PrHAs) were written and/or revised over the six years prior to the FSAR revision. TA-55 is a research, development, and production nuclear facility that primarily supports U.S. defense and space programs. Nuclear fuels and material research; material recovery, refining and analyses; and the casting, machining and fabrication of plutonium components are some of the activities conducted at TA-55. These operations involve a wide variety of industrial, chemical and nuclear hazards. Operational personnel along with safety analysts work as a team to prepare the PrHA. PrHAs describe the process; identify the hazards; and analyze hazards, including determining hazard scenarios, their likelihood, and consequences. In addition, the interaction of the process with facility systems, structures, and operation-specific protective features is part of the PrHA. This information is rolled up to determine bounding accidents and mitigating systems and structures. Further detailed accident analysis is performed for the bounding accidents and included in the FSAR. The FSAR is part of the Documented Safety Analysis (DSA) that defines the safety envelope for all facility operations in order to protect the worker, the public, and the environment. The DSA is in compliance with the U.S. Code of Federal Regulations, 10 CFR 830, Nuclear Safety Management, and is approved by DOE. The DSA sets forth the bounding conditions necessary for the safe operation of the facility and is essentially a 'license to operate.' Safety of day-to-day operations is based on Hazard Control Plans (HCPs). Hazards are initially identified in the PrHA for the specific operation and act as input to the HCP. Specific protective features important to worker safety are incorporated so the worker can readily identify the safety parameters of their work. System safety tools such as Preliminary Hazard Analysis, What-If Analysis, and Hazard and Operability Analysis, as well as other techniques as necessary, provide the groundwork for determining bounding conditions for facility safety, operational safety, and day-to-day worker safety.

Richardson, J. A. (Jeanne A.); McKernan, S. A. (Stuart A.); Vigil, M. J. (Michael J.)

2003-01-01

250

Unconscious linguistic referents to race: analysis and methodological frameworks  

Microsoft Academic Search

Recent years have seen considerable development in methodological designs for accessing and eliciting unconscious cognitive schemata in response to social stimuli, including race. One design is experimental and involves the priming and automatic activation of schemata. Another design is a specifically developed psycho-linguistic and logico-mathematic method for recognizing, analyzing, and validating unconsciously expressed meaning in verbal narratives, referred to as

Robert E. Haskell

2009-01-01

251

Measurement of sexual aggression in college men: A methodological analysis  

Microsoft Academic Search

Researchers have devoted increased attention in recent years to the measurement of sexual aggression in college populations. This review describes and critically examines current methods of measuring sexual aggression which rely on a self-reported history of such behavior. We suggest that the construct validity of these approaches can be enhanced through a systematic consideration of instrumentation and methodological issues. Twenty-six

James F. Porter; Joseph W. Critelli

1992-01-01

252

Investigation of Pedestrian Accidents Analysis at signalised pedestrian crossings in Edinburgh  

Microsoft Academic Search

Data from STATS 19 show that pedestrian accident rates are higher at pedestrian crossing points, or within 50 meters of pedestrian crossing facilities, than away from them. This is contrary to the expectation that accidents should be lowest at these crossing facilities. This study investigates in more detail the factors that affect accident occurrence at pelican crossings and

Khalfan Alnaqbi

253

Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis  

Microsoft Academic Search

A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected.

M. K. Goldhaber; S. L. Staub; G. K. Tokuhata

1983-01-01

254

Risk and protection factors in fatal accidents  

Microsoft Academic Search

This paper aims at addressing the interest and appropriateness of performing accident severity analyses that are limited to fatal accident data. Two methodological issues are specifically discussed, namely the accident-size factors (the number of vehicles in the accident and their level of occupancy) and the comparability of the baseline risk. It is argued that – although these two issues are

Emmanuelle Dupont; Heike Martensen; Eleonora Papadimitriou; George Yannis

2010-01-01

255

Analysis of the SL-1 Accident Using RELAP5-3D  

SciTech Connect

On January 3, 1961, at the National Reactor Testing Station in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people and destroying the reactor core. The SL-1 reactor, a 3 MW(t) boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are only known with a high level of uncertainty).

Francisco, A.D. and Tomlinson, E. T.

2007-11-08
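
A reactivity excursion of the kind analyzed in this record can be illustrated with a one-delayed-group point-kinetics sketch: a step insertion above prompt critical, adiabatic heat-up, and a linear negative temperature feedback that terminates the transient. Every parameter value below is an illustrative placeholder rather than an SL-1 datum, and the sketch is far simpler than the coupled thermal-hydraulic and neutronic RELAP5-3D model described in the abstract.

# Minimal point-kinetics sketch of a step reactivity insertion above prompt critical.
# One delayed-neutron group, adiabatic heat-up, and a linear fuel-temperature feedback;
# every parameter value is an illustrative placeholder, not an SL-1 datum.

beta, lam, Lambda = 0.0065, 0.08, 1.0e-4   # delayed fraction, decay const (1/s), generation time (s)
rho_0, alpha_T = 0.012, -1.0e-4            # inserted reactivity; feedback coefficient (1/K)
heat_cap = 2.0e4                           # lumped core heat capacity, J/K

n = 1.0                                    # power, W (arbitrary initial level)
c = beta * n / (lam * Lambda)              # equilibrium precursor concentration
temp_rise, energy, peak = 0.0, 0.0, 0.0

dt, t_end = 1.0e-5, 0.6
for _ in range(int(t_end / dt)):
    rho = rho_0 + alpha_T * temp_rise      # net reactivity with feedback
    dn = ((rho - beta) / Lambda * n + lam * c) * dt
    dc = (beta / Lambda * n - lam * c) * dt
    n += dn
    c += dc
    temp_rise += n * dt / heat_cap         # adiabatic lumped heat-up
    energy += n * dt
    peak = max(peak, n)

print(f"Peak power      : {peak:.3e} W")
print(f"Energy yield    : {energy:.3e} J")
print(f"Temperature rise: {temp_rise:.1f} K")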

256

Electrical equipment performance under severe accident conditions (BWR/Mark 1 plant analysis): Summary report  

SciTech Connect

The purpose of the Performance Evaluation of Electrical Equipment during Severe Accident States Program is to determine the performance of electrical equipment, important to safety, under severe accident conditions. In FY85, a method was devised to identify important electrical equipment and the severe accident environments in which the equipment was likely to fail. This method was used to evaluate the equipment and severe accident environments for Browns Ferry Unit 1, a BWR/Mark I. Following this work, a test plan was written in FY86 to experimentally determine the performance of one selected component to two severe accident environments.

Bennett, P.R.; Kolaczkowski, A.M.; Medford, G.T.

1986-09-01

257

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

258

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

259

Analysis of accidents during the mid-loop operating state at a PWR  

SciTech Connect

Studies suggest that the risk of severe accidents during low power operation and/or shutdown conditions could be a significant fraction of the risk at full power operation. The Nuclear Regulatory Commission has begun two risk studies to evaluate the progression of severe accidents during these conditions: one for the Surry plant, a pressurized water reactor (PWR), and the other for the Grand Gulf plant, a boiling water reactor (BWR). This paper summarizes the approach taken for the Level 2/3 analysis at Surry for one plant operating state (POS) during shutdown. The current efforts are focussed on evaluating the risk when the reactor is at mid-loop; this particular POS was selected because of the reduced water inventory and the possible isolation of the loops. The Level 2/3 analyses are conditional on core damage having occurred. Initial results indicate that the conditional consequences can indeed be significant; the defense-in-depth philosophy governing the safety of nuclear power plants is to some extent circumvented because the containment provides only a vapor barrier with no capability for pressure holding, during this POS at Surry. However, the natural decay of the radionuclide inventory provides some mitigation. There are essentially no predicted offsite prompt fatalities even for the most severe releases.

Jo, J.; Lin, C.C.; Mufayi, V.; Neymotin, L.; Nimnual, S.

1992-12-31

260

DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS  

SciTech Connect

Large fuel casks present challenges when evaluating their performance in the Hypothetical Accident Conditions (HAC) specified in the Code of Federal Regulations Title 10 part 71 (10CFR71). Testing is often limited by cost, difficulty in preparing test units, and the limited availability of facilities which can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the hypothetical accident conditions of a 30-foot lateral drop followed by a 40-inch lateral puncture as specified in 10CFR71. The structural integrity of the containment vessel is justified based on the analytical results in comparison with the stress criteria, specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damage caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture is compared with the package test data. The analytical results are in good agreement with the test results.

Wu, T

2008-04-30

261

Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

1997-12-01

262

Analysis of accidents during the mid-loop operating state at a PWR  

SciTech Connect

Studies suggest that the risk of severe accidents during low power operation and/or shutdown conditions could be a significant fraction of the risk at full power operation. The Nuclear Regulatory Commission has begun two risk studies to evaluate the progression of severe accidents during these conditions: one for the Surry plant, a pressurized water reactor (PWR), and the other for the Grand Gulf plant, a boiling water reactor (BWR). This paper summarizes the approach taken for the Level 2/3 analysis at Surry for one plant operating state (POS) during shutdown. The current efforts are focussed on evaluating the risk when the reactor is at mid-loop; this particular POS was selected because of the reduced water inventory and the possible isolation of the loops. The Level 2/3 analyses are conditional on core damage having occurred. Initial results indicate that the conditional consequences can indeed be significant; the defense-in-depth philosophy governing the safety of nuclear power plants is to some extent circumvented because the containment provides only a vapor barrier with no capability for pressure holding, during this POS at Surry. However, the natural decay of the radionuclide inventory provides some mitigation. There are essentially no predicted offsite prompt fatalities even for the most severe releases.

Jo, J.; Lin, C.C.; Mufayi, V.; Neymotin, L.; Nimnual, S.

1992-01-01

263

Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors  

SciTech Connect

The accident that occurred on board the offshore platform Piper Alpha in July 1988 killed 167 people and cost billions of dollars in property damage. It was caused by a massive fire, which was not the result of an unpredictable 'act of God' but of an accumulation of errors and questionable decisions. Most of them were rooted in the organization, its structure, procedures, and culture. This paper analyzes the accident scenario using the risk analysis framework, determines which human decisions and actions influenced the occurrence of the basic events, and then identifies the organizational roots of these decisions and actions. These organizational factors are generalizable to other industries and engineering systems. They include flaws in the design guidelines and design practices (e.g., tight physical couplings or insufficient redundancies), misguided priorities in the management of the tradeoff between productivity and safety, mistakes in the management of the personnel on board, and errors of judgement in the process by which financial pressures are applied on the production sector (i.e., the oil companies' definition of profit centers) resulting in deficiencies in inspection and maintenance operations. This analytical approach allows identification of risk management measures that go beyond the purely technical (e.g., add redundancies to a safety system) and also include improvements of management practices. 18 refs., 4 figs.

Pate-Cornell, M.E. (Stanford Univ., CA (United States))

1993-04-01

264

Calculational methodology and associated uncertainties: Sensitivity and uncertainty analysis of reactor performance parameters  

Microsoft Academic Search

This chapter considers the calculational methodology and associated uncertainties both for the design of large LMFBR's and the analysis of critical assemblies (fast critical experiments) as performed by several groups within the US. Discusses cross-section processing; calculational methodology for the design problem; core physics computations; design-oriented approximations; benchmark analyses; and determination of calculational corrections and associated uncertainties for a critical

E. Kujawski; C. R. Weisbin

1982-01-01

265

Toward a Methodology of Experimental Analysis and Treatment of Aberrant Classroom Behaviors  

Microsoft Academic Search

A behavior analytic methodology for linking assessment and intervention for aberrant classroom behavior is presented. This methodology assists the behavioral consultant and classroom teacher in looking at the environmental variables that support undesirable behavior in three ways: structurally, functionally, or through some combination of the two approaches. Treatment strategies arising from the analysis may then be implemented. The methodology consists

F. Charles Mace; Mary Ann Yankanich; Barbara J. West

1989-01-01

266

Multi-criteria analysis weighting methodology to incorporate stakeholders' preferences in energy and climate policy interactions  

Microsoft Academic Search

Purpose – Evaluation of energy and climate policy interactions is a complex issue, whereas the incorporation of stakeholders' preferences has not been addressed systematically. The purpose of this paper is to present an integrated weighting methodology that has been developed in order to incorporate weighting preferences into an ex ante evaluation of climate and energy policy interactions. Design/methodology/approach – A multi-criteria analysis

Stelios Grafakos; Alexandros Flamos; Vlasis Oikonomou; Dimitrios Zevgolis

2010-01-01

267

Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis  

ERIC Educational Resources Information Center

This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

Kover, Sara T.; Atwood, Amy K.

2013-01-01

268

Cost-Effectiveness Analysis of Prenatal Diagnosis: Methodological Issues and Concerns  

Microsoft Academic Search

With increasing concerns regarding rapidly expanding health care costs, cost-effectiveness analysis (CEA) provides a methodology to assess whether marginal gains from new technology are worth the increased costs. In the arena of prenatal diagnosis, particular methodological and ethical concerns include whether the effects of such testing on individuals other than the patient are included, how termination of pregnancy is included

Aaron B. Caughey

2005-01-01

269

Modeling Techniques for a Risk Analysis Methodology for Software Systems  

E-print Network

conditions hold throughout the operation of the software system. Finally, dynamic fault tree analysis should be performed on the system to analyze each hazard or failure event. Fault tree analysis is a technique

270

The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents  

NASA Technical Reports Server (NTRS)

In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities are actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model is comprised of causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human related causal factors. The preliminary results from the baseline LOCAF model are also presented.

Ancel, Ersin; Shih, Ann T.

2012-01-01
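
The kind of calculation a Bayesian belief network performs, marginalizing a conditional probability table over causal factors to obtain an overall accident likelihood, is sketched below for three binary factors. The factors, their probabilities, and the noisy-OR style conditional table are hypothetical illustrations and are not taken from the LOCAF model.

# Minimal sketch of marginalizing a conditional probability table over binary causal
# factors to get a loss-of-control probability. All factors and probabilities are
# hypothetical illustrations, not LOCAF model values.
from itertools import product

p_factor = {"crew_error": 0.02, "system_failure": 0.005, "adverse_weather": 0.10}

def p_loc_given(crew, system, weather):
    """Hypothetical noisy-OR style CPT for P(LOC | causal factors)."""
    p_not = 1.0 - 1.0e-6                 # small baseline LOC probability
    if crew:    p_not *= (1 - 0.05)
    if system:  p_not *= (1 - 0.20)
    if weather: p_not *= (1 - 0.01)
    return 1 - p_not

p_loc = 0.0
for crew, system, weather in product([True, False], repeat=3):
    p_scenario = (
        (p_factor["crew_error"] if crew else 1 - p_factor["crew_error"])
        * (p_factor["system_failure"] if system else 1 - p_factor["system_failure"])
        * (p_factor["adverse_weather"] if weather else 1 - p_factor["adverse_weather"])
    )
    p_loc += p_scenario * p_loc_given(crew, system, weather)

print(f"Marginal P(loss of control) per flight: {p_loc:.2e}")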

271

Modeling & analysis of core debris recriticality during hypothetical severe accidents in the Advanced Neutron Source Reactor  

SciTech Connect

This paper discusses salient aspects of severe-accident-related recriticality modeling and analysis in the Advanced Neutron Source (ANS) reactor. The development of an analytical capability using the KENO5A-SCALE system is described, including evaluation of suitable nuclear cross-section sets to account for the effects of system geometry, mixture temperature, material dispersion, and other thermal-hydraulic conditions. Benchmarking and validation efforts conducted with KENO5-SCALE and other neutronic codes against critical experiment data are described. Potential deviations and biases resulting from use of the 16-group Hansen-Roach library are shown. A comprehensive test matrix of calculations to evaluate the threat of a criticality event in the ANS is described. Strong dependencies on geometry, material constituents, and thermal-hydraulic conditions are described. The introduction of designed mitigative features is described.

Kim, S.H.; Georgevich, V.; Simpson, D.B.; Slater, C.O.; Taleyarkhan, R.P.

1992-10-01

272

Modeling & analysis of criticality-induced severe accidents during refueling for the Advanced Neutron Source Reactor  

SciTech Connect

This paper describes work done at the Oak Ridge National Laboratory (ORNL) for evaluating the potential and resulting consequences of a hypothetical criticality accident during refueling of the 330-MW Advanced Neutron Source (ANS) research reactor. The development of an analytical capability is described. Modeling and problem formulation were conducted using concepts of reactor neutronic theory for determining power level escalation, coupled with ORIGEN and MELCOR code simulations for radionuclide buildup and containment transport. Gaussian plume transport modeling was done for determining off-site radiological consequences. Nuances associated with modeling this blast-type scenario are described. Analysis results for ANS containment response under a variety of postulated scenarios and containment failure modes are presented. It is demonstrated that individuals at the reactor site boundary will not receive doses beyond regulatory limits for any of the containment configurations studied.

Georgevich, V.; Kim, S.H.; Taleyarkhan, R.P.; Jackson, S.

1992-10-01

273

Accident safety analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility  

SciTech Connect

The purpose of the accident safety analysis is to identify and analyze a range of credible events, their causes and consequences, and to provide technical justification for the conclusions that uranium billets, fuel assemblies, uranium scrap, and chips and fines drums can be safely stored in the 300 Area N Reactor Fuel Fabrication and Storage Facility; that the contaminated equipment, High-Efficiency Particulate Air filters, ductwork, stacks, sewers, and sumps can be cleaned (decontaminated) and/or removed; that the new concretion process in the 304 Building will be able to operate without undue risk to the public, employees, or the environment; and that limited fuel handling and packaging associated with removal of stored uranium is acceptable.

Johnson, D.J.; Brehm, J.R.

1994-01-01

274

Methodology for Computer-Aided Fault Tree Analysis  

Microsoft Academic Search

Fault tree analysis is a systematic, deductive and probabilistic risk assessment tool which elucidates the causal relations leading to a given undesired event. Quantitative fault tree (failure) analysis requires a fault tree and failure data of basic events. Development of a fault tree and subsequent analysis require a great deal of expertise, which may not be available all the time.
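
As a minimal illustration of the quantitative step this abstract refers to, the sketch below evaluates the top-event probability of a small fault tree from basic-event probabilities, assuming independent events; the tree and failure data are invented, not taken from the cited paper.

# Minimal sketch of quantitative fault tree evaluation: compute the top-event
# probability from basic-event probabilities, assuming independent events.
# The tree below is illustrative only.

basic_events = {"pump_fails": 0.01, "valve_sticks": 0.02, "power_loss": 0.005,
                "backup_fails": 0.1}

# Gates are (type, children); children are basic events or other gates.
gates = {
    "no_cooling": ("OR", ["pump_fails", "valve_sticks"]),
    "no_power":   ("AND", ["power_loss", "backup_fails"]),
    "top":        ("OR", ["no_cooling", "no_power"]),
}

def probability(node):
    """Recursively evaluate P(node) for a gate or basic event."""
    if node in basic_events:
        return basic_events[node]
    gate_type, children = gates[node]
    child_p = [probability(c) for c in children]
    if gate_type == "AND":
        p = 1.0
        for q in child_p:
            p *= q
        return p
    # OR gate: 1 - product of complements (independence assumed).
    p_none = 1.0
    for q in child_p:
        p_none *= (1.0 - q)
    return 1.0 - p_none

print(f"Top-event probability: {probability('top'):.5f}")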

R. Ferdous; F. I. Khan; B. Veitch; P. R. Amyotte

2007-01-01

275

Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology  

SciTech Connect

This report, ''Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology'', contains a summary of the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The objective of this report is to present a summary of the LCE analyses' results. These results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel with different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation have been discussed in the ''Disposal Criticality Analysis Methodology Topical Report'' (CRWMS M&O 1998a).

J. Scaglione

1999-09-09

276

Building Energy Performance Analysis of an Academic Building Using IFC BIM-Based Methodology  

E-print Network

This paper discusses the potential to use an Industry Foundation Classes (IFC)/Building Information Modelling (BIM) based method to undertake Building Energy Performance analysis of an academic building. BIM/IFC based methodology provides a...

Aziz, Z.; Arayici, Y.; Shivachev, D.

2012-01-01

277

Defense In-Depth Accident Analysis Evaluation of Tritium Facility Bldgs. 232-H, 233-H, and 234-H  

SciTech Connect

The primary purpose of this report is to document a Defense-in-Depth (DID) accident analysis evaluation for Department of Energy (DOE) Savannah River Site (SRS) Tritium Facility Buildings 232-H, 233-H, and 234-H. The purpose of a DID evaluation is to provide a more realistic view of facility radiological risks to the offsite public than the bounding deterministic analysis documented in the Safety Analysis Report, which credits only Safety Class items in the offsite dose evaluation.

Blanchard, A.

1999-05-10

278

Analysis of the crush environment for lightweight air-transportable accident-resistant containers  

SciTech Connect

This report describes the longitudinal dynamic crush environment for a Lightweight Air-Transportable Accident-Resistant Container (LAARC, now called PAT-2) that can be used to transport small quantities of radioactive material. The analysis of the crush environment involves evaluation of the forces imposed upon the LAARC package during the crash of a large, heavily loaded, cargo aircraft. To perform the analysis, a cargo load column was defined which consisted of a longitudinal prism of cargo of cross-sectional area equal to the projected area of the radioactive-material package and length equal to the longitudinal extent of the cargo compartment in a commercial cargo jet aircraft. To bound the problem, two analyses of the cargo load column were performed, a static stability analysis and a dynamic analysis. The results of these analyses can be applied to other packaging designs and suggest that the physical limits or magnitude of the longitudinal crush forces, which are controlled in part by the yield strength of the cargo and the package size, are much smaller than previously estimated.

McClure, J.D.; Hartman, W.F.

1981-12-01

279

Accident management information needs  

SciTech Connect

In support of the US Nuclear Regulatory Commission (NRC) Accident Management Research Program, a methodology has been developed for identifying the plant information needs necessary for personnel involved in the management of an accident to diagnose that an accident is in progress, select and implement strategies to prevent or mitigate the accident, and monitor the effectiveness of these strategies. This report describes the methodology and presents an application of this methodology to a Pressurized Water Reactor (PWR) with a large dry containment. A risk-important severe accident sequence for a PWR is used to examine the capability of the existing measurements to supply the necessary information. The method includes an assessment of the effects of the sequence on the measurement availability including the effects of environmental conditions. The information needs and capabilities identified using this approach are also intended to form the basis for more comprehensive information needs assessment performed during the analyses and development of specific strategies for use in accident management prevention and mitigation. 3 refs., 16 figs., 7 tabs.

Hanson, D.J.; Ward, L.W.; Nelson, W.R.; Meyer, O.R. (EG and G Idaho, Inc., Idaho Falls, ID (USA))

1990-04-01

280

Quantifying Drosophila food intake: comparative analysis of current methodology  

PubMed Central

Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694

Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.

2014-01-01

281

TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions  

Microsoft Academic Search

TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes

N. J. Lombardo; T. J. Marseille; M. D. White; P. S. Lowery

1990-01-01

282

Injuries to New Zealanders participating in adventure tourism and adventure sports: an analysis of Accident Compensation Corporation (ACC) claims  

Microsoft Academic Search

Aims: The aim of this study was to examine the involvement of adventure tourism and adventure sports activity in injury claims made to the Accident Compensation Corporation (ACC). Methods: Epidemiological analysis of ACC claims for the period July 2004 to June 2005, where adventure activities were involved in the injury. Results: 18,697 adventure tourism and adventure sports injury claims were

Tim Bentley; Keith Mack; Jo Edwards

283

Case - control analysis of leukaemia among Chernobyl accident emergency workers residing in the Russian Federation, 1986 - 1993  

Microsoft Academic Search

This paper presents an analysis of data of the Russian National Medical and Dosimetric Registry on the incidence of leukaemia among 155 680 male Chernobyl accident emergency workers (EWs) who were resident in the Russian Federation (RF) for the period from 1986 to the end of 1993. The system of collection and verification of data on leukaemia is described. 48

V. K. Ivanov; A. F. Tsyb; A. P. Konogorov; E. M. Rastopchin; S. E. Khait

1997-01-01

284

An analysis to determine correlations of freeway traffic accidents with specific geometric design features  

E-print Network

with the evaluation of accident exposure indices, and it was found that the "exposure to accidents is proportional to the product of the number of cars in the two merging or diverging traffic streams." Other research is in progress. A preliminary report on a study...

Smith, Frank Miller

2012-06-07

285

Fault Tree Analysis: An Emerging Methodology for Instructional Science.  

ERIC Educational Resources Information Center

Describes Fault Tree Analysis, a tool for systems analysis which attempts to identify possible modes of failure in systems to increase the probability of success. The article defines the technique and presents the steps of FTA construction, focusing on its application to education. (RAO)

Wood, R. Kent; And Others

1979-01-01

286

Methodology for computer aided fuzzy fault tree analysis  

Microsoft Academic Search

Probabilistic risk assessment (PRA) is a comprehensive, structured and logical analysis method aimed at identifying and assessing risks of complex process systems. PRA uses fault tree analysis (FTA) as a tool to identify basic causes leading to an undesired event, to represent logical dependency of these basic causes in leading to the event, and finally to calculate the probability of
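
A minimal sketch of the fuzzy extension this abstract points to: basic-event probabilities represented as triangular fuzzy numbers and propagated through AND/OR gates. The gate functions are monotone in each argument, so evaluating them at the (low, mode, high) points gives the corresponding points of the result; event names and numbers are assumed for illustration, not taken from the cited paper.

# Minimal sketch of fuzzy fault tree arithmetic with triangular fuzzy numbers
# (low, mode, high) as basic-event probabilities.

def fuzzy_and(*events):
    """AND gate: component-wise product of triangular fuzzy probabilities."""
    result = [1.0, 1.0, 1.0]
    for e in events:
        result = [r * x for r, x in zip(result, e)]
    return tuple(result)

def fuzzy_or(*events):
    """OR gate: 1 - product of complements, component-wise."""
    comp = [1.0, 1.0, 1.0]
    for e in events:
        comp = [c * (1.0 - x) for c, x in zip(comp, e)]
    return tuple(1.0 - c for c in comp)

pump_fails  = (0.005, 0.010, 0.020)   # triangular fuzzy probability (assumed)
valve_fails = (0.010, 0.020, 0.040)
power_loss  = (0.002, 0.005, 0.010)

top = fuzzy_or(fuzzy_and(pump_fails, valve_fails), power_loss)
print("Top event (low, mode, high):", tuple(round(x, 6) for x in top))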

Refaul Ferdous; Faisal Khan; Brian Veitch; Paul R. Amyotte

2009-01-01

287

On the Application of Syntactic Methodologies in Automatic Text Analysis.  

ERIC Educational Resources Information Center

Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…

Salton, Gerard; And Others

1990-01-01

288

Methodology for Multifractal Analysis of Heart Rate Variability: From LF/HF Ratio to Wavelet Leaders  

E-print Network

An introduction to the practical use of wavelet-Leader-based multifractal analysis to study heart rate variability, with comparison to other standard characterizations of heart rate variability: (mono)fractal analysis and the Hurst exponent...

Gonçalves, Paulo

289

TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events  

SciTech Connect

The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest, a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

2013-11-10

290

Analysis of fission product revaporization in a BWR Reactor Coolant System during a station blackout accident  

SciTech Connect

This paper presents an analysis of fission product revaporization from the Reactor Coolant System (RCS) following the Reactor Pressure Vessel (RPV) failure. The station blackout accident in a BWR Mark I Power Plant was considered. The TRAPMELT3 models for vaporization, chemisorption, and the decay heating of RCS structures and gases were used and extended beyond the RPV failure in the analysis. The RCS flow models based on the density-difference or pressure-difference between the RCS and containment pedestal region were developed to estimate the RCS outflow which carries the revaporized fission product to the containment. A computer code called REVAP was developed for the analysis. The REVAP code was incorporated with the MARCH, TRAPMELT3 and NAUA codes from the Source Term Code Package (STCP) to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors in determining the magnitude of revaporization and subsequent release of the volatile fission product into the environment. 6 refs., 8 figs.

Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

1988-01-01

291

General Methodology Combining Engineering Optimization of Primary HVAC&R Plants with Decision Analysis Methods—Part I: Deterministic Analysis  

Microsoft Academic Search

This paper is the first of a two-part sequence that proposes a general methodology for dynamic scheduling and optimal control of complex primary HVAC&R plants, which combines engineering analyses within a practical decision analysis framework by modeling risk attitudes of the operator. The methodology involves a computationally efficient, deterministic engineering optimization phase for scheduling and controlling primary systems over the

Wei Jiang; T. Agami Reddy

2007-01-01

292

Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design  

NASA Technical Reports Server (NTRS)

A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.

Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

2004-01-01

293

A Novel Methodology for Thermal Analysis & 3-Dimensional Memory Integration  

E-print Network

The semiconductor industry is reaching a fascinating confluence in several evolutionary trends that will likely lead to a number of revolutionary changes in the design, implementation, scaling, and the use of computer systems. However, recently Moore's law has come to a stand-still since device scaling beyond 65 nm is not practical. 2D integration has problems like memory latency, power dissipation, and large foot-print. 3D technology comes as a solution to the problems posed by 2D integration. The utilization of 3D is limited by the problem of temperature crisis. It is important to develop an accurate power profile extraction methodology to design 3D structure. In this paper, design of 3D integration of memory is considered and hence the static power dissipation of the memory cell is analysed in transistor level and is used to accurately model the inter-layer thermal effects for 3D memory stack. Subsequently, packaging of the chip is considered and modelled using an architecture level simulator. This modelli...

Cherian, Annmol; Jose, Jemy; Pangracious, Vinod; 10.5121/ijait.2011.1403

2011-01-01

294

A UNIFIED METHODOLOGY FOR SEISMIC WAVEFORM ANALYSIS AND INVERSION  

E-print Network

Other data analysis and inversion tools were developed using Python. Section 2.4.1: Effects of Windowing and Narrow-band Filtering on Target Waveform...

Chen, Po

295

Progressive Failure Analysis Methodology for Laminated Composite Structures  

NASA Technical Reports Server (NTRS)

A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important which may cause failure mechanisms such as debonding or delaminations.
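
As a hedged illustration of one ingredient named above (a ply-level failure criterion inside a progressive failure analysis), the sketch below evaluates simplified 2D Hashin-type indices for an assumed ply stress state; the exact criteria and degradation rules used in the cited method may differ, and all stresses and strengths are illustrative.

# Minimal sketch of a ply-level failure check of the kind used inside a
# progressive failure analysis: evaluate simplified 2D Hashin-type indices
# and flag which properties would be degraded.

def hashin_2d(s11, s22, t12, XT, XC, YT, YC, S12):
    """Return failure indices (>= 1.0 means predicted failure)."""
    if s11 >= 0.0:
        fiber = (s11 / XT) ** 2 + (t12 / S12) ** 2           # fiber tension
    else:
        fiber = (s11 / XC) ** 2                               # fiber compression
    if s22 >= 0.0:
        matrix = (s22 / YT) ** 2 + (t12 / S12) ** 2           # matrix tension
    else:
        matrix = (s22 / YC) ** 2 + (t12 / S12) ** 2           # matrix compression (simplified)
    return {"fiber": fiber, "matrix": matrix}

# Illustrative ply stresses (MPa) and strengths (MPa) for a carbon/epoxy ply.
indices = hashin_2d(s11=1450.0, s22=35.0, t12=60.0,
                    XT=1500.0, XC=1200.0, YT=40.0, YC=246.0, S12=68.0)

for mode, value in indices.items():
    status = "FAIL -> degrade stiffness" if value >= 1.0 else "ok"
    print(f"{mode} index = {value:.2f} ({status})")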

Sleight, David W.

1999-01-01

296

Methodology for statistical analysis of SENCAR mouse skin assay data.  

PubMed Central

Various response measures and statistical methods appropriate for the analysis of data collected in the SENCAR mouse skin assay are examined. The characteristics of the tumor response data do not readily lend themselves to the classical methods for hypothesis testing. The advantages and limitations of conventional methods of analysis and methods recommended in the literature are discussed. Several alternative response measures that were developed specifically to answer the problems inherent in the data collected in the SENCAR bioassay system are described. These measures take into account animal survival, tumor multiplicity, and tumor regression. Statistical methods for the analysis of these measures to test for a positive dose response and a dose-response relationship are discussed. Sample data from representative initiation/promotion studies are used to compare the response measures and methods of analysis. PMID:3780632

Stober, J A

1986-01-01

297

Vehicle crash accident reconstruction based on the analysis 3D deformation of the auto-body  

Microsoft Academic Search

The objective of vehicle crash accident reconstruction is to investigate the pre-impact velocity. Elastic–plastic deformation of the vehicle and the collision objects is important information produced during vehicle crash accidents, and this information can be fully utilized with the finite element method (FEM), which has been widely used as a simulation tool for crashworthiness analyses and structural optimization design.

Xiao-yun Zhang; Xian-long Jin; Wen-guo Qi; Yi-zhi Guo

2008-01-01

298

Methodologies and techniques for analysis of network flow data  

SciTech Connect

Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

Bobyshev, A.; Grigoriev, M.; /Fermilab

2004-12-01

299

Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications  

PubMed Central

Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. The concentrations of volatile organic compounds (VOCs) in breath are assessed with acceptable accuracy using analytical techniques with high sensitivity, accuracy, precision, low response time, and low detection limit, which are desirable characteristics for the detection of VOCs in human breath. “Breath fingerprinting”, indicative of a specific clinical status, relies on the use of multivariate statistics methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037

Lourenco, Celia; Turner, Claire

2014-01-01

300

The role of mitochondrial proteomic analysis in radiological accidents and terrorism.  

PubMed

In the wake of the 9/11 terrorist attacks and the recent Level 7 nuclear event at the Fukushima Daiichi plant, there has been heightened awareness of the possibility of radiological terrorism and accidents and the need for techniques to estimate radiation levels after such events. A number of approaches to monitoring radiation using biological markers have been published, including physical techniques, cytogenetic approaches, and direct, DNA-analysis approaches. Each approach has the potential to provide information that may be applied to the triage of an exposed population, but problems with development and application of devices or lengthy analyses limit their potential for widespread application. We present a post-irradiation observation with the potential for development into a rapid point-of-care device. Using simple mitochondrial proteomic analysis, we investigated irradiated and nonirradiated murine mitochondria and identified a protein mobility shift occurring at 2-3 Gy. We discuss the implications of this finding both in terms of possible mechanisms and potential applications in bio-radiation monitoring. PMID:22879026

Maguire, David; Zhang, Bingrong; Zhang, Amy; Zhang, Lurong; Okunieff, Paul

2013-01-01

301

Methodology of Vector Calculus in Regional Development Analysis  

Microsoft Academic Search

In order to examine many phenomena that are subject to economic analysis, one can use methods based on different kinds of spaces. Figure 3.1 shows the best known kinds of spaces, and the ones frequently used in economic analyses are given in bold.

inž Kesra Nerm

302

Acid rain research: a review and analysis of methodology  

Microsoft Academic Search

The acidic deposition phenomenon, when implicated as a factor potentially responsible for crop and forest yield losses and destruction of aquatic life, has gained increasing attention. The widespread fear that acid rain is having or may have devastating effects has prompted international debates and legislative proposals. An analysis of research on the effects of acid rain, however, reveals serious questions

1983-01-01

303

Finite Mixture Partial Least Squares Analysis: Methodology and Numerical Examples  

Microsoft Academic Search

In a wide range of applications for empirical data analysis, the assumption that data is collected from a single homogeneous population is often unrealistic. In particular, the identification of different groups of consumers and their appropriate consideration in partial least squares (PLS) path modeling constitutes a critical issue in marketing. In this work, we introduce a finite mixture PLS software implementation

Christian M. Ringle; Sven Wende; Alexander Will

2010-01-01

304

Vulnerability analysis of interdependent infrastructure systems: A methodological framework  

NASA Astrophysics Data System (ADS)

Infrastructure systems such as power and water supplies make up the cornerstone of modern society which is essential for the functioning of a society and its economy. They become more and more interconnected and interdependent with the development of scientific technology and social economy. Risk and vulnerability analysis of interdependent infrastructures for security considerations has become an important subject, and some achievements have been made in this area. Since different infrastructure systems have different structural and functional properties, there is no universal all-encompassing ‘silver bullet solution’ to the problem of analyzing the vulnerability associated with interdependent infrastructure systems. So a framework of analysis is required. This paper takes the power and water systems of a major city in China as an example and develops a framework for the analysis of the vulnerability of interdependent infrastructure systems. Four interface design strategies based on distance, betweenness, degree, and clustering coefficient are constructed. Then two types of vulnerability (long-term vulnerability and focused vulnerability) are illustrated and analyzed. Finally, a method for ranking critical components in interdependent infrastructures is given for protection purposes. It is concluded that the framework proposed here is useful for vulnerability analysis of interdependent systems and it will be helpful for the system owners to make better decisions on infrastructure design and protection.
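
A minimal sketch, under assumed data, of the topological side of such a framework: rank components of a toy coupled power-water network by degree, betweenness, and clustering using networkx. The network, interdependency edges, and composite score are illustrative only, not the paper's case-study data or ranking method.

# Minimal sketch of ranking components in an interdependent power-water
# network by simple topological metrics, as one input to a vulnerability
# analysis.  The toy network below is illustrative only.

import networkx as nx

G = nx.Graph()
# Power nodes (P*), water nodes (W*), with interdependency edges between them.
G.add_edges_from([
    ("P1", "P2"), ("P2", "P3"), ("P3", "P4"), ("P1", "P3"),
    ("W1", "W2"), ("W2", "W3"), ("W3", "W4"),
    ("P2", "W2"),  # pumping station W2 depends on substation P2 (assumed)
    ("P4", "W3"),  # treatment plant W3 depends on substation P4 (assumed)
])

degree = nx.degree_centrality(G)
betweenness = nx.betweenness_centrality(G)
clustering = nx.clustering(G)

# A crude composite score; a real framework would weight and validate these.
score = {n: degree[n] + betweenness[n] - clustering[n] for n in G.nodes}
for node in sorted(score, key=score.get, reverse=True):
    print(f"{node}: degree={degree[node]:.2f} "
          f"betweenness={betweenness[node]:.2f} clustering={clustering[node]:.2f} "
          f"score={score[node]:.2f}")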

Wang, Shuliang; Hong, Liu; Chen, Xueguang

2012-06-01

305

Analysis of self excited induction generator using MATLAB GUI methodology  

Microsoft Academic Search

The steady state analysis of self excited induction generator (SEIG) is vital for proper implementation of induction machine operation as a generator in a stand alone mode through appropriate modeling. This paper presents a user friendly software based solution for complete evaluation of steady-state behavior of SEIG under different operating conditions. The mathematical modeling of the machine is carried out

S. S. Murthy; G. Bhuvaneswari; R. K. Ahuja; S. Gao

2010-01-01

306

Social Judgment Analysis: Methodology for Improving Interpersonal Communication and Understanding.  

ERIC Educational Resources Information Center

Research has found the Social Judgment Analysis (SJA) approach, with its focus on judgment policy and cognitive feedback, to be a significant factor in developing group member agreement and improving member performance. A controlled experiment was designed to assess the relative quality of the judgment making process provided by SJA.…

Rohrbaugh, John; Harmon, Joel

307

Assessment of ISLOCA risk-methodology and application to a combustion engineering plant  

SciTech Connect

Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Combustion Engineering plant.

Kelly, D.L.; Auflick, J.L.; Haney, L.N. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

1992-04-01

308

Transient analysis for thermal margin with COASISO during a severe accident  

SciTech Connect

As an IVR-EVC (in-vessel retention through external vessel cooling) design concept, external cooling of the reactor vessel was suggested to protect the lower head from being overheated due to relocated material from the core during a severe accident. The COASISO (Corium Attack Syndrome Immunization Structure Outside the vessel) adopts an external vessel cooling strategy of flooding the reactor vessel inside the thermal insulator. Its advantage is the quick response time, so that the initial heat removal mechanism of the EVC is nucleate boiling from the downward-facing lower head. The efficiency of the COASISO may be estimated by the thermal margin, defined as the ratio of the actual heat flux from the reactor vessel to the critical heat flux (CHF). In this study the thermal margin for a large power reactor such as the APR1400 (Advanced Power Reactor 1400 MWe) was determined by means of transient analysis for the local condition of the coolant and temperature distributions within the reactor vessel. The heat split fraction in the oxide pool and the metal layer focusing effect were considered during calculation of the angular thermal load at the inner wall of the lower head. The temperature distributions in the reactor vessel resulted in the actual heat flux on the outer wall. The local quality was obtained by solving the simplified transient energy equation. The unheated section of the reactor vessel decreases the thermal margin by means of two-dimensional conduction heat transfer. The peak temperature of the reactor vessel was estimated in the film boiling region as the thermal margin was equal to unity. Sensitivity analyses were performed for the time of corium relocation after the reactor trip, the coolant flow rate, and the initial subcooled condition of the coolant. No vessel failure is predicted at the worst EVC condition when stratification between the metal layer and the oxidic pool is not taken into account. The present predictive tool may be implemented in a severe accident analysis code like MAAP4 for external vessel cooling with the COASISO. (authors)

Kim, Chan S.; Chu, Ho S.; Suh, Kune Y.; Park, Goon C.; Lee, Un C. [Seoul National University, San 56-1, Sillim-Dong, Kwanak-Gu, Seoul, 151-742 (Korea, Republic of); Yoon, Ho J. [Purdue University, West Lafayette, IN 47907 (United States)

2002-07-01

309

The Role of Materials Degradation and Analysis in the Space Shuttle Columbia Accident Investigation  

NASA Technical Reports Server (NTRS)

The efforts following the loss of the Space Shuttle Columbia included debris recovery, reconstruction, and analysis. The debris was subjected to myriad quantitative and semiquantitative chemical analysis techniques, ranging from examination via the scanning electron microscope (SEM) with energy dispersive spectrometer (EDS) to X-Ray diffraction (XRD) and electron probe micro-analysis (EPMA). The results from the work with the debris helped the investigators determine the location where a breach likely occurred in the leading edge of the left wing during lift off of the Orbiter from the Kennedy Space Center. Likewise, the information evidenced by the debris was also crucial in ascertaining the path of impinging plasma flow once it had breached the wing. After the Columbia Accident Investigation Board (CAIB) issued its findings, the major portion of the investigation was concluded. However, additional work remained to be done on many pieces of debris from portions of the Orbiter which were not directly related to the initial impact during ascent. This subsequent work was not only performed in the laboratory, but was also performed with portable equipment, including examination via portable X-Ray fluorescence (XRF) and Fourier transform infrared spectroscopy (FTIR). Likewise, acetate and silicon-rubber replicas of various fracture surfaces were obtained for later macroscopic and fractographic examination. This paper will detail the efforts and findings from the initial investigation, as well as present results obtained by the later examination and analysis of debris from the Orbiter including its windows, bulkhead structures, and other components which had not been examined during the primary investigation.

McDanels, Steven J.

2006-01-01

310

Improved finite element methodology for integrated thermal structural analysis  

NASA Technical Reports Server (NTRS)

An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.
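
For orientation, the sketch below solves the baseline problem mentioned above (1D linear steady-state heat conduction with conventional two-node linear elements), for which nodal temperatures are already exact; it is not the paper's nodeless-variable formulation, and the geometry and material properties are assumed.

# Minimal sketch: conventional two-node linear finite element solution of 1D
# steady-state conduction in a bar with fixed end temperatures.

import numpy as np

k, A, L, n_elem = 50.0, 1e-4, 1.0, 4      # W/m-K, m^2, m, number of elements
T_left, T_right = 400.0, 300.0            # prescribed end temperatures (K)

n_nodes = n_elem + 1
le = L / n_elem
K = np.zeros((n_nodes, n_nodes))
ke = (k * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])

# Assemble the global conductance matrix from identical element matrices.
for e in range(n_elem):
    K[e:e + 2, e:e + 2] += ke

F = np.zeros(n_nodes)

# Impose Dirichlet boundary conditions by eliminating the end nodes.
free = list(range(1, n_nodes - 1))
F_free = F[free] - K[np.ix_(free, [0])].flatten() * T_left \
                 - K[np.ix_(free, [n_nodes - 1])].flatten() * T_right
T_free = np.linalg.solve(K[np.ix_(free, free)], F_free)

T = np.concatenate(([T_left], T_free, [T_right]))
print("Nodal temperatures (K):", np.round(T, 2))   # linear profile, exact at nodes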

Dechaumphai, P.; Thornton, E. A.

1982-01-01

311

Statistical theory and methodology for remote sensing data analysis  

NASA Technical Reports Server (NTRS)

A model is developed for the evaluation of acreages (proportions) of different crop types over a geographical area using a classification approach, and methods for estimating the crop acreages are given. In estimating the acreage of a specific crop type such as wheat, it is suggested to treat the problem as a two-crop problem: wheat vs. nonwheat, since this simplifies the estimation problem considerably. The error analysis and the sample size problem are investigated for the two-crop approach. Certain numerical results for sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large-area crop acreage inventory, a sampling scheme is suggested for acquiring sample data, and the problem of crop acreage estimation and the error analysis is discussed.

Odell, P. L.

1974-01-01

312

Methodology of a Cladistic Analysis: How to Construct Cladograms  

NSDL National Science Digital Library

This outline details the six steps necessary for completing a cladistic analysis. The final step is to build the cladogram, following the rules that: all taxa go on the endpoints of the cladogram (never at the nodes), all cladogram nodes must have a list of synapomorphies which are common to all taxa above the nodes, and all synapomorphies appear on the cladogram only once unless the characteristic was derived separately by evolutionary parallelism. The site then explains how to test your cladogram.

2007-05-16

313

The application of a cognitive mapping and user analysis methodology to neighborhood park service area  

E-print Network

The intent of this study is to examine the cognitive mapping and user analysis methodology in establishing the service area of a neighborhood park. The methodology specifically addresses two problems. The first is defining...

Mutter, Lawrence Reed

2012-06-07

314

Methodological analysis of finite helical axis behavior in cervical kinematics.  

PubMed

Although a far more stable approach compared to the six degrees of freedom analysis, the finite helical axis (FHA) struggles with interpretational difficulties among health professionals. The analysis of the 3D-motion axis has been used in clinical studies, but mostly limited to qualitative analysis. The aim of this study is to introduce a novel approach for the quantification of FHA behavior and to investigate the effect of noise and angle intervals on the estimation of FHA parameters. A simulation of body movement was performed by introducing Gaussian noise on the position and orientation of a virtual sensor, showing a linear relation between the simulated noise and the error in the corresponding parameter. FHA behavior was determined by calculating the intersection points of the FHA with a number of planes perpendicular to the FHA using the Convex Hull (CH) technique. The angle between the FHA and each of the IHAs was also computed and its distribution analyzed. Input noise has an inversely proportional relationship with the angle steps of FHA estimation. The proposed FHA quantification approach can provide new approaches for researchers and improve insight for clinicians seeking to better understand joint kinematics. PMID:24916306
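
A minimal sketch of the dispersion measure described above: given 2D intersection points of successive helical axes with a plane perpendicular to the mean axis, quantify their spread by the area of their convex hull. The point cloud below is synthetic, not cervical-spine data.

# Minimal sketch: convex-hull area of FHA/plane intersection points as a
# dispersion measure.  Points are synthetic.

import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)
points = rng.normal(loc=0.0, scale=2.0, size=(50, 2))   # mm, synthetic

hull = ConvexHull(points)
# Note: for 2D input, ConvexHull.volume is the enclosed area and
# ConvexHull.area is the perimeter.
print(f"Convex-hull area of FHA intersections: {hull.volume:.1f} mm^2")
print(f"Convex-hull perimeter:                 {hull.area:.1f} mm")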

Cescon, Corrado; Cattrysse, Erik; Barbero, Marco

2014-10-01

315

Application of 3D documentation and geometric reconstruction methods in traffic accident analysis: With high resolution surface scanning, radiological MSCT/MRI scanning and real data based animation  

Microsoft Academic Search

The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the situation of the impact. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides the

Ursula Buck; Silvio Naether; Marcel Braun; Stephan Bolliger; Hans Friederich; Christian Jackowski; Emin Aghayev; Andreas Christe; Peter Vock; Richard Dirnhofer; Michael J. Thali

2007-01-01

316

Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis  

NASA Technical Reports Server (NTRS)

This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

Babcock, P.; Schor, A.; Rosch, G.

1998-01-01

317

An accident analysis of the physical plant of the Agricultural and Mechanical College of Texas  

E-print Network

CHAPTER I: THE PROBLEM AND DEFINITIONS OF TERMS USED. The first objective of an accident... Recommended Practice for Compiling Industrial Accident Causes (New York: American Standards Association), Z16.2: 1. Nature of injury, 2. Location of injury, 3. Day of week, 4. Time of day, 5. Occupation of the injured, 6. Part of the Physical Plant...

Allen, Gary James

2012-06-07

318

Analysis of anticipated transients without scram in severe BWR (Boiling Water Reactor) accidents: Final report  

SciTech Connect

The recent interest in severe accidents in nuclear power plants has changed the focus from design basis events to the consequences of events beyond the design basis. In the General Electric BWRs, introduction of neutron absorbers into the reactor terminates the progression of ATWS events to severe accidents; however, if all neutron absorber systems were assumed to fail, an ATWS could progress to a severe accident. A limiting ATWS scenario has been analyzed which shows that extended periods of time for operator recovery actions are available prior to postulated severe core degradation or containment failure.

Anderson, J.G.M.; Claassen, L.B.; Dua, S.S.; Garrett, J.K.

1987-12-01

319

Routes to failure: analysis of 41 civil aviation accidents from the Republic of China using the human factors analysis and classification system.  

PubMed

The human factors analysis and classification system (HFACS) is based upon Reason's organizational model of human error. HFACS was developed as an analytical framework for the investigation of the role of human error in aviation accidents; however, there is little empirical work formally describing the relationship between the components in the model. This research analyses 41 civil aviation accidents occurring to aircraft registered in the Republic of China (ROC) between 1999 and 2006 using the HFACS framework. The results show statistically significant relationships between errors at the operational level and organizational inadequacies at both the immediately adjacent level (preconditions for unsafe acts) and higher levels in the organization (unsafe supervision and organizational influences). The pattern of the 'routes to failure' observed in the data from this analysis of civil aircraft accidents shows great similarities to that observed in the analysis of military accidents. This research lends further support to Reason's model that suggests that active failures are promoted by latent conditions in the organization. Statistical relationships linking fallible decisions in upper management levels were found to directly affect supervisory practices, thereby creating the psychological preconditions for unsafe acts and hence indirectly impairing the performance of pilots, ultimately leading to accidents. PMID:18329391
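
As a hedged illustration of the kind of level-to-level association test used with HFACS-coded accidents, the sketch below runs a chi-square test on a 2x2 presence/absence table; the counts are invented, not the ROC accident data, and the published analysis may have used different tables or corrections.

# Minimal sketch: chi-square test of association between two adjacent HFACS
# levels across a set of coded accidents.  Counts are invented placeholders.

from scipy.stats import chi2_contingency

#                precondition present, precondition absent
table = [[18, 4],    # unsafe supervision present
         [7, 12]]    # unsafe supervision absent

chi2, p, dof, expected = chi2_contingency(table)
odds_ratio = (table[0][0] * table[1][1]) / (table[0][1] * table[1][0])

print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
print(f"odds ratio = {odds_ratio:.2f}")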

Li, Wen-Chin; Harris, Don; Yu, Chung-San

2008-03-01

320

Astrolabe: A Collaborative Multi-Perspective Goal-Oriented Risk Analysis Methodology  

Microsoft Academic Search

The intention of this paper is to introduce a risk analysis methodology called Astrolabe. Astrolabe is based on causal analysis of system risks. It allows analysts to both align the current standpoint of the system with its intentions and identify any vulnerabilities or hazards that threaten the system's stability. Astrolabe adopts concepts from organizational theory and software requirements engineering.

Ebrahim Bagheri; Ali A. Ghorbani

321

Sensitivity Analysis for Biometric Systems: A Methodology Based on Orthogonal Experiment Designs.  

National Technical Information Service (NTIS)

The purpose of this paper is to introduce an effective and structured methodology for carrying out a biometric system sensitivity analysis. The goal of sensitivity analysis is to provide the researcher/developer with the insight and understanding of the k...

J. J. Filliben, P. J. Phillips, R. J. Micheals, Y. Lee

2012-01-01

322

Methodology, Metrics and Measures for Testing and Evaluation of Intelligence Analysis Tools  

E-print Network

PNWD-3550. 1. Introduction: The intelligence community, stakeholders, and the research community have been seeking technology-based solutions to reduce the analyst...

323

An analysis of USSPACECOM's space surveillance network sensor tasking methodology  

NASA Astrophysics Data System (ADS)

This study provides the basis for the development of a cost/benefit assessment model to determine the effects of alterations to the Space Surveillance Network (SSN) on orbital element (OE) set accuracy. It provides a review of current methods used by NORAD and the SSN to gather and process observations, an alternative to the current Gabbard classification method, and the development of a model to determine the effects of observation rate and correction interval on OE set accuracy. The proposed classification scheme is based on satellite J2 perturbations. Specifically, classes were established based on mean motion, eccentricity, and inclination since J2 perturbation effects are functions of only these elements. Model development began by creating representative sensor observations using a highly accurate orbital propagation model. These observations were compared to predicted observations generated using the NORAD Simplified General Perturbation (SGP4) model and differentially corrected using a Bayes, sequential estimation, algorithm. A 10-run Monte Carlo analysis was performed using this model on 12 satellites using 16 different observation rate/correction interval combinations. An ANOVA and confidence interval analysis of the results show that this model does demonstrate the differences in steady state position error based on varying observation rate and correction interval.

Berger, Jeff M.; Moles, Joseph B.; Wilsey, David G.

1992-12-01

324

Integrated modeling and analysis methodology for precision pointing applications  

NASA Astrophysics Data System (ADS)

Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

Gutierrez, Homero L.

2002-07-01

325

Improved finite element methodology for integrated thermal structural analysis  

NASA Technical Reports Server (NTRS)

An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analyses is presented. New thermal finite elements which yield exact nodal and element temperature for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal-structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

Dechaumphai, P.; Thornton, E. A.

1982-01-01

326

A faster reactor transient analysis methodology for PCs  

SciTech Connect

The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the "quadratic dynamics equation." This model forms the basis for the GW-BASIC program LTC (LMR Transient Calculation), which can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report.
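
For context, the sketch below integrates one-delayed-group point kinetics under the prompt jump approximation (the amplitude equation replaced by an algebraic relation). It illustrates the approximation the report builds on rather than the LTC integral formulation itself, and the kinetics data are generic placeholders.

# Minimal sketch of point kinetics under the prompt jump approximation with a
# single delayed-neutron group: n is obtained algebraically from
# n = Lambda*lam*c/(beta - rho), and only the precursor concentration c is
# integrated in time.  Data below are generic placeholders.

beta, lam, Lambda = 0.0035, 0.08, 4.0e-7   # delayed fraction, decay const (1/s), generation time (s)
rho = -0.5 * beta                          # step reactivity insertion of -0.5 $ (assumed)

dt, t_end = 1.0e-3, 20.0
c = beta / (Lambda * lam)                  # steady-state precursor level for n0 = 1
t, history = 0.0, []

while t <= t_end:
    n = Lambda * lam * c / (beta - rho)    # prompt jump: dn/dt ~ 0
    c += dt * (beta / Lambda * n - lam * c)
    history.append((t, n))
    t += dt

for ti, ni in history[:: len(history) // 5]:
    print(f"t = {ti:6.2f} s   relative power = {ni:.4f}")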

Ott, K.O. (Purdue Univ., Lafayette, IN (United States). School of Nuclear Engineering)

1991-10-01

327

[The Polish helicopter crash in Belarus--an analysis of the accident].  

PubMed

The subject of analysis was the crash of a helicopter of the Polish Border Guards, which happened on October 31, 2009, in the Byelorussian territory about two hundred meters from the Polish border. In the accident, three crew members perished: the pilot, navigator and operator. Based on the accounts obtained directly after the crash on the site of the tragedy, it was established that the pilot tried to land, but the impact was so strong that the aircraft sank about one meter into the ground. On November 3, 2009, a committee consisting of two prosecutors from the County Prosecutor Office in Bialystok, a forensic science expert and a representative from the Border Guards, went to the Department of Forensic Medicine in Brzesc. The prosecutors and forensic science expert took part in recovering the bodies. During the process of internal and external examination, severe body injuries were noted, without any surviving tissue and intestines. Samples of blood, urine and fragments of internal organs were collected for chemical, biochemical, toxicological and histopathological examinations. Muscle DNA was also taken. PMID:21520535

Ptaszyńska-Sarosiek, Iwona; Niemcunowicz-Janica, Anna; Okłota, Magdalena; Wardaszka, Zofia; Szeremeta, Michał; Filimoniuk, Marcin; Janica, Jerzy

2010-01-01

328

Radiological health effects models for nuclear power plants accident consequence analysis; An update (1990)  

Microsoft Academic Search

The U.S. Nuclear Regulatory Commission revised the health effects models that provide the basis for assessing health risk associated with nuclear power plant accidents. In this paper the revised health effects models are briefly summarized.

1991-01-01

329

Behavior of an heterogeneous annular FBR core during an unprotected loss of flow accident: Analysis of the primary phase with SAS-SFR  

SciTech Connect

In the framework of a substantial improvement on FBR core safety connected to the development of a new Gen IV reactor type, heterogeneous cores with innovative features have been carefully analyzed in France since 2009. At EDF R and D, the main goal is to understand whether a strong reduction of the Na-void worth - possibly attaining a negative value - allows a significant improvement of the core behavior during an unprotected loss of flow accident. Also, the physical behavior of such a core is of interest, before and beyond the (possible) onset of Na boiling. Hence, a cutting-edge heterogeneous design, featuring an annular shape, Na plena with a B{sub 4}C plate and a stepwise modulation of fissile core heights, was developed at EDF by means of the SDDS methodology, with a total Na-void worth of -1 $. The behavior of such a core during the primary phase of a severe accident, initiated by an unprotected loss of flow, is analyzed by means of the SAS-SFR code. This study is carried out at KIT and EDF, in the framework of a scientific collaboration on innovative FBR severe accident analyses. The results show that the reduction of the Na-void worth is very effective, but is not sufficient alone to avoid Na-boiling and, hence, to prevent the core from entering into the primary phase of a severe accident. Nevertheless, the grace time up to boiling onset is greatly enhanced in comparison to a more traditional homogeneous core design, and only an extremely low fraction of the fuel (<0.1%) enters into melting at the end of this phase. A sensitivity analysis shows that, due to the inherent neutronic characteristics of such a core, the gagging scheme plays a major role in the core behavior: indeed, an improved 4-zone gagging scheme, associated with an enhanced control rod drive line expansion feed-back effect, finally prevents the core from entering into sodium boiling. This major conclusion highlights both the progress already accomplished and the need for more detailed future analyses, particularly concerning: the neutronic burn-up scheme, the modeling of the diagrid effect and the control rod drive line expansion feed-backs, as well as the primary/secondary systems thermal-hydraulics behavior. (authors)

Massara, S.; Schmitt, D.; Bretault, A.; Lemasson, D.; Darmet, G.; Verwaerde, D. [EDF R and D, 1, Avenue du General de Gaulle, 92141 Clamart (France); Struwe, D.; Pfrang, W.; Ponomarev, A. [Karlsruher Institut fuer Technologie KIT, Institut fuer Neutronenphysik und Reaktortechnik INR, Hermann-von-Helmholtz-Platz 1, Gebaude 521, 76344 Eggenstein-Leopoldshafen (Germany)

2012-07-01

330

Analysis of the Chernobyl accident from 1:19:00 to the first power excursion  

Microsoft Academic Search

Many researchers have reported that the root cause of the Chernobyl accident has still not been clarified. Since many of them discussed the accident without a precise thermal-hydraulic investigation, thermal-hydraulic calculations coupled with neutronic calculations have been performed on the basis of the recorded results at Chernobyl Unit-4. Plant configurations and operational conditions were given to the code

Hiroyasu Mochizuki

2007-01-01

331

Posttraumatic Stress Disorder: Diagnostic Data Analysis by Data Mining Methodology  

PubMed Central

Aim: To use data mining methods in assessing diagnostic symptoms in posttraumatic stress disorder (PTSD). Methods: The study included 102 inpatients: 51 with a diagnosis of PTSD and 51 with psychiatric diagnoses other than PTSD. Several models for predicting diagnosis were built using the random forest classifier, one of the intelligent data analysis methods. The first prediction model was based on a structured psychiatric interview, the second on psychiatric scales (Clinician-administered PTSD Scale – CAPS, Positive and Negative Syndrome Scale – PANSS, Hamilton Anxiety Scale – HAMA, and Hamilton Depression Scale – HAMD), and the third on combined data from both sources. Additional models placing more weight on one of the classes (PTSD or non-PTSD) were trained, and prototypes representing subgroups in the classes constructed. Results: The first model was the most relevant for distinguishing PTSD diagnosis from comorbid diagnoses such as neurotic, stress-related, and somatoform disorders. The second model pointed out the scores obtained on the Clinician-administered PTSD Scale (CAPS) and additional Positive and Negative Syndrome Scale (PANSS) scales, together with comorbid diagnoses of neurotic, stress-related, and somatoform disorders as most relevant. In the third model, psychiatric scales and the same group of comorbid diagnoses were found to be most relevant. Specialized models placing more weight on either the PTSD or non-PTSD class were able to better predict their targeted diagnoses at some expense of overall accuracy. Class subgroup prototypes mainly differed in values achieved on psychiatric scales and frequency of comorbid diagnoses. Conclusion: Our work demonstrated the applicability of data mining methods for the analysis of structured psychiatric data for PTSD. In all models, the group of comorbid diagnoses, including neurotic, stress-related, and somatoform disorders, surfaced as important. The important attributes of the data, based on the structured psychiatric interview, were the current symptoms and conditions such as presence and degree of disability, hospitalizations, and duration of military service during the war, while CAPS total scores, symptoms of increased arousal, and PANSS additional criteria scores were indicated as relevant from the psychiatric symptom scales. PMID:17436383
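
A minimal sketch of the modeling step described above: fit a random forest to scale scores and inspect feature importances. The data are synthetic stand-ins for CAPS/PANSS/HAMA/HAMD scores, not the study's patients, and the study's actual feature set was richer.

# Minimal sketch: random forest classification of PTSD vs non-PTSD from
# synthetic scale scores, with feature importances.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 102
features = ["CAPS_total", "PANSS_total", "HAMA", "HAMD"]

y = np.array([1] * (n // 2) + [0] * (n - n // 2))            # 1 = PTSD
X = np.column_stack([
    rng.normal(70, 15, n) + 20 * y,   # CAPS assumed higher in the PTSD group
    rng.normal(55, 10, n) + 5 * y,
    rng.normal(18, 6, n) + 3 * y,
    rng.normal(16, 6, n) + 2 * y,
])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

print(f"Hold-out accuracy: {model.score(X_test, y_test):.2f}")
for name, importance in zip(features, model.feature_importances_):
    print(f"{name}: importance = {importance:.2f}")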

Marinic, Igor; Supek, Fran; Kovacic, Zrnka; Rukavina, Lea; Jendricko, Tihana; Kozaric-Kovacic, Dragica

2007-01-01

332

Methodological and computational considerations for multiple correlation analysis.  

PubMed

The squared multiple correlation coefficient has been widely employed to assess the goodness-of-fit of linear regression models in many applications. Although there are numerous published sources that present inferential issues and computing algorithms for multinormal correlation models, the statistical procedure for testing substantive significance by specifying the nonzero-effect null hypothesis has received little attention. This article emphasizes the importance of determining whether the squared multiple correlation coefficient is small or large in comparison with some prescribed standard and develops corresponding Excel worksheets that facilitate the implementation of various aspects of the suggested significance tests. In view of the extensive accessibility of Microsoft Excel software and the ultimate convenience of general-purpose statistical packages, the associated computer routines for interval estimation, power calculation, and sample-size determination are also provided for completeness. The statistical methods and available programs of multiple correlation analysis described in this article purport to enhance pedagogical presentation in academic curricula and practical application in psychological research. PMID:18183885
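
To make the quantity being tested concrete, here is a minimal sketch of assessing whether a squared multiple correlation exceeds a prescribed nonzero standard. For brevity it uses a bootstrap interval on synthetic data rather than the noncentral-F procedures and Excel worksheets the article develops.

```python
# Minimal sketch on synthetic data: a bootstrap interval for R^2, used here as
# a simple stand-in for the noncentral-F procedures developed in the article.
import numpy as np

rng = np.random.default_rng(0)
n, p = 120, 3
X = rng.normal(size=(n, p))
y = X @ np.array([0.5, 0.3, 0.0]) + rng.normal(size=n)

def r_squared(X, y):
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid.var() / y.var()

boot = [r_squared(X[idx], y[idx])
        for idx in (rng.integers(0, n, n) for _ in range(2000))]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"R^2 = {r_squared(X, y):.3f}, bootstrap 95% CI ({lo:.3f}, {hi:.3f})")
print("exceeds a prescribed standard of 0.10?", lo > 0.10)
```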

Shieh, Gwowen; Kung, Chen-Feng

2007-11-01

333

An object-oriented approach to risk and reliability analysis: methodology and aviation safety applications.  

SciTech Connect

This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
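
The combination of probabilistic branching with an object model can be pictured with a minimal sketch, not the OBEST code itself: a Monte Carlo loop repeatedly walks an object's branch points, so every generated scenario arrives with an empirical likelihood. The events and probabilities below are invented for illustration.

```python
# Minimal sketch, not the OBEST implementation: probabilistic branching at
# each decision point of an invented runway-incursion object model, with
# scenario likelihoods estimated by Monte Carlo sampling.
import random
from collections import Counter

def aircraft_scenario(rng):
    events = []
    if rng.random() < 0.02:                  # crew misreads taxi clearance
        events.append("clearance_misread")
        if rng.random() < 0.30:              # controller fails to catch the error
            events.append("controller_miss")
            if rng.random() < 0.10:          # conflicting traffic on the runway
                events.append("runway_incursion")
    return tuple(events) or ("nominal",)

N = 200_000
rng = random.Random(42)
tally = Counter(aircraft_scenario(rng) for _ in range(N))
for scenario, count in tally.most_common():
    print(scenario, count / N)
```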

Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

2003-09-01

334

Nasal continuous positive airway pressure (nCPAP) treatment for obstructive sleep apnea, road traffic accidents and driving simulator performance: a meta-analysis.  

PubMed

We used meta-analysis to synthesize current evidence regarding the effect of nasal continuous positive airway pressure (nCPAP) on road traffic accidents in patients with obstructive sleep apnea (OSA) as well as on their performance in driving simulator. The primary outcomes were real accidents, near miss accidents, and accident-related events in the driving simulator. Pooled odds ratios (ORs), incidence rate ratios (IRRs) and standardized mean differences (SMDs) were appropriately calculated through fixed or random effects models after assessing between-study heterogeneity. Furthermore, risk differences (RDs) and numbers needed to treat (NNTs) were estimated for real and near miss accidents. Meta-regression analysis was performed to examine the effect of moderator variables and publication bias was also evaluated. Ten studies on real accidents (1221 patients), five studies on near miss accidents (769 patients) and six studies on the performance in driving simulator (110 patients) were included. A statistically significant reduction in real accidents (OR=0.21, 95% CI=0.12-0.35, random effects model; IRR=0.45, 95% CI=0.34-0.59, fixed effects model) and near miss accidents (OR=0.09, 95% CI=0.04-0.21, random effects model; IRR=0.23, 95% CI=0.08-0.67, random effects model) was observed. Likewise, a significant reduction in accident-related events was observed in the driving simulator (SMD=-1.20, 95% CI=-1.75 to -0.64, random effects). The RD for real accidents was -0.22 (95% CI=-0.32 to -0.13, random effects), with NNT equal to five patients (95% CI=3-8), whereas for near miss accidents the RD was -0.47 (95% CI=-0.69 to -0.25, random effects), with NNT equal to two patients (95% CI=1-4). For near miss accidents, meta-regression analysis suggested that nCPAP seemed more effective among patients entering the studies with higher baseline accident rates. In conclusion, all three meta-analyses demonstrated a sizeable protective effect of nCPAP on road traffic accidents, both in real life and virtual environment. PMID:21195643
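
For readers unfamiliar with the pooling steps, the following is a minimal sketch of inverse-variance random-effects (DerSimonian-Laird) pooling of log odds ratios, together with the number needed to treat implied by a pooled risk difference. The individual study values are invented; only the risk difference of -0.22 is taken from the abstract.

```python
# Minimal sketch with invented study-level data: DerSimonian-Laird
# random-effects pooling of log odds ratios, plus the NNT implied by the
# pooled risk difference quoted in the abstract.
import numpy as np

log_or = np.log([0.25, 0.15, 0.30, 0.20, 0.18])   # hypothetical per-study ORs
se = np.array([0.40, 0.35, 0.50, 0.45, 0.30])      # hypothetical standard errors

w = 1.0 / se**2                                    # fixed-effect weights
mu_fixed = np.sum(w * log_or) / np.sum(w)

q = np.sum(w * (log_or - mu_fixed) ** 2)           # Cochran's Q
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(log_or) - 1)) / c)       # between-study variance

w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
mu_re = np.sum(w_re * log_or) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
ci = np.exp(mu_re + np.array([-1.96, 1.96]) * se_re)
print(f"random-effects OR = {np.exp(mu_re):.2f}, 95% CI = {ci.round(2)}")

rd = -0.22                                         # pooled risk difference (abstract)
print("number needed to treat =", round(1 / abs(rd)))
```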

Antonopoulos, Constantine N; Sergentanis, Theodoros N; Daskalopoulou, Styliani S; Petridou, Eleni Th

2011-10-01

335

Preclosure radiological safety analysis for accident conditions of the potential Yucca Mountain Repository: Underground facilities; Yucca Mountain Site Characterization Project  

SciTech Connect

This preliminary preclosure radiological safety analysis assesses the scenarios, probabilities, and potential radiological consequences associated with postulated accidents in the underground facility of the potential Yucca Mountain repository. The analysis follows a probabilistic-risk-assessment approach. Twenty-one event trees resulting in 129 accident scenarios are developed. Most of the scenarios have estimated annual probabilities ranging from 10^-11/yr to 10^-5/yr. The study identifies 33 scenarios that could result in offsite doses over 50 mrem and that have annual probabilities greater than 10^-9/yr. The largest offsite dose is calculated to be 220 mrem, which is less than the 500 mrem value used to define items important to safety in 10 CFR 60. The study does not include an estimate of uncertainties; therefore, conclusions or decisions based on this report should be made with caution.

Ma, C.W.; Sit, R.C.; Zavoshy, S.J.; Jardine, L.J. [Bechtel National, Inc., San Francisco, CA (United States)]; Laub, T.W. [Sandia National Labs., Albuquerque, NM (United States)]

1992-06-01

336

Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.  

PubMed

Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future researchers. PMID:19217180
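
As a concrete instance of the whole-system continuous "primary" models contrasted here with individual-based models, the sketch below integrates a simple logistic growth curve; the parameter values are illustrative only and are not taken from the article.

```python
# Minimal sketch: a whole-system primary growth model (logistic population
# dynamics) of the kind contrasted with individual-based models above.
import numpy as np

def logistic_growth(n0, mu_max, n_max, hours, dt=0.1):
    """Return times and population sizes for dN/dt = mu_max * N * (1 - N/n_max)."""
    times = np.arange(0.0, hours + dt, dt)
    n = np.empty_like(times)
    n[0] = n0
    for i in range(1, len(times)):
        n[i] = n[i - 1] + dt * mu_max * n[i - 1] * (1 - n[i - 1] / n_max)
    return times, n

t, n = logistic_growth(n0=1e3, mu_max=0.8, n_max=1e9, hours=24)
print(f"population after 24 h: {n[-1]:.2e} cells/ml")
```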

Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

2009-08-31

337

Immunoassay Methods and their Applications in Pharmaceutical Analysis: Basic Methodology and Recent Advances  

PubMed Central

Immunoassays are bioanalytical methods in which the quantitation of the analyte depends on the reaction of an antigen (analyte) and an antibody. Immunoassays have been widely used in many important areas of pharmaceutical analysis such as diagnosis of diseases, therapeutic drug monitoring, and clinical pharmacokinetic and bioequivalence studies in drug discovery and the pharmaceutical industry. The importance and widespread use of immunoassay methods in pharmaceutical analysis are attributed to their inherent specificity, high throughput, and high sensitivity for the analysis of a wide range of analytes in biological samples. Recently, marked improvements were achieved in the field of immunoassay development for the purposes of pharmaceutical analysis. These improvements involved the preparation of unique immunoanalytical reagents, the analysis of new categories of compounds, methodology, and instrumentation. The basic methodologies and recent advances in immunoassay methods applied in different fields of pharmaceutical analysis are reviewed. PMID:23674985

Darwish, Ibrahim A.

2006-01-01

338

Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2  

Microsoft Academic Search

This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects

J. S. Evans; S. Abrahmson; M. A. Bender; B. B. Boecker; B. R. Scott; E. S. Gilbert

1993-01-01

339

Analysis of the effectiveness of emergency countermeasures in the 30-KM zone during the early phase of the chernobyl accident  

SciTech Connect

Some radiation-emergency countermeasures, including evacuation, were implemented in the settlements of the 30-km zone during the early phase of the accident at the Chernobyl Nuclear Power Plant. These countermeasures are described and compared with the international recommendations. An analysis of the effectiveness of the emergency countermeasures was conducted based upon the results of a wide-scale public survey. Quantitative assessments of the effectiveness (dose reduction) of the countermeasures were derived. 9 refs., 2 figs.

Likhtarev, I.A.; Chumack, V.V.; Repin, V.S. [Ukrainian Scientific Center of Radiation Medicine, Kiev (Ukraine)

1994-11-01

340

Accident liability.  

PubMed Central

The idea of accident proneness, which originated in the early 1900s, has proved to be ineffectual as an operational concept. Discrete econometric methods may be useful to find out which factors are at work in the process that leads to accidents and whether there are individuals who are more liable to accidents than others. PMID:3986144

Kune, J B

1985-01-01

341

The epidemiology and cost analysis of patients presented to Emergency Department following traffic accidents  

PubMed Central

Background: Traffic accidents are ranked first as the cause of personal injury throughout the world. The high number of traffic accidents yielding injuries and fatalities makes them of great importance to Emergency Departments. Material/Methods: Patients admitted to the Hacettepe University Faculty of Medicine Adult Emergency Department due to traffic accidents were investigated epidemiologically. Differences between groups were evaluated by Kruskal-Wallis, Mann-Whitney, and Wilcoxon tests. A value of p<0.05 was accepted as statistically significant. Results: We included 2003 patients over 16 years of age. The mean age was 39.6±16.1 years and 55% were males. Admissions by ambulance and due to motor vehicle accidents were the most common. In 2004 the rate of traffic accidents (15.3%) was higher than in the other years, the most common month was May (10.8%), and the most common time period was 6 pm to 12 am (midnight). About half of the patients (51.5%) were admitted within the first 30 minutes. A life-threatening condition was present in 9.6% of the patients. Head trauma was the most common type of trauma, at a rate of 18.3%. The mortality rate was 81.8%. The average length of hospital stay was 403 minutes (6.7 hours) and the average cost per patient was 983±4364 TL. Conclusions: Further studies are needed to compare the cost found in this study with the mean cost for Turkey. However, the most important step to reduce the direct and indirect costs due to traffic accidents is the prevention of these accidents. PMID:24316815

Karadana, Gokce Akgul; Aksu, Nalan Metin; Akkas, Meltem; Akman, Canan; Uzumcugil, Akın; Ozmen, M. Mahir

2013-01-01

342

Accident investigation  

NASA Technical Reports Server (NTRS)

The National Transportation Safety Board (NTSB) has attributed wind shear as a cause or contributing factor in 15 accidents involving transport-category airplanes since 1970. Nine of these were nonfatal, but the other six accounted for 440 lives. Five of the fatal accidents and seven of the nonfatal accidents involved encounters with convective downbursts or microbursts. Of the remaining accidents, two nonfatal events involved encounters with frontal-system shear, and one fatal accident resulted from terrain-induced wind shear. These accidents are discussed with reference to helping aircraft avoid wind shear or, where avoidance is impossible, helping the pilot fly through it.

Laynor, William G. Bud

1987-01-01

343

WASTE-ACC: A computer model for analysis of waste management accidents  

SciTech Connect

In support of the U.S. Department of Energy's (DOE's) Waste Management Programmatic Environmental Impact Statement, Argonne National Laboratory has developed WASTE-ACC, a computational framework and integrated PC-based database system, to assess atmospheric releases from facility accidents. WASTE-ACC facilitates the many calculations for the accident analyses necessitated by the numerous combinations of waste types, waste management process technologies, facility locations, and site consolidation strategies in the waste management alternatives across the DOE complex. WASTE-ACC is a comprehensive tool that can effectively test future DOE waste management alternatives and assumptions. The computational framework can access several relational databases to calculate atmospheric releases. The databases contain throughput volumes, waste profiles, treatment process parameters, and accident data such as frequencies of initiators, conditional probabilities of subsequent events, and source term release parameters of the various waste forms under accident stresses. This report describes the computational framework and supporting databases used to conduct accident analyses and to develop source terms to assess potential health impacts that may affect on-site workers and off-site members of the public under various DOE waste management alternatives.
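
The kind of calculation such a framework automates can be sketched as the product of an initiator frequency, conditional probabilities of subsequent events, and source-term release parameters. The factor structure below follows a common DOE-style airborne source-term formulation and is not taken from the WASTE-ACC databases; all numbers are placeholders.

```python
# Minimal sketch, not the WASTE-ACC equations or data: combining an initiator
# frequency, conditional probabilities, and release parameters into a
# scenario frequency and a respirable airborne release.
initiator_freq = 1e-3        # hypothetical fires per facility-year
p_containment_fails = 0.05   # conditional probability that a barrier fails
p_waste_impacted = 0.20      # conditional probability the waste form is stressed

material_at_risk_kg = 500.0  # inventory subject to the accident stress
damage_ratio = 0.1           # fraction of the inventory actually affected
airborne_release_frac = 1e-3 # fraction of affected material made airborne
respirable_frac = 0.5        # respirable fraction of the airborne release

scenario_freq = initiator_freq * p_containment_fails * p_waste_impacted
release_kg = (material_at_risk_kg * damage_ratio *
              airborne_release_frac * respirable_frac)
print(f"scenario frequency ~ {scenario_freq:.1e} /yr, "
      f"respirable release ~ {release_kg:.2e} kg")
```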

Nabelssi, B.K.; Folga, S.; Kohout, E.J.; Mueller, C.J.; Roglans-Ribas, J.

1996-12-01

344

Analysis of 121 fatal passenger car-adult pedestrian accidents in China.  

PubMed

To study the characteristics of fatal vehicle-pedestrian accidents in China, a team was established and passenger car-pedestrian crash cases occurring between 2006 and 2011 in Beijing and Chongqing, China, were collected. A total of 121 fatal passenger car-adult pedestrian collisions were sampled and analyzed. The pedestrian injuries were scored according to the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). The demographic distributions of fatal pedestrian accidents differed from those of other pedestrian accidents. Among the victims, no significant discrepancy in the distribution of ISS and AIS in the head, thorax, abdomen, and extremities by pedestrian age was found, while pedestrian behaviors prior to the crashes may affect the ISS. The distributions of AIS in the head, thorax, and abdomen among the fatalities did not show any association with impact speeds or vehicle types, whereas there was a strong relationship between the ISS and impact speeds. Whether pedestrians died at the accident scene or not was not associated with the ISS or AIS. The present results may be useful not only for forensic experts but also for vehicle safety researchers. More investigations regarding fatal pedestrian accidents need to be conducted in greater detail. PMID:25287805

Zhao, Hui; Yin, Zhiyong; Yang, Guangyu; Che, Xingping; Xie, Jingru; Huang, Wei; Wang, Zhengguo

2014-10-01

345

Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models  

SciTech Connect

The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways NUREG-1829 information can be used to estimate LOCA frequencies were investigated and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.
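
One common way of turning historical event counts and exposure time into an initiating-event frequency distribution is a Bayesian update with a Jeffreys prior. The sketch below illustrates the idea with placeholder counts and should not be read as the SPAR or NUREG/CR-6928 inputs themselves.

```python
# Minimal sketch with placeholder counts: a Jeffreys-prior Bayesian estimate
# of an initiating-event frequency from operating experience.
from scipy import stats

events = 2             # hypothetical observed SLOCA events
exposure_years = 5000  # hypothetical reactor-critical-years

# Jeffreys prior Gamma(0.5, 0) updates to posterior Gamma(events + 0.5, exposure)
alpha, beta = events + 0.5, exposure_years
posterior = stats.gamma(a=alpha, scale=1.0 / beta)

print(f"mean frequency = {posterior.mean():.2e} /yr")
print(f"90% interval   = {posterior.ppf(0.05):.2e} .. {posterior.ppf(0.95):.2e} /yr")
```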

S. A. Eide; D. M. Rasmuson; C. L. Atwood

2008-09-01

346

An analysis of accident experience at entrance ramps within construction work zones at long-term freeway reconstruction projects in Texas  

E-print Network

A thesis by David Bryan Casteel, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of Master of Science, August 1991. Major subject: Civil Engineering.

Casteel, David Bryan

2012-06-07

347

Environmental risk management for radiological accidents: integrating risk assessment and decision analysis for remediation at different spatial scales.  

PubMed

The consequences of the Tohoku earthquake and subsequent tsunami in March 2011 caused a loss of power at the Fukushima Daiichi nuclear power plant in Japan and led to the release of radioactive materials into the environment. Although the full extent of the contamination is not currently known, the highly complex nature of the environmental contamination (radionuclides in water, soil, and agricultural produce) typical of nuclear accidents requires a detailed geospatial analysis of information, with the ability to extrapolate across different scales, with applications to risk assessment models and decision making support. This article briefly summarizes the approach used to inform risk-based land management and remediation decision making after the Chernobyl, Soviet Ukraine, accident in 1986. PMID:21608109

Yatsalo, Boris; Sullivan, Terrence; Didenko, Vladimir; Linkov, Igor

2011-07-01

348

Human-Automated Judge Learning: A Methodology for Examining Human Interaction With Information Analysis Automation  

Microsoft Academic Search

Human-automated judge learning (HAJL) is a methodology providing a three-phase process, quantitative measures, and analytical methods to support the design of information analysis automation. HAJL's measures capture the human's and the automation's judgment processes, relevant features of the environment, and the relationships between each. Specific measures include achievement of the human and the automation, conflict between them, compromise and adaptation by the

Ellen J. Bass; Amy R. Pritchett

2008-01-01

349

Integrated cation–anion\\/volatile fluid inclusion analysis by gas and ion chromatography; methodology and examples  

Microsoft Academic Search

Combined gas and ion chromatographic analysis of well characterized, small (~1 g) fluid inclusion-bearing samples is a powerful, but simple, means for obtaining integrated fluid concentrations of major and trace, volatile and ionic fluid constituents without using microthermometrically determined salinity for normalization. The methodology, which is described and assessed in detail, involves crushing a carefully cleaned sample at ~105°C in

D. M. DeR Channer; C. J Bray; E. T. C Spooner

1999-01-01

350

A Methodology for Empirical Analysis of Brain Connectivity through Graph Mining  

E-print Network

Header excerpt: Jiang Bian, Josh M. ..., Informatics / Brain Imaging Research Center, Psychiatric Research Institute, University of Arkansas for Medical Sciences. Abstract fragment: ... and functional brain connectivity networks and has helped researchers conceive the effects of neurological ...

Xie, Mengjun

351

A Geo-economic Methodology for Decision Analysis for Mineral Deposits  

Microsoft Academic Search

This paper presents a decision analysis methodology that can be applied at the early exploratory stages of mineral projects that show some expectation of becoming a mine. Based upon a conceptual mining project, a simplified cash flow is generated through a characterization of project variables such as metal prices, discount rate, cost, investment and metallurgical recovery, related cutoff

Miguel Antonio Cedraz Nery; Saul B. Suslick
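
The simplified cash-flow characterization described in the record above can be illustrated with a minimal net-present-value screen; every number below is invented for the example and none comes from the paper.

```python
# Minimal sketch with illustrative numbers: a simplified cash-flow / NPV
# screen driven by metal price, cost, metallurgical recovery, and discount rate.
def npv(cash_flows, discount_rate):
    return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

price = 8000.0        # $/t of metal (hypothetical)
grade = 0.01          # ore grade (1% metal)
recovery = 0.85       # metallurgical recovery
ore_per_year = 2.0e6  # t of ore mined per year
opex = 25.0           # $/t of ore
capex = 250e6         # up-front investment
life = 10             # years of production

annual_cf = ore_per_year * (grade * recovery * price - opex)
cash_flows = [-capex] + [annual_cf] * life
print(f"NPV at a 10% discount rate: {npv(cash_flows, 0.10) / 1e6:.1f} M$")
```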

352

Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys  

Cancer.gov

Presentation excerpt: Cristine Delnevo, PhD, MPH, UMDNJ-School of Public Health. Why is methods research in tobacco surveillance important? Measuring individual behavior over time is crucial

353

Methodology for Language Analysis and Generation (Advanced Research in Artificial Intelligence)  

E-print Network

Excerpt: ... the text translation pertaining to specific thematic fields (technical manuals, weather forecast, reports) ... components, with the additional challenge that, being an artificial language, it has to be as expressive

Cardeñosa, Jesús

354

Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design  

ERIC Educational Resources Information Center

Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

Tajino, Akira; James, Robert; Kijima, Kyoichi

2005-01-01

355

Success story in software engineering using NIAM (Natural language Information Analysis Methodology)  

SciTech Connect

To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

Eaton, S.M.; Eaton, D.S.

1995-10-01

356

Methodology of NEQ (f) analysis for optimization and comparison of digital breast tomosynthesis acquisition techniques  

E-print Network

As a three-dimensional imaging technique, digital breast tomosynthesis allows the reconstruction of an arbitrary set of planes in the breast from a limited-angle series of projection images. Though several tomosynthesis algorithms have

Chen, Ying "Ada"

357

A comparison of segmental and wrist-to-ankle methodologies of bioimpedance analysis  

Microsoft Academic Search

The common approach of bioelectrical impedance analysis to estimate body water uses a wrist-to-ankle methodology which, although not indicated by theory, has the advantage of ease of application particularly for clinical studies involving patients with debilitating diseases. A number of authors have suggested the use of a segmented protocol in which the impedances of the trunk and limbs are measured

B. J. Thomas; B. H. Cornish; L. C. Ward; M. A. Patterson

1998-01-01

358

A Value-Event Path Model Based Value Engineering Analysis Methodology  

Microsoft Academic Search

Value-based management (VBM) aims to promote the value of an enterprise, and value engineering (VE) aims to promote value at the least cost. Combining these two ideas, this paper proposes a value-event path model based value engineering analysis methodology. In a management process, the value of an enterprise changes dynamically, and each change in value corresponds to an event which

Fenglin Peng; Xuping Jiang

2009-01-01

359

Analysis of main steam isolation valve leakage in design basis accidents using MELCOR 1.8.6 and RADTRAD.  

SciTech Connect

Analyses were performed using MELCOR and RADTRAD to investigate main steam isolation valve (MSIV) leakage behavior under design basis accident (DBA) loss-of-coolant accident (LOCA) conditions that are presumed to have led to a significant core melt accident. Doses to the control room, site boundary, and LPZ are examined using both the approaches described in current regulatory guidelines and analyses based on a best-estimate source term and system response. At issue is the current practice of using containment airborne aerosol concentrations as a surrogate for the in-vessel aerosol concentration that exists in the near vicinity of the MSIVs. This study finds that the current practice of using AST-based containment aerosol concentrations for assessing MSIV leakage is non-conservative and conceptually in error. A methodology is proposed that scales the containment aerosol concentration to the expected vessel concentration in order to preserve the simplified use of the AST in assessing containment performance under assumed DBA conditions. This correction is required during the first two hours of the accident, while the gap and early in-vessel source terms are present. It is general practice to assume that at ~2 hours, recovery actions to reflood the core will have been successful and that further core damage can be avoided. The analyses performed in this study determine that, after two hours, assuming vessel reflooding has taken place, the containment aerosol concentration can then conservatively be used as the effective source to the leaking MSIVs. Recommendations are provided concerning typical aerosol removal coefficients that can be used in the RADTRAD code to predict source attenuation in the steam lines, and on robust methods of predicting MSIV leakage flows based on measured MSIV leakage performance.

Salay, Michael (United States Nuclear Regulatory Commission, Washington, D.C.); Kalinich, Donald A.; Gauntt, Randall O.; Radel, Tracy E.

2008-10-01

360

Analysis of loss-of-coolant and loss-of-flow accidents in the first wall cooling system of NET/ITER  

NASA Astrophysics Data System (ADS)

This paper presents the thermal-hydraulic analysis of potential accidents in the first wall cooling system of the Next European Torus or the International Thermonuclear Experimental Reactor. Three ex-vessel loss-of-coolant accidents, two in-vessel loss-of-coolant accidents, and three loss-of-flow accidents have been analyzed using the thermal-hydraulic system analysis code RELAP5/MOD3. The analyses deal with the transient thermal-hydraulic behavior inside the cooling systems and the temperature development inside the nuclear components during these accidents. The analysis of the different accident scenarios has been performed without operation of emergency cooling systems. The results of the analyses indicate that a loss of forced coolant flow through the first wall rapidly causes dryout in the first wall cooling pipes. Following dryout, melting in the first wall starts within about 130 s in case of ongoing plasma burning. In case of large break LOCAs and ongoing plasma burning, melting in the first wall starts about 90 s after accident initiation.

Komen, E. M. J.; Koning, H.

1994-03-01

361

Cost analysis and financial risk profile for severe reactor accidents at Waterford-3  

SciTech Connect

To support Louisiana Power and Light Company (LP and L) in determining an appropriate level of nuclear property insurance for Waterford Steam Electric Station, Unit 3 (Waterford-3), ABZ, Incorporated, performed a series of cost analyses and developed a financial risk profile. This five-month study, conducted in 1991, identified the potential Waterford-3 severe reactor accidents and described each from a cleanup perspective, estimated the cost and schedule to clean up after each accident, developed a probability distribution of associated financial exposure, and developed a profile of financial risk as a function of insurance coverage.

Cutbush, J.D.; Abbott, E.C. (ABZ, Inc., Chantilly, VA (United States)); Carpenter, W.L. Jr. (Louisiana Power and Light Co., New Orleans (United States))

1992-01-01

362

What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?  

PubMed

Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source, it should be possible to combine the available information sets so that data from one source compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey; the second was insurance claims documents, consisting predominantly of claim forms completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and the filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provides more reliable and detailed pre-crash information than survey variables alone. However, driving-related distraction appears to be more difficult to capture. In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such as in-depth accident investigations and pre-crash data recordings. PMID:23314359

Tivesten, Emma; Wiberg, Henrik

2013-03-01

363

Task Analysis Based Methodology for the Design of Face to Face Computer Supported Collaborative Learning Activities  

Microsoft Academic Search

\\u000a This paper shows how Task Analysis can be a powerful tool for the design of collaborative applications supported by wirelessly\\u000a interconnected handhelds. We define a methodology for the design of such activities. It basically consists in performing a\\u000a Task Analysis on an Interaction Model to obtain the set of all possible interactions between actors. Then a class of activities\\u000a is

Maria Francisca Capponi; Miguel Nussbaum; María Ester Lagos

2006-01-01

364

Methodology of a combined ground based testing and numerical modelling analysis of supersonic combustion flow paths  

NASA Astrophysics Data System (ADS)

In the framework of the European Commission co-funded LAPCAT (Long-Term Advanced Propulsion Concepts and Technologies) project, the methodology of a combined ground-based testing and numerical modelling analysis of supersonic combustion flow paths was established. The approach is based on free jet testing of complete supersonic combustion ramjet (scramjet) configurations consisting of intake, combustor and nozzle in the High Enthalpy Shock Tunnel Göttingen (HEG) of the German Aerospace Center (DLR) and computational fluid dynamics studies utilising the DLR TAU code. The capability of the established methodology is demonstrated by applying it to the flow path of the generic HyShot II scramjet flight experiment configuration.

Hannemann, Klaus; Karl, Sebastian; Martinez Schramm, Jan; Steelant, Johan

2010-10-01

365

Risk Analysis for Public Consumption: Media Coverage of the Ginna Nuclear Reactor Accident.  

ERIC Educational Resources Information Center

Researchers have determined that the lay public makes risk judgments in ways that are very different from those advocated by scientists. Noting that these differences have caused considerable concern among those who promote and regulate health and safety, a study examined media coverage of the accident at the Robert E. Ginna nuclear power plant…

Dunwoody, Sharon; And Others

366

Health effects models for nuclear power plant accident consequence analysis: Low LET radiation  

Microsoft Academic Search

This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and

1990-01-01
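
The Weibull dose-response functions recommended in the preceding record for early and continuing effects are commonly parameterized by a median effective dose and a shape factor; the sketch below uses that generic two-parameter form with illustrative values, not the parameters from the report.

```python
# Minimal sketch: a generic two-parameter Weibull dose-response function,
# parameterized by a median effective dose D50 and a shape factor. The
# numbers are illustrative, not values from the NUREG report.
import math

def weibull_risk(dose, d50, shape):
    """Probability of the effect at a given dose (Gy); 0.5 at dose == d50."""
    if dose <= 0:
        return 0.0
    return 1.0 - math.exp(-math.log(2.0) * (dose / d50) ** shape)

for dose in (1.0, 3.0, 5.0, 8.0):
    print(dose, round(weibull_risk(dose, d50=3.8, shape=6.0), 3))
```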

367

Radiological health effects models for nuclear power plant accident consequence analysis  

Microsoft Academic Search

Improved health effects models have been developed for assessing the early effects, late somatic effects and genetic effects that might result from low-LET radiation exposures to populations following a major accident in a nuclear power plant. All the models have been developed in such a way that the dynamics of population risks can be analyzed. Estimates of life years lost

John S. Evans; Dade W. Moeller

1989-01-01

368

PPCS thermal analysis of bounding accident scenarios using improved computational modelling  

Microsoft Academic Search

Within the framework of the European Power Plant Conceptual Study (PPCS), evaluation of activation inventories and temperature excursions in structures following hypothetical worst-case accident scenarios was performed for the four plant models considered. An improved, three-dimensional computational tool was developed and extensively used to assist in the safety and environmental assessment of the PPCS power plant models. The tool allows

R. Pampin; P. J. Karditsas; M. J. Loughlin; N. P. Taylor

2006-01-01

369

Traffic accident in Cuiabá-MT: an analysis through the data mining technology.  

PubMed

Road traffic accidents (ATT) are non-intentional events of considerable magnitude worldwide, mainly in urban centers. This article analyzes data on the victims of ATT recorded by the Justice and Public Security Secretariat (SEJUSP) and in hospital morbidity and mortality records in the city of Cuiabá-MT during 2006, using data mining technology. An observational, retrospective and exploratory study of the secondary databases was carried out. The three selected databases were linked using a probabilistic method, through the free software RecLink. One hundred and thirty-nine (139) true pairs of victims of ATT were obtained. Data mining was then applied to this linked database with the WEKA software, using the Apriori algorithm. The process generated the 10 best rules; six of them met the established parameters and indicated useful and comprehensible knowledge for characterizing the victims of accidents in Cuiabá. Finally, the association rules revealed peculiarities of the road traffic accident victims in Cuiabá and highlight the need for preventive measures against collision accidents involving males. PMID:20841739
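
The rule-mining step can be illustrated with a minimal sketch on synthetic one-hot records; it uses the Apriori implementation in the Python mlxtend package as a stand-in for the WEKA analysis described above, and the attributes and thresholds are invented.

```python
# Minimal sketch on synthetic records, not the SEJUSP/hospital data: Apriori
# frequent itemsets and association rules via the mlxtend library.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

records = pd.DataFrame(
    [
        {"male": 1, "collision": 1, "motorcycle": 1, "hospitalized": 1},
        {"male": 1, "collision": 1, "motorcycle": 0, "hospitalized": 1},
        {"male": 0, "collision": 0, "motorcycle": 1, "hospitalized": 0},
        {"male": 1, "collision": 1, "motorcycle": 1, "hospitalized": 1},
        {"male": 0, "collision": 1, "motorcycle": 0, "hospitalized": 0},
    ],
    dtype=bool,
)

itemsets = apriori(records, min_support=0.4, use_colnames=True)
rules = association_rules(itemsets, metric="confidence", min_threshold=0.8)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```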

Galvão, Noemi Dreyer; de Fátima Marin, Heimar

2010-01-01

370

Analytical simulation and PROFAT II: a new methodology and a computer automated tool for fault tree analysis in chemical process industries.  

PubMed

Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on the methodology, we have developed a computer-automated tool. The details are presented in this paper. PMID:10828384
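
Independently of the AS-II algorithm itself, the basic fault tree computation can be shown in a few lines: basic-event probabilities are propagated through OR and AND gates up to the top event, assuming independent events. The tree structure and numbers below are illustrative only.

```python
# Minimal sketch, not the AS-II algorithm: propagating basic-event
# probabilities through OR and AND gates of a small illustrative fault tree,
# assuming independence of the basic events.
def gate_or(*probs):
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_and(*probs):
    p = 1.0
    for q in probs:
        p *= q
    return p

pump_fails     = 1e-3   # hypothetical basic-event probabilities
valve_stuck    = 5e-4
operator_error = 1e-2
backup_fails   = 2e-2

cooling_lost = gate_or(pump_fails, valve_stuck)            # either failure stops cooling
mitigation_fails = gate_and(operator_error, backup_fails)  # both barriers must fail
top_event = gate_and(cooling_lost, mitigation_fails)

print(f"P(top event) ~ {top_event:.2e}")
```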

Khan, F I; Abbasi, S A

2000-07-10

371

A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)  

NASA Technical Reports Server (NTRS)

Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings presented over ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure and other high-level classifications in longitudinal studies of accident reports. Our results suggest the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.

Johnson, C. W.; Holloway, C. M.

2007-01-01

372

Radiotherapy Accidents  

NASA Astrophysics Data System (ADS)

A major benefit of a Quality Assurance system in a radiotherapy centre is that it reduces the likelihood of an accident. For over 20 years I have been the interface in the UK between the Institute of Physics and Engineering in Medicine and the media — newspapers, radio and TV — and so I have learned about radiotherapy accidents from personal experience. In some cases, these accidents did not become public and so the hospital cannot be identified. Nevertheless, lessons are still being learned.

Mckenzie, Alan

373

Thermal-hydraulic analysis for changing feedwater check valve leakage rate testing methodology  

SciTech Connect

The current design and testing requirements for the feedwater check valves (FWCVs) at the Grand Gulf Nuclear Station are established from original licensing requirements that necessitate extremely restrictive air testing with tight allowable leakage limits. As a direct result of these requirements, the original high endurance hard seats in the FWCVs were modified with elastomeric seals to provide a sealing surface capable of meeting the stringent air leakage limits. However, due to the relatively short functional life of the elastomeric seals compared to the hard seats, the overall reliability of the sealing function actually decreased. This degraded performance was exhibited by frequent seal failures and subsequent valve repairs. The original requirements were based on limited analysis and the belief that all of the high energy feedwater vaporized during the LOCA blowdown. These phenomena would have resulted in completely voided feedwater lines and thus a steam environment within the feedwater leak pathway. To challenge these criteria, a comprehensive design basis accident analysis was developed using the RELAP5/MOD3.1 thermal-hydraulic code. Realistic assumptions were used to more accurately model the post-accident fluid conditions within the feedwater system. The results of this analysis demonstrated that no leak path exists through the feedwater lines during the reactor blowdown phase and that sufficient subcooled water remains in various portions of the feedwater piping to form liquid water loop seals that effectively isolate this leak path. These results provided the bases for changing the leak testing requirements of the FWCVs from air to water. The analysis results also established more accurate allowable leakage limits, determined the real effective margins associated with the FWCV safety functions, and led to design changes that improved the overall functional performance of the valves.

Fuller, R.; Harrell, J.

1996-12-01

374

Methodology for Risk Analysis of Dam Gates and Associated Operating Equipment Using Fault Tree Analysis.  

National Technical Information Service (NTIS)

With limited maintenance dedicated to aging dam spillway gate structures, there is an increased risk of gate inoperability and corresponding dam failure due to malfunction or inadequate design. This report summarizes research on methodologies to assist in...

R. C. Patev, C. Putcha, S. D. Foltz

2005-01-01

375

Development and exploration of a new methodology for the fitting and analysis of XAS data  

PubMed Central

A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion related article [Delgado-Jaime et al. (2010), J. Synchrotron Rad. 17, 132-137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various data sets on the Cl K-edge XAS data for tetragonal CuCl4(2-), a common reference compound used for calibration and covalency estimation in M-Cl bonds. A new background model function that effectively blends together background profiles with spectral features is an important component of the discussed methodology. The development of a robust evaluation function to fit multiple-edge data is discussed and the implications regarding standard approaches to data analysis are discussed and explored within these examples. PMID:20029120
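
The Monte Carlo search for starting points can be sketched as repeated least-squares fits from randomized initial guesses, keeping the best result. The toy edge-plus-peak model and data below are synthetic and are not the authors' Matlab implementation.

```python
# Minimal sketch, not the authors' program: Monte Carlo selection of fit
# starting points for a toy XANES-like edge-plus-peak model, keeping the fit
# with the lowest residual sum of squares.
import numpy as np
from scipy.optimize import curve_fit

def model(e, step_e0, step_w, amp, peak_e0, peak_w):
    edge = 0.5 * (1 + np.tanh((e - step_e0) / step_w))        # edge jump
    peak = amp * np.exp(-0.5 * ((e - peak_e0) / peak_w) ** 2)  # pre-edge feature
    return edge + peak

rng = np.random.default_rng(1)
energy = np.linspace(2815, 2835, 200)
truth = (2825.0, 1.0, 0.6, 2820.5, 0.7)
data = model(energy, *truth) + rng.normal(0, 0.01, energy.size)

best = None
for _ in range(200):  # many independent fits from random starting points
    p0 = [rng.uniform(2818, 2832), rng.uniform(0.3, 3.0),
          rng.uniform(0.1, 1.5), rng.uniform(2816, 2824), rng.uniform(0.3, 2.0)]
    try:
        popt, _ = curve_fit(model, energy, data, p0=p0, maxfev=5000)
    except RuntimeError:
        continue  # this random start did not converge
    rss = np.sum((data - model(energy, *popt)) ** 2)
    if best is None or rss < best[0]:
        best = (rss, popt)

print("best-fit parameters:", np.round(best[1], 3))
```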

Delgado-Jaime, Mario Ulises; Kennepohl, Pierre

2010-01-01

376

Risk of road accident associated with the use of drugs: a systematic review and meta-analysis of evidence from epidemiological studies.  

PubMed

This paper is a corrigendum to a previously published paper where errors were detected. The errors have been corrected in this paper. The paper is otherwise identical to the previously published paper. A systematic review and meta-analysis of studies that have assessed the risk of accident associated with the use of drugs when driving is presented. The meta-analysis included 66 studies containing a total of 264 estimates of the effects on accident risk of using illicit or prescribed drugs when driving. Summary estimates of the odds ratio of accident involvement are presented for amphetamines, analgesics, anti-asthmatics, anti-depressives, anti-histamines, benzodiazepines, cannabis, cocaine, opiates, penicillin and zopiclone (a sleeping pill). For most of the drugs, small or moderate increases in accident risk associated with the use of the drugs were found. Information about whether the drugs were actually used while driving and about the doses used was often imprecise. Most studies that have evaluated the presence of a dose-response relationship between the dose of drugs taken and the effects on accident risk confirm the existence of a dose-response relationship. Use of drugs while driving tends to have a larger effect on the risk of fatal and serious injury accidents than on the risk of less serious accidents (usually property-damage-only accidents). The quality of the studies that have assessed risk varied greatly. There was a tendency for the estimated effects of drug use on accident risk to be smaller in well-controlled studies than in poorly controlled studies. Evidence of publication bias was found for some drugs. The associations found cannot be interpreted as causal relationships, principally because most studies do not control very well for potentially confounding factors. PMID:22785089

Elvik, Rune

2013-11-01

377

Occupational accidents aboard merchant ships  

PubMed Central

Objectives: To investigate the frequency, circumstances, and causes of occupational accidents aboard merchant ships in international trade, and to identify risk factors for the occurrence of occupational accidents as well as dangerous working situations where possible preventive measures may be initiated. Methods: The study is a historical follow up on occupational accidents among crew aboard Danish merchant ships in the period 1993–7. Data were extracted from the Danish Maritime Authority and insurance data. Exact data on time at risk were available. Results: A total of 1993 accidents were identified during a total of 31 140 years at sea. Among these, 209 accidents resulted in permanent disability of 5% or more, and 27 were fatal. The mean risk of having an occupational accident was 6.4/100 years at sea and the risk of an accident causing a permanent disability of 5% or more was 0.67/100 years aboard. Relative risks for notified accidents and accidents causing permanent disability of 5% or more were calculated in a multivariate analysis including ship type, occupation, age, time on board, change of ship since last employment period, and nationality. Foreigners had a considerably lower recorded rate of accidents than Danish citizens. Age was a major risk factor for accidents causing permanent disability. Change of ship and the first period aboard a particular ship were identified as risk factors. Walking from one place to another aboard the ship caused serious accidents. The most serious accidents happened on deck. Conclusions: It was possible to clearly identify work situations and specific risk factors for accidents aboard merchant ships. Most accidents happened while performing daily routine duties. Preventive measures should focus on workplace instructions for all important functions aboard and also on the prevention of accidents caused by walking around aboard the ship. PMID:11850550
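
The headline incidence rate in the preceding record follows directly from the reported counts; the sketch below reproduces it and attaches an exact Poisson confidence interval, which is our illustration rather than a figure from the paper.

```python
# Minimal sketch: incidence rate per 100 years at sea from the counts given
# in the abstract (1993 accidents over 31,140 years), with an exact Poisson
# confidence interval added for illustration.
from scipy import stats

events, years = 1993, 31140
rate = events / years * 100
lo = stats.chi2.ppf(0.025, 2 * events) / 2 / years * 100
hi = stats.chi2.ppf(0.975, 2 * (events + 1)) / 2 / years * 100
print(f"accident rate = {rate:.1f} per 100 years at sea (95% CI {lo:.1f}-{hi:.1f})")
```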

Hansen, H; Nielsen, D; Frydenberg, M

2002-01-01

378

General Methodology Combining Engineering Optimization of Primary HVAC & R Plants with Decision Analysis Methods--Part I: Deterministic Analysis  

SciTech Connect

This paper is the first of a two-part sequence that proposes a general methodology for dynamic scheduling and optimal control of complex primary HVAC&R plants, which combines engineering analyses within a practical decision analysis framework by modeling risk attitudes of the operator. The paper was based on work done prior to employment by Battelle.

Jiang, Wei; Reddy, T. A.

2007-01-31

379

Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures  

NASA Technical Reports Server (NTRS)

A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.

Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

1997-01-01

380

Reliability analysis of repairable systems using Petri nets and vague Lambda-Tau methodology.  

PubMed

The main objective of the paper is to develop a methodology, named vague Lambda-Tau, for the reliability analysis of repairable systems. A Petri net tool is applied to represent the asynchronous and concurrent processing of the system instead of fault tree analysis. To enhance the relevance of the reliability study, vague set theory is used for representing the failure rates and repair times instead of classical (crisp) or fuzzy set theory, because vague sets are characterized by a truth membership function and a false membership function (non-membership function) such that the sum of both values is less than 1. The proposed methodology involves qualitative modeling using PN and quantitative analysis using the Lambda-Tau method of solution, with the basic events represented by intuitionistic fuzzy numbers with triangular membership functions. Sensitivity analysis has also been performed and the effects on system MTBF are addressed. The methodology improves on the shortcomings of the existing probabilistic approaches and gives a better understanding of the system behavior through its graphical representation. The washing unit of a paper mill situated in a northern part of India, producing approximately 200 tons of paper per day, has been considered to demonstrate the proposed approach. The results may be helpful for plant personnel in analyzing system behavior and improving performance by adopting suitable maintenance strategies. PMID:22789401

Garg, Harish

2013-01-01

381

Small-break loss of coolant accident analysis of a nuclear power plant  

Microsoft Academic Search

The postulated small-break loss-of-coolant accident (SBLOCA) occurs as a consequence of the rupture of the reactor primary flow path, and it poses a threat to the safety of nuclear power plants. Many methods have been developed over the years to study the system response under such a situation, and one of the methods is to perform the simulations on quantified

P. Salim; Y. A. Hassan

1989-01-01

382

Accident simulation and consequence analysis in support of MHTGR safety evaluations  

SciTech Connect

This paper summarizes research performed at Oak Ridge National Laboratory (ORNL) to assist the Nuclear Regulatory Commission (NRC) in preliminary determinations of licensability of the US Department of Energy (DOE) reference design of a standard modular high-temperature gas-cooled reactor (MHTGR). The work described includes independent analyses of core heatup and steam ingress accidents, and the reviews and analyses of fuel performance and fission product transport technology.

Ball, S.J.; Wichner, R.P.; Smith, O.L.; Conklin, J.C. (Oak Ridge National Lab., TN (United States)); Barthold, W.P. (Barthold Associates, Inc., Knoxville, TN (United States))

1991-01-01

383

Experimental analysis of the aqueous chemical environment following a loss-of-coolant accident  

Microsoft Academic Search

Five different 30-day tests were conducted to simulate the chemical environment in a pressurized water reactor containment water pool after a loss-of-coolant accident (LOCA). All chemical environments within the test apparatus included boric acid, lithium hydroxide, and hydrochloric acid. In addition, trisodium phosphate, sodium hydroxide, and sodium tetraborate were used to control pH in separate tests. Materials tested within this

Dong Chen; Kerry J. Howe; Jack Dallman; Bruce C. Letellier; Marc Klasky; Janet Leavitt; Bhagwat Jain

2007-01-01

384

Thermodynamic analysis of spent pyrochemical salts in the stored condition and in viable accident scenarios  

SciTech Connect

This study involves examining "spent" electrorefining (ER) salts in the form present after usage (as stored), and then after exposure to water in a proposed accident scenario. Additionally, the equilibrium composition of the salt after extended exposure to air was also calculated by computer modeling and those results are also presented herein. It should be noted that these salts are extremely similar to spent MSE salts from the Rocky Flats MSE campaigns using NaCl-KCl-MgCl2.

Axler, K.M.

1994-03-01

385

An analysis of thermionic space nuclear reactor power system: I. Effect of disassembling radial reflector, following a reactivity initiated accident  

SciTech Connect

An analysis is performed to determine the effect of disassembling the radial reflector of the TOPAZ-II reactor, following a hypothetical severe Reactivity Initiated Accident (RIA). Such an RIA is assumed to occur during the system start-up in orbit due to a malfunction of the drive mechanism of the control drums, causing the drums to rotate the full 180° outward at their maximum speed of 1.4°/s. Results indicate that disassembling only three of twelve radial reflector panels would successfully shut down the reactor, with little overheating of the fuel and the moderator.

El-Genk, M.S.; Paramonov, D. (Institute for Space Nuclear Power Studies, Department of Chemical and Nuclear Engineering, The University of New Mexico, Albuquerque, New Mexico 87131 (United States))

1993-01-10

386

Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report  

SciTech Connect

This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)]; and others

1997-06-01

387

A simple methodology for flood scenario simulations and flood vulnerability analysis  

NASA Astrophysics Data System (ADS)

Flood disasters are recognized as one of the most important sources of economic losses and casualties worldwide. Nowadays, flood risk management is gaining importance in order to mitigate and prevent flood disasters, and consequently the analysis of flood vulnerability is becoming a research topic. In this work, we propose a simple methodology for large-scale analysis of flood vulnerability. A GIS-based index is used to simulate a series of flood scenarios and obtain information in terms of flooded areas and expected water depth. Using the index results and damage curves for each flood scenario, the expected direct and indirect damages are spatially aggregated over the area of interest to obtain areal damage curves, which synthesize vulnerability in the area of interest. The method is applied to a real flood event to evaluate the index performance and test the methodology.
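
The aggregation step described above (a flood depth map combined with damage curves to give an areal damage estimate) can be illustrated with a short sketch. The depth-damage relation, cell values, and function names below are invented for illustration and are not taken from the paper.

```python
import numpy as np

def depth_damage(depth_m):
    """Hypothetical depth-damage curve: damage fraction rises with water depth
    and saturates at 1.0 (total loss). Real curves are calibrated per land use."""
    return np.clip(depth_m / 3.0, 0.0, 1.0)

def areal_damage(depths_m, exposure_value, cell_area_m2=100.0):
    """Aggregate expected direct damage over all flooded cells of one scenario."""
    flooded = depths_m > 0.0
    damage_fraction = depth_damage(depths_m[flooded])
    return np.sum(damage_fraction * exposure_value[flooded] * cell_area_m2)

# One synthetic flood scenario: water depth (m) and exposure value (EUR/m2) per cell
depths = np.array([0.0, 0.4, 1.2, 2.5, 0.0])
values = np.array([50.0, 80.0, 120.0, 120.0, 60.0])
print(areal_damage(depths, values))  # expected direct damage for this scenario
```

Repeating this for a series of simulated scenarios of increasing severity would trace out the areal damage curve referred to in the abstract.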

Dottori, Francesco; Martina, Mario L. V.

2013-04-01

388

Methodology for the characterization of water quality: Analysis of time-dependent variability  

NASA Astrophysics Data System (ADS)

The general methodology for characterization of water quality here presented was applied, after elimination of spatial effects, to the analysis of time-dependent variability of physico-chemical parameters measured, on eighteen dates, during the summer months of 1976, at 112 sampling stations on the Saint Lawrence River between Cornwall and Quebec City. Two aspects of water utilization are considered: domestic water-supply and capacity to sustain balanced aquatic life. The methodology, based on use and adaptation of classical multivariate statistical methods (correspondence analysis, hierarchical classification), leads, for a given type of water utilization, to the determination of the most important parameters, of their essential interrelations and shows the relative importance of their variations. Rationalization of network operations is thus obtained through identification of homogeneous behaviour periods as well as of critical dates for the measurement of parameters characterizing a given use.

Lachance, Marius; Bobée, Bernard

1982-11-01

389

Nuclear accidents  

NSDL National Science Digital Library

Accidents at nuclear power plants can be especially devastating to people and the environment. This article, part of a series about the future of energy, introduces students to nuclear accidents at Chernobyl, Three Mile Island, and Tokaimura. Students explore the incidents by examining possible causes, environmental impacts, and effects on life.

Project, Iowa P.

2004-01-01

390

Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis  

PubMed Central

Introduction Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis. PMID:22549283

Harpaz, Rave; DuMouchel, William; Shah, Nigam H.; Madigan, David; Ryan, Patrick; Friedman, Carol

2013-01-01

391

Identification of multibody vehicle models for crash analysis using an optimization methodology  

Microsoft Academic Search

This work proposes an optimization methodology for the identification of realistic multibody vehicle models, based on the plastic hinge approach, for crash analysis. The identification of the design variables and the objective function and constraints are of extreme importance for the success of the optimization. The characteristics of the plastic hinges are used as design variables while the objective functions

Marta Carvalho; Jorge Ambrósio

2010-01-01

392

Methodology for the analysis of pollutant emissions from a city bus  

NASA Astrophysics Data System (ADS)

In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As a test circuit, a passenger transportation line in a Spanish city was used. Different ways of processing and representing the data were studied and, derived from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the lowest.
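
A minimal sketch of the category-based averaging idea, assuming a 1 Hz record that has already been labelled by driving sequence; the column names and numbers are invented, not measurement data from the study.

```python
import pandas as pd

# Hypothetical 1 Hz record: instantaneous NOx mass rate (g/s) and a label that
# classifies each second as acceleration, cruise, deceleration, or idle.
record = pd.DataFrame({
    "sequence": ["accel", "accel", "cruise", "decel", "idle", "accel", "cruise"],
    "nox_g_per_s": [0.021, 0.025, 0.009, 0.003, 0.001, 0.023, 0.010],
})

# Mean emission rate per category, and each category's share of the total emitted mass
mean_by_category = record.groupby("sequence")["nox_g_per_s"].mean()
share_of_total = record.groupby("sequence")["nox_g_per_s"].sum() / record["nox_g_per_s"].sum()
print(mean_by_category)
print(share_of_total)  # acceleration sequences typically dominate, as reported above
```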

Armas, Octavio; Lapuerta, Magín; Mata, Carmen

2012-04-01

393

Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems  

NASA Technical Reports Server (NTRS)

The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; it has been extended to apply to ordinary differential systems of the type encountered in control, in joint work with the PI and M. Oberguggenberger. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u), and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation system of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as a 'discretization' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

Hermann, Robert

1997-01-01

394

Socio-economic Value Analysis in Geospatial and Earth Observation: A methodology review (Invited)  

NASA Astrophysics Data System (ADS)

Many industries have long since realised that applying macro-economic analysis methodologies to assess the socio-economic value of a programme is a critical step to convincing decision makers to authorise investment. The geospatial and earth observation industry has, however, been slow to embrace economic analysis. There is, however, a growing number of studies, published in the last few years, that have applied economic principles to this domain. They have adopted a variety of different approaches, including:
- Computable General Equilibrium (CGE) modelling
- Revealed preference and stated preference (willingness-to-pay surveys)
- Partial analysis
- Simulations
- Cost-benefit analysis (with and without risk analysis)
This paper will critically review these approaches and assess their applicability to different situations and to meet multiple objectives.

Coote, A. M.; Bernknopf, R.; Smart, A.

2013-12-01

395

MossWinn—methodological advances in the field of Mössbauer data analysis  

NASA Astrophysics Data System (ADS)

The methodology of Mössbauer data analysis has been advanced via the development of a novel scientific database system concept and its realization in the field of Mössbauer spectroscopy, as well as by the application of parallel computing techniques for the enhancement of the efficiency of various processes encountered in the practice of Mössbauer data handling and analysis. The present article describes the new database system concept along with details of its realization in the form of the MossWinn Internet Database (MIDB), and illustrates the performance advantage that may be realized on multi-core processor systems by the application of parallel algorithms for the implementation of database system functions.

Klencsár, Zoltán

2013-04-01

396

Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application  

PubMed Central

Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
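
The time-stratified design highlighted above chooses referent periods from within fixed calendar strata. A minimal sketch of that referent-selection rule follows; the date and function name are illustrative only, and the subsequent conditional logistic regression step is not shown.

```python
from datetime import date, timedelta

def time_stratified_referents(event_day: date):
    """Referent days for a time-stratified case-crossover design:
    same calendar month and same day of week as the event day."""
    d = event_day.replace(day=1)
    referents = []
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            referents.append(d)
        d += timedelta(days=1)
    return referents

print(time_stratified_referents(date(2010, 3, 17)))
# Pollutant exposure (e.g. PM10) on these referent days is then contrasted with
# exposure on the event day for each case.
```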

Carracedo-Martinez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

2010-01-01

397

Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies  

NASA Astrophysics Data System (ADS)

This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was that of testing the robustness of the assessments and of pointing out possible existing sources of disagreements among the participating stakeholders, thus providing insights for the successive deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling and a two-dimensional policy region analysis proved sufficient for the task.
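
The Monte Carlo part of such a robustness check can be pictured with a short, generic sketch: sample uncertain stakeholder weights, recompute the expected-utility ranking of the alternatives, and tally how often each option lands in each rank. The utilities, weight distribution, and numbers below are invented for illustration and are not the study's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-criterion utilities of 3 restoration options on 3 criteria
utilities = np.array([[0.7, 0.2, 0.9],
                      [0.5, 0.6, 0.4],
                      [0.3, 0.8, 0.6]])

rank_counts = np.zeros((3, 3), dtype=int)
for _ in range(10_000):
    # Sample stakeholder weights from a Dirichlet distribution (uncertain preferences)
    w = rng.dirichlet([2.0, 2.0, 2.0])
    scores = utilities @ w           # expected multi-attribute utility of each option
    order = np.argsort(-scores)      # option ranking for this weight sample
    for rank, option in enumerate(order):
        rank_counts[option, rank] += 1

# Fraction of samples in which each option takes each rank (a robustness measure)
print(rank_counts / rank_counts.sum(axis=1, keepdims=True))
```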

Zio, Enrico; Apostolakis, George E.

1999-03-01

398

SACO-1: a fast-running LMFBR accident-analysis code  

SciTech Connect

SACO is a fast-running computer code that simulates hypothetical accidents in liquid-metal fast breeder reactors to the point of permanent subcriticality or to the initiation of a prompt-critical excursion. In the tradition of the SAS codes, each subassembly is modeled by a representative fuel pin with three distinct axial regions to simulate the blanket and core regions. However, analytic and integral models are used wherever possible to cut down the computing time and storage requirements. The physical models and basic equations are described in detail. Comparisons of SACO results to analogous SAS3D results comprise the qualifications of SACO and are illustrated and discussed.

Mueller, C.J.; Cahalan, J.E.; Vaurio, J.K.

1980-01-01

399

Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report  

SciTech Connect

This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology, and the rationale on which it is based, is presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein.

Gore, B.F.; Huenefeld, J.C.

1987-07-01

400

The Nuclear Organization and Management Analysis Concept methodology: Four years later  

SciTech Connect

The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology has been demonstrated at two commercial nuclear power plants (NPP) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

1992-01-01

401

The Nuclear Organization and Management Analysis Concept methodology: Four years later  

SciTech Connect

The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology has been demonstrated at two commercial nuclear power plants (NPP) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

1992-08-01

402

Impact of traffic congestion on road accidents: A spatial analysis of the M25 motorway in England  

Microsoft Academic Search

Traffic congestion and road accidents are two external costs of transport and the reduction of their impacts is often one of the primary objectives for transport policy makers. The relationship between traffic congestion and road accidents however is not apparent and less studied. It is speculated that there may be an inverse relationship between traffic congestion and road accidents, and

Chao Wang; Mohammed A. Quddus; Stephen G. Ison

2009-01-01

403

Analysis of an AP600 intermediate-size loss-of-coolant accident  

SciTech Connect

A postulated double-ended guillotine break of an AP600 direct-vessel-injection line has been analyzed. This event is characterized as an intermediate-break loss-of-coolant accident. Most of the insights regarding the response of the AP600 safety systems to the postulated accident are derived from calculations performed with the TRAC-PF1/MOD2 code. However, complementary insights derived from a scaled experiment conducted in the ROSA facility, as well as insights based upon calculations by other codes, are also presented. Based upon the calculated and experimental results, the AP600 will not experience a core heat up and will reach a safe shutdown state using only safety-class equipment. Only the early part of the long-term cooling period initiated by In-containment Refueling Water Storage Tank injection was evaluated. Thus, the observation that the core is continuously cooled should be verified for the later phase of the long-term cooling period when sump injection and containment cooling processes are important.

Boyack, B.E.; Lime, J.F.

1995-04-01

404

Methodology for CFD Design Analysis of National Launch System Nozzle Manifold  

NASA Technical Reports Server (NTRS)

The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

Haire, Scot L.

1993-01-01

405

Transient Analysis for Evaluating the Potential Boiling in the High Elevation Emergency Cooling Units of PWR Following a Hypothetical Loss of Coolant Accident (LOCA) and Subsequent Water Hammer Due to Pump Restart  

SciTech Connect

The Generic Letter GL-96-06 issued by the U.S. Nuclear Regulatory Commission (NRC) required the utilities to evaluate the potential for voiding in their Containment Emergency Cooling Units (ECUs) due to a hypothetical Loss Of Coolant Accident (LOCA) or a Main Steam Line Break (MSLB) accompanied by the Loss Of Offsite Power (LOOP). When the offsite power is restored, the Component Cooling Water (CCW) pumps restart causing water hammer to occur due to cavity closure. Recently EPRI (Electric Power Research Institute) performed a research study that recommended a methodology to mitigate the water hammer due to cavity closure. The EPRI methodology allows for the cushioning effects of hot steam and released air, which is not considered in the conventional water column separation analysis. The EPRI study was limited in scope to the evaluation of water hammer only and did not provide any guidance for evaluating the occurrence of boiling and the extent of voiding in the ECU piping. This paper presents a complete methodology based on first principles to evaluate the onset of boiling. Also, presented is a methodology for evaluating the extent of voiding and the water hammer resulting from cavity closure by using an existing generalized computer program that is based on the Method of Characteristics. The EPRI methodology is then used to mitigate the predicted water hammer. Thus it overcomes the inherent complications and difficulties involved in performing hand calculations for water hammer. The heat transfer analysis provides an alternative to the use of very cumbersome modeling in using CFD (computational fluid dynamics) based computer programs. (authors)

Husaini, S. Mahmood; Qashu, Riyad K. [Southern California Edison, P.O. Box 128, San Clemente, CA 92672 (United States)

2004-07-01

406

Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2  

SciTech Connect

This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, x-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
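
The two model families named above have simple closed forms. The sketch below writes them out with placeholder parameters; the values are illustrative only and are not the NUREG/CR-4214 recommendations.

```python
import numpy as np

def weibull_risk(dose_gy, d50, shape):
    """Weibull dose-response for early effects: risk = 1 - exp(-ln(2) * (D/D50)^V),
    where D50 is the dose producing the effect in half the exposed population."""
    return 1.0 - np.exp(-np.log(2.0) * (dose_gy / d50) ** shape)

def linear_quadratic_risk(dose_gy, alpha, beta):
    """Linear-quadratic excess cancer risk, R = alpha*D + beta*D^2."""
    return alpha * dose_gy + beta * dose_gy ** 2

# Placeholder parameters only (illustrative, not the report's values):
print(weibull_risk(3.0, d50=3.8, shape=5.0))                # e.g. an early-effect lethality
print(linear_quadratic_risk(0.1, alpha=8e-3, beta=1e-3))    # e.g. a lifetime cancer risk
```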

Evans, J.S. [Harvard School of Public Health, Boston, MA (United States); Abrahmson, S. [Wisconsin Univ., Madison, WI (United States); Bender, M.A. [Brookhaven National Lab., Upton, NY (United States); Boecker, B.B.; Scott, B.R. [Inhalation Toxicology Research Inst., Albuquerque, NM (United States); Gilbert, E.S. [Battelle Pacific Northwest Lab., Richland, WA (United States)

1993-10-01

407

Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology  

SciTech Connect

The role of the neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, and for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty, and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given on how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors could be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs.
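
A hedged sketch of the basic idea: read a safety factor at a chosen confidence level off the distribution of calculated-to-experimental (C/E) prediction uncertainties. For brevity it uses an empirical quantile in place of the fitted normalized density function described above, and the C/E sample values are invented.

```python
import numpy as np

# Hypothetical C/E (calculated over experimental) ratios for one response R,
# pooled from several integral experiments.
c_over_e = np.array([0.93, 1.05, 0.88, 0.97, 1.10, 0.91, 1.02, 0.95])

def safety_factor(ce_samples, confidence=0.95):
    """Multiplier on the calculated response so that, at the given confidence,
    the design value bounds the measured one (guards against under-prediction of R)."""
    low_quantile = np.quantile(ce_samples, 1.0 - confidence)
    return 1.0 / low_quantile

print(safety_factor(c_over_e, confidence=0.95))
```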

Youssef, M.Z.; Kumar, A.; Abdou, M.A. [Univ. of California, Los Angeles, CA (United States); Oyama, Y.; Maekawa, H. [Japan Atomic Energy Research Inst., Ibaraki (Japan)

1995-09-01

408

Generalized methodology for modeling and simulating optical interconnection networks using diffraction analysis  

NASA Astrophysics Data System (ADS)

Research in the field of free-space optical interconnection networks has reached a point where simulators and other design tools are desirable for reducing development costs and for improving design time. Previously proposed methodologies have only been applicable to simple systems. Our goal was to develop a simulation methodology capable of evaluating the performance characteristics for a variety of different free-space networks under a range of different configurations and operating states. The proposed methodology operates by first establishing the optical signal powers at various locations in the network. These powers are developed through the simulation by diffraction analysis of the light propagation through the network. After this evaluation, characteristics such as bit-error rate, signal-to-noise ratio, and system bandwidth are calculated. Further, the simultaneous evaluation of this process for a set of component misalignments provides a measure of the alignment tolerance of a design. We discuss this simulation process in detail as well as provide models for different optical interconnection network components.

Louri, Ahmed; Major, Michael C.

1995-07-01

409

Three years of the OCRA methodology in Brazil: critical analysis and results.  

PubMed

The Authors make a detailed analysis of the introduction of the OCRA Methodology in Brazil, which started in August 2008 with the launch of the "OCRA Book" translated into Portuguese. They evaluate the importance of assessing the exposure of the upper limbs to the risk due to repetitive movements and efforts, according to national and international legislation, demonstrating the interconnection of the OCRA Methodology with the Regulating Norms of the Ministry of Labor and Work (NRs - MTE), especially with NR-17 and its Application Manual. They discuss the new paradigms of the OCRA Method in relation to the classic paradigms of ergonomic knowledge. They indicate the OCRA Method as the tool to be used for confirming or rejecting the New Previdentiary Epidemiologic Nexus NTEP/FAP. The Authors present their conclusions based on the practical results that participants certified in the OCRA Methodology achieved in applying it to different work activities in diverse economic segments, showing the reduction in risk and the productivity of the companies. PMID:22316774

Ruddy, Facci; Eduardo, Marcatto; Edoardo, Santino

2012-01-01

410

Analysis of 303 Road Traffic Accident Victims Seen Dead on Arrival at Emergency Room-Assir Central Hospital  

PubMed Central

Background: Although Road Traffic Accident (RTA) is a noticeably common cause of death in Saudi Arabia, there are no published data showing the relative frequency of RTA as a cause of death. Aim of the study: This study attempted to find out the relative frequency of RTA as a cause of death, to identify age groups at risk, and to make some inferences from the different types of injuries seen. Methodology: Over a period of four and a half years, 574 patients were seen dead on arrival at the Emergency Department of Assir Central Hospital, Abha, Saudi Arabia. Of these, 303 (52.8%) were victims of RTA. Results: The 303 victims revealed a male to female ratio of 14:1, Saudi nationals accounting for 69%, and an age range of 3 months - 85 years (mean = 34.25 years). The peak age group was between 21 and 49 years, and the peak period of presentation at the Emergency Department was between 12:00 noon and 18:00 hours. The tenth month of the Hegira calendar represented the peak period; a significant (P<0.05) seasonal variation was also seen, with summer being the highest. Clinical assessment of the victims revealed that head and neck injuries were the commonest, followed by chest injuries. Conclusion: RTA is the primary cause of death among dead-on-arrival cases, affecting the most active and productive age group. The study recommended the implementation of a pre-hospital emergency medical system. PMID:23008545

Batouk, Abdul N.; Abu-Eisheh, Nader; Abu-Eshy, Saeed; Al-Shehri, Mohammad; AI-Naami, Mohammad; Jastaniah, Suleiman

1996-01-01

411

TRANSIT-HYDRO transition phase accident analysis code: an overview and recent improvements  

SciTech Connect

The TRANSIT-HYDRO computer code is being developed to provide a tool for assessing the consequences of transition phase events in a hypothetical core disruptive accident in an LMFBR. The TRANSIT-HYDRO code incorporates detailed geometric modeling on a subassembly-by-subassembly basis and detailed modeling of reactor material behavior and thermal and hydrodynamic phenomena. The purpose of this summary is to give a brief overview of the code and to describe recent improvements to the code, particularly the addition of new phenomenological models. Sample results with these new models are also described. The TRANSIT-HYDRO code is constructed on a modular basis, which allows models to be added or changed readily. In this summary, the emphasis is on the disrupted subassembly module which is used once the fuel pin inside the subassembly starts to disrupt.

Wigeland, R.A.; Briggs, L.L.

1985-01-01

412

TRAC large-break loss-of-coolant accident analysis for the AP600 design  

SciTech Connect

This report discusses a TRAC model of the Westinghouse AP600 advanced reactor design that has been developed for analyzing large-break loss-of-coolant accident (LBLOCA) transients. A preliminary LBLOCA calculation of an 80% cold-leg break has been performed with TRAC-PF1/MOD2. The 80% break size was calculated by Westinghouse to be the most severe large-break size. The LBLOCA transient was calculated to 92 s. Peak clad temperatures (PCT) were well below the Appendix K limit of 1478 K (2200°F). Transient event times and PCT for the TRAC calculation were in reasonable agreement with those calculated by Westinghouse using their WCOBRA/TRAC code.

Lime, J.F.; Boyack, B.E.

1994-02-01

413

Spectral Multi-peak Analysis Methodology for Eliminating Effects of NaI Temperature Drift  

SciTech Connect

This paper presents a methodology of spectral multi-peak analysis for eliminating temperature drift effects of NaI scintillation gamma-ray detectors. Properties and cost of NaI make it an excellent choice for low energy (< 1-MeV) spectroscopic studies. However, the light output of NaI scintillators has high temperature dependence, and the common practice of selecting a region of interest (ROI) for the peak of study in the measured spectrum requires that NaI temperature drift be monitored and corrected, typically by gain control of the measurement electronics.
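
A minimal sketch of the underlying idea: instead of fixing a region of interest in channel space, several known reference peaks are located in the drifted spectrum and used to re-derive the energy calibration, so ROIs defined in energy follow the gain drift. The peak channels and energies below are hypothetical.

```python
import numpy as np

# Hypothetical fitted centroids (channels) of reference peaks in the drifted spectrum
observed_channels = np.array([182.0, 371.0, 665.0])
# Known emission energies (keV) of the same reference lines
reference_energies_kev = np.array([239.0, 511.0, 911.0])

# Least-squares linear recalibration E = gain * channel + offset for this spectrum
gain, offset = np.polyfit(observed_channels, reference_energies_kev, 1)

def channel_to_energy(channel):
    return gain * channel + offset

# Any ROI can now be expressed in energy rather than channel, so it tracks the drift
print(channel_to_energy(400.0))
```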

Uckan, Taner [ORNL] [ORNL; March-Leuba, Jose A [ORNL] [ORNL; Brukiewa, Patrick D [ORNL] [ORNL; Upadhyaya, Belle R [ORNL] [ORNL

2010-01-01

414

Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis  

SciTech Connect

Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

2000-11-01

415

Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor  

NASA Astrophysics Data System (ADS)

The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, the HTR transient analysis is more demanding since it requires proper treatment of both slower and much longer transients (of time scale in hours and days) and fast and short transients (of time scale in minutes and seconds). There is limited operation and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high fidelity coupled multi-physics models subsequently implemented in robust, efficient, and accurate computational tools to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study provided a contribution to a greater accuracy of neutronics calculations by including the feedback from thermal hydraulics driven temperature calculation and various multi-physics effects that can influence it. Consideration of the feedback due to the influence of leakage was taken into account by development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create the system that is efficient and stable over the duration of transient calculations that last over several tens of hours. Another achievement of the PhD thesis was development and demonstration of full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Investigation of different aspects of the coupled methodology and development of efficient kinetics treatment for the PBMR were carried out, which accounts for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, which resulted in implementation of improved mapping methodology based on user defined criteria. The second aspect that was studied and optimized is the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time step selection algorithms. Coupled code convergence was achieved, supplemented by the application of acceleration methods. Finally, the modeling of all feedback phenomena in PBMRs was investigated and a novel treatment of cross-section dependencies was introduced for improving the representation of cross-section variations. The added benefit was that in the process of studying and improving the coupled multi-physics methodology more insight was gained into the physics and dynamics of PBMR, which will also help to optimize the PBMR design and improve its safety.
One unique contribution of the PhD research is the investigation of the importance of the correct representation of the three-dimensional (3-D) effects in the PBMR analysis. The performed studies demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.
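
At its core, the neutronics/thermal-hydraulics feedback loop described above is a fixed-point (Picard) iteration between two solvers. The sketch below uses scalar stand-ins for the two field solves with invented feedback coefficients; it is illustrative only and does not represent the NEM/THERMIX models.

```python
def neutronics_solve(fuel_temp_k):
    """Stand-in for the neutronics solve: power falls as fuel temperature rises
    (Doppler-like negative feedback). Returns core power in MW."""
    return 400.0 * (1.0 - 2.0e-5 * (fuel_temp_k - 900.0))

def thermal_hydraulics_solve(power_mw):
    """Stand-in for the thermal-hydraulics solve: fuel temperature rises with power."""
    return 600.0 + 0.8 * power_mw

def coupled_steady_state(tol=1e-6, max_iter=100):
    """Picard iteration: alternate the two solves until the exchanged field converges."""
    temp = 900.0
    for _ in range(max_iter):
        power = neutronics_solve(temp)
        new_temp = thermal_hydraulics_solve(power)
        if abs(new_temp - temp) < tol:
            return power, new_temp
        temp = new_temp
    raise RuntimeError("coupled iteration did not converge")

print(coupled_steady_state())
```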

Mkhabela, Peter Tshepo

416

Modeling and analysis of core debris recriticality during hypothetical severe accidents in the Advanced Neutron Source Reactor  

SciTech Connect

This paper discusses salient aspects of severe-accident-related recriticality modeling and analysis in the Advanced Neutron Source (ANS) reactor. The development of an analytical capability using the KENO V.A-SCALE system is described including evaluation of suitable nuclear cross-section sets to account for the effects of system geometry, mixture temperature, material dispersion and other thermal-hydraulic conditions. Benchmarking and validation efforts conducted with KENO V.A-SCALE and other neutronic codes against critical experiment data are described. Potential deviations and biases resulting from use of the 16-group Hansen-Roach library are shown. A comprehensive test matrix of calculations to evaluate the threat of a recriticality event in the ANS is described. Strong dependencies on geometry, material constituents, and thermal-hydraulic conditions are described. The introduction of designed mitigative features is described.

Taleyarkhan, R.P.; Kim, S.H.; Slater, C.O.; Moses, D.L.; Simpson, D.B.; Georgevich, V.

1993-05-01

417

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)]; and others

1995-01-01

418

Cardiovascular risk analysis by means of pulse morphology and clustering methodologies.  

PubMed

The purpose of this study was the development of a clustering methodology to deal with arterial pressure waveform (APW) parameters to be used in cardiovascular risk assessment. One hundred sixteen subjects were monitored and divided into two groups. The first one (23 hypertensive subjects) was analyzed using APW and biochemical parameters, while the remaining 93 healthy subjects were evaluated only through APW parameters. The expectation maximization (EM) and k-means algorithms were used in the cluster analysis, and the risk scores (the Framingham Risk Score (FRS), the Systematic COronary Risk Evaluation (SCORE) project, the Assessing cardiovascular risk using Scottish Intercollegiate Guidelines Network (ASSIGN) score, and the PROspective Cardiovascular Münster (PROCAM) score), commonly used in clinical practice, were selected for the cluster risk validation. The results from the clustering risk analysis showed a very significant correlation with ASSIGN (r=0.582, p<0.01) and a significant correlation with FRS (r=0.458, p<0.05). The results from the comparison of both groups also allowed identification of the cluster with higher cardiovascular risk in the healthy group. These results give new insights for exploring this methodology in future scoring trials. PMID:25023535
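
A minimal sketch of the clustering-and-validation step (k-means on waveform-derived features, then a correlation check against an established risk score). The feature set, data, and score values are synthetic placeholders, not the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
# Hypothetical APW features, one row per subject (e.g. augmentation index,
# pulse wave velocity, systolic upstroke time), plus a placeholder risk score.
apw_features = rng.normal(size=(40, 3))
risk_score = rng.uniform(1, 30, size=40)

# Two clusters (lower / higher cardiovascular risk), as in the k-means arm of the study
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(apw_features)

# Validate the cluster assignment against an established clinical score
r, p = pearsonr(labels, risk_score)
print(f"correlation with risk score: r={r:.2f}, p={p:.3f}")
```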

Almeida, Vânia G; Borba, J; Pereira, H Catarina; Pereira, Tânia; Correia, Carlos; Pêgo, Mariano; Cardoso, João

2014-11-01

419

Final Report, NERI Project: ''An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model''  

SciTech Connect

The present generation of reactor analysis methods uses few-group nodal diffusion approximations to calculate full-core eigenvalues and power distributions. The cross sections, diffusion coefficients, and discontinuity factors (collectively called "group constants") in the nodal diffusion equations are parameterized as functions of many variables, ranging from the obvious (temperature, boron concentration, etc.) to the more obscure (spectral index, moderator temperature history, etc.). These group constants, and their variations as functions of the many variables, are calculated by assembly-level transport codes. The current methodology has two main weaknesses that this project addressed. The first weakness is the diffusion approximation in the full-core calculation; this can be significantly inaccurate at interfaces between different assemblies. This project used the nodal diffusion framework to implement nodal quasidiffusion equations, which can capture transport effects to an arbitrary degree of accuracy. The second weakness is in the parameterization of the group constants; current models do not always perform well, especially at interfaces between unlike assemblies. The project developed a theoretical foundation for parameterization and homogenization models and used that theory to devise improved models. The new models were extended to tabulate information that the nodal quasidiffusion equations can use to capture transport effects in full-core calculations.

Dmitriy Y. Anistratov; Marvin L. Adams; Todd S. Palmer; Kord S. Smith; Kevin Clarno; Hikaru Hiruta; Razvan Nes

2003-08-04

420

FRETmatrix: a general methodology for the simulation and analysis of FRET in nucleic acids  

PubMed Central

Förster resonance energy transfer (FRET) is a technique commonly used to unravel the structure and conformational changes of biomolecules being vital for all living organisms. Typically, FRET is performed using dyes attached externally to nucleic acids through a linker that complicates quantitative interpretation of experiments because of dye diffusion and reorientation. Here, we report a versatile, general methodology for the simulation and analysis of FRET in nucleic acids, and demonstrate its particular power for modelling FRET between probes possessing limited diffusional and rotational freedom, such as our recently developed nucleobase analogue FRET pairs (base–base FRET). These probes are positioned inside the DNA/RNA structures as a replacement for one of the natural bases, thus, providing unique control of their position and orientation and the advantage of reporting from inside sites of interest. In demonstration studies, not requiring molecular dynamics modelling, we obtain previously inaccessible insight into the orientation and nanosecond dynamics of the bases inside double-stranded DNA, and we reconstruct high resolution 3D structures of kinked DNA. The reported methodology is accompanied by a freely available software package, FRETmatrix, for the design and analysis of FRET in nucleic acid containing systems. PMID:22977181

Preus, Søren; Kilsa, Kristine; Miannay, Francois-Alexandre; Albinsson, Bo; Wilhelmsson, L. Marcus

2013-01-01

421

Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans  

PubMed Central

Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

de Quiros, Yara Bernaldo; Gonzalez-Diaz, Oscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; Di Guardo, Giovanni; Fernandez, Antonio

2011-01-01

422

Systems Approaches to Animal Disease Surveillance and Resource Allocation: Methodological Frameworks for Behavioral Analysis  

PubMed Central

While demands for animal disease surveillance systems are growing, there has been little applied research that has examined the interactions between resource allocation, cost-effectiveness, and behavioral considerations of actors throughout the livestock supply chain in a surveillance system context. These interactions are important as feedbacks between surveillance decisions and disease evolution may be modulated by their contextual drivers, influencing the cost-effectiveness of a given surveillance system. This paper identifies a number of key behavioral aspects involved in animal health surveillance systems and reviews some novel methodologies for their analysis. A generic framework for analysis is discussed, with exemplar results provided to demonstrate the utility of such an approach in guiding better disease control and surveillance decisions. PMID:24348922

Rich, Karl M.; Denwood, Matthew J.; Stott, Alistair W.; Mellor, Dominic J.; Reid, Stuart W. J.; Gunn, George J.

2013-01-01

423

Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans  

NASA Astrophysics Data System (ADS)

Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; di Guardo, Giovanni; Fernández, Antonio

2011-12-01

424

Multidimensional TMI-1 Main-Steam-Line-Break Analysis Methodology Using TRAC-PF/NEM  

SciTech Connect

A comparison of a point-kinetics calculation and a full three-dimensional thermal-hydraulic/kinetics calculation using TRAC-PF1/NEM is presented. The coupled TRAC-PF1/NEM methodology uses version 5.4 of the TRAC-PF1/MOD2 code, developed by the Los Alamos National Laboratory, and a special kinetics module, developed at The Pennsylvania State University and based on the nodal expansion method. Cross sections are obtained from two-dimensional tables generated using CASMO-3. The results of the analysis show that the point-kinetics calculation is conservative and predicts a return to power. The three-dimensional analysis shows no return to power despite an extended overfeeding of the affected generator with feedwater. The difference is believed to be caused by the inability of the standard point-kinetics method to properly account for the moderator density feedback, local effects, and flux redistribution, which occur during the transient.

Ivanov, Kostadin N. [Pennsylvania State University (United States); Beam, Tara M. [Pennsylvania State University (United States); Baratta, Anthony J. [Pennsylvania State University (United States); Irani, Ardesar; Trikouros, Nicholas G

2001-02-15

425

Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks  

NASA Technical Reports Server (NTRS)

Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.

Brown, Richard Lee

2008-01-01

426

Health effects models for nuclear power plant accident consequence analysis: Low LET radiation  

SciTech Connect

This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". The category "other" cancers is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.

Evans, J.S. (Harvard Univ., Boston, MA (USA). School of Public Health)

1990-01-01

427

Interdisciplinary safety analysis of complex socio-technological systems based on the functional resonance accident model: An application to railway traffic supervision  

Microsoft Academic Search

This paper presents an application of functional resonance accident models (FRAM) for the safety analysis of complex socio-technological systems, i.e. systems which include not only technological, but also human and organizational components. The supervision of certain industrial domains provides a good example of such systems, because although more and more actions for piloting installations are now automatized, there always remains

Fabien Belmonte; Walter Schön; Laurent Heurley; Robert Capel

2011-01-01

428

Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models  

Microsoft Academic Search

Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear

J. S. Evans; D. W. Moeller; D. W. Cooper

1985-01-01

429

Radiation and epidemiological analysis for cancer incidence among nuclear workers who took part in recovery operations following the accident at the Chernobyl NPP  

Microsoft Academic Search

Results of the analysis of the relationship between dose and cancer incidence among nuclear workers who were liquidators of the Chernobyl accident are given in the paper. Information on this cohort of individuals is accumulated at the regional centre of the Russian National Medical and Dosimetric Registry, which is operated at the RF State Research Centre - Institute of Biophysics. Medical and dosimetric

430

Modeling and sensitivity analysis of transport and deposition of radionuclides from the Fukushima Daiichi accident  

NASA Astrophysics Data System (ADS)

The atmospheric transport and ground deposition of the radioactive isotopes ¹³¹I and ¹³⁷Cs during and after the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident (March 2011) are investigated using the Weather Research and Forecasting/Chemistry (WRF/Chem) model. The aim is to assess the skill of WRF in simulating these processes and the sensitivity of the model's performance to various parameterizations of unresolved physics. The WRF/Chem model is first upgraded by implementing a radioactive decay term into the advection-diffusion solver and adding three parameterizations for dry deposition and two parameterizations for wet deposition. Different microphysics and horizontal turbulent diffusion schemes are then tested for their ability to reproduce observed meteorological conditions. Subsequently, the influence on the simulated transport and deposition of the characteristics of the emission source, including the emission rate, the gas partitioning of ¹³¹I and the size distribution of ¹³⁷Cs, is examined. The results show that the model can predict the wind fields and rainfall realistically. The ground deposition of the radionuclides can also potentially be captured well, but it is very sensitive to the emission characterization. It is found that the total deposition is most influenced by the emission rate for both ¹³¹I and ¹³⁷Cs, while it is less sensitive to the dry deposition parameterizations. Moreover, for ¹³¹I, the deposition is also sensitive to the microphysics schemes, the horizontal diffusion schemes, gas partitioning and wet deposition parameterizations, while for ¹³⁷Cs, the deposition is very sensitive to the microphysics schemes and wet deposition parameterizations, and it is also sensitive to the horizontal diffusion schemes and the size distribution.
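
A first-order radioactive decay term of the kind described above is commonly added to a transport solver by operator splitting: after each advection-diffusion step the tracer field is multiplied by exp(-lambda*dt). The sketch below assumes that splitting approach and uses the physical half-lives of ¹³¹I (about 8.02 days) and ¹³⁷Cs (about 30.17 years); it is not the WRF/Chem implementation itself.

```python
import numpy as np

# Half-lives: I-131 about 8.02 days, Cs-137 about 30.17 years (physical constants).
HALF_LIFE_S = {"I131": 8.02 * 86400.0, "Cs137": 30.17 * 365.25 * 86400.0}

def decay_step(concentration, species, dt_s):
    """Apply first-order radioactive decay to a tracer field after a transport step.

    Mimics the usual operator-splitting approach: transport (advection, diffusion,
    deposition) is handled by the host model, then each tracer is multiplied by
    exp(-lambda * dt) once per model time step.
    """
    lam = np.log(2.0) / HALF_LIFE_S[species]   # decay constant (1/s)
    return concentration * np.exp(-lam * dt_s)

# Illustrative use: decay a 3-D concentration field over a 60 s model step
c_i131 = np.ones((10, 20, 20))                 # placeholder field (Bq/m^3)
c_i131 = decay_step(c_i131, "I131", dt_s=60.0)
```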

Hu, X.; Li, D.; Huang, H.; Shen, S.; Bou-Zeid, E.

2014-01-01

431

Discrete crack growth analysis methodology for through cracks in pressurized fuselage structures  

NASA Technical Reports Server (NTRS)

A methodology for simulating the growth of long through cracks in the skin of pressurized aircraft fuselage structures is described. Crack trajectories are allowed to be arbitrary and are computed as part of the simulation. The interaction between the mechanical loads acting on the superstructure and the local structural response near the crack tips is accounted for by employing a hierarchical modeling strategy. The structural response for each cracked configuration is obtained using a geometrically nonlinear shell finite element analysis procedure. Four stress intensity factors, two for membrane behavior and two for bending using Kirchhoff plate theory, are computed using an extension of the modified crack closure integral method. Crack trajectories are determined by applying the maximum tangential stress criterion. Crack growth results in localized mesh deletion, and the deletion regions are remeshed automatically using a newly developed all-quadrilateral meshing algorithm. The effectiveness of the methodology and its applicability to performing practical analyses of realistic structures are demonstrated by simulating curvilinear crack growth in a fuselage panel that is representative of a typical narrow-body aircraft. The predicted crack trajectory and fatigue life compare well with measurements of these same quantities from a full-scale pressurized panel test.
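
The maximum tangential stress criterion referenced above has a closed-form kink angle for combined membrane mode I/II loading (the classical Erdogan-Sih result); the sketch below evaluates it. The bending stress intensity factors that the paper also computes are not included here, and the sample K values are hypothetical.

```python
import math

def mts_kink_angle(k1, k2):
    """Crack-kink angle (radians) from the maximum tangential stress criterion.

    Uses the Erdogan-Sih condition for combined mode I/II membrane loading:
        K_I * sin(theta) + K_II * (3*cos(theta) - 1) = 0
    This is a membrane-only illustration; bending terms are omitted.
    """
    if k2 == 0.0:
        return 0.0  # pure mode I: crack grows straight ahead
    return 2.0 * math.atan((k1 - math.sqrt(k1**2 + 8.0 * k2**2)) / (4.0 * k2))

# Illustrative use with hypothetical stress intensity factors (MPa*sqrt(m))
print(math.degrees(mts_kink_angle(30.0, 5.0)))   # roughly -18 degrees
```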

Potyondy, David O.; Wawrzynek, Paul A.; Ingraffea, Anthony R.

1994-01-01

432

Intelligent GIS-Based Road Accident Analysis and Real-Time Monitoring Automated System using WiMAX\\/GPRS  

Microsoft Academic Search

Reducing the number of road accidents has been a major concern for the public and the government, especially in Malaysia, where accidents pose a serious threat to the country. The Malaysian government has spent millions through several modes of campaign in order to reduce the number of accident occurrences. Unfortunately, from year to year the number keeps

Ahmad Rodzi Mahmud; Ehsan Zarrinbashar

433

A Tool and Methodology for AC-Stability Analysis of Continuous-Time Closed-Loop Systems  

Microsoft Academic Search

Presented are a methodology and a DFII-based tool for AC-stability analysis of a wide variety of closed-loop continuous-time systems (operational amplifiers and other linear circuits). The methodology used allows for easy identification and diagnostics of AC-stability problems, including not only main-loop effects but also local-instability loops in current mirrors, bias circuits and emitter or source followers, without breaking the loop. The
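
The abstract does not spell out the underlying checks, but AC-stability diagnostics ultimately come down to inspecting loop gain and phase margin. The sketch below is a generic phase-margin estimate from a sampled loop-gain response, not the DFII-based, non-loop-breaking method of the paper; the single-pole loop at the end is a made-up test case.

```python
import numpy as np

def phase_margin(freq_hz, loop_gain):
    """Estimate phase margin from a sampled loop-gain frequency response.

    freq_hz   -- ascending array of frequencies (Hz)
    loop_gain -- complex loop gain T(jw) at those frequencies
    Returns (unity-gain crossover frequency, phase margin in degrees),
    or (None, None) if |T| never crosses unity within the sweep.
    """
    mag = np.abs(loop_gain)
    idx = np.where((mag[:-1] >= 1.0) & (mag[1:] < 1.0))[0]
    if idx.size == 0:
        return None, None
    i = idx[0]
    # Linear interpolation of the unity-gain crossover between samples i and i+1
    w = (1.0 - mag[i]) / (mag[i + 1] - mag[i])
    f_c = freq_hz[i] + w * (freq_hz[i + 1] - freq_hz[i])
    ph = np.unwrap(np.angle(loop_gain))
    ph_c = np.degrees(ph[i] + w * (ph[i + 1] - ph[i]))
    return f_c, 180.0 + ph_c

# Illustrative single-pole loop: T(jf) = A0 / (1 + j*f/p), A0 = 1000, p = 1 kHz
f = np.logspace(1, 8, 2000)
T = 1e3 / (1.0 + 1j * f / 1e3)
print(phase_margin(f, T))   # crossover near 1 MHz, phase margin near 90 degrees
```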

Momchil Milev; Rod Burt

2007-01-01

434

A Tool and Methodology for AC-Stability Analysis of Continuous-Time Closed-Loop Systems  

Microsoft Academic Search

Presented are a methodology and a DFII-based tool for AC-stability analysis of a wide variety of closed-loop continuous-time systems (operational amplifiers and other linear circuits). The methodology used allows for easy identification and diagnostics of AC-stability problems, including not only main-loop effects but also local-instability loops in current mirrors, bias circuits and emitter or source followers, without breaking the loop. The

Momchil Milev; Rod Burt

2005-01-01

435

Evaluation of safety assessment methodologies in Rocky Flats Risk Assessment Guide (1985) and Building 707 Final Safety Analysis Report (1987)  

SciTech Connect

Rockwell International, as operating contractor at the Rocky Flats Plant, conducted a safety analysis program during the 1980s. That effort resulted in Final Safety Analysis Reports (FSARs) for several buildings, one of them being the Building 707 Final Safety Analysis Report, June 1987 (707FSAR), and a Plant Safety Analysis Report. The Rocky Flats Risk Assessment Guide, March 1985 (RFRAG85), documents the methodologies that were used for those FSARs. Resources available for preparation of those Rocky Flats FSARs were very limited. After addressing the more pressing safety issues, some of which are described below, the present contractor (EG&G) intends to conduct a program of upgrading the FSARs. This report presents the results of a review of the methodologies described in RFRAG85 and 707FSAR and contains suggestions that might be incorporated into the methodology for the FSAR upgrade effort.

Walsh, B.; Fisher, C.; Zigler, G.; Clark, R.A. [Science and Engineering Associates, Inc., Albuquerque, NM (United States)

1990-11-09

436

Wafer fab construction cost analysis and cost reduction strategies: applications of SEMATECH's future factory analysis methodology  

Microsoft Academic Search

This paper discusses semiconductor wafer fabrication (fab) factory construction costs as they relate to emerging technologies. The generation of factories studied represents facilities supporting 200 mm wafers and products utilizing 0.25 micron line-width geometries. An analytical approach to categorizing and evaluating fab costs is presented, together with a Pareto analysis of four recent factories' costs. The organization of fab construction
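
A Pareto analysis of construction costs, as mentioned above, amounts to ranking cost categories and accumulating their shares of the total. The sketch below uses hypothetical category names and dollar figures; the paper's actual cost breakdown is not reproduced.

```python
# Minimal Pareto-analysis sketch with hypothetical cost categories and values ($M).
costs = {
    "cleanroom shell and fit-out": 120.0,
    "process utilities": 85.0,
    "HVAC and exhaust": 60.0,
    "electrical distribution": 40.0,
    "site work and other": 25.0,
}

total = sum(costs.values())
running = 0.0
print(f"{'category':32s} {'share':>7s} {'cum.':>7s}")
for name, value in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
    running += value
    print(f"{name:32s} {value / total:7.1%} {running / total:7.1%}")
```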

David Art; Michael O'Halloran; Brian Butler

1994-01-01

437

BWR loss-of-coolant accident analysis capability of the WRAP-EM system  

Microsoft Academic Search

The modular computational system known as the Water Reactor Analysis Package (WRAP) has been extended to provide the computational tools required to perform a complete analysis of LOCAs in BWRs. The new system is known as the WRAP-EM (Evaluation Model) system and will be used by NRC in interpreting and evaluating reactor vendor EM methods and computed results. The system

M. R. Buckner; R. R. Beckmeyer; M. V. Gregory; R. L. Reed; W. G. Winn

1979-01-01

438

Shipping container response to severe highway and railway accident conditions: Appendices  

SciTech Connect

Volume 2 contains the following appendices: Severe accident data; truck accident data; railroad accident data; highway survey data and bridge column properties; structural analysis; thermal analysis; probability estimation techniques; and benchmarking for computer codes used in impact analysis. (LN)

Fischer, L.E.; Chou, C.K.; Gerhard, M.A.; Kimura, C.Y.; Martin, R.W.; Mensing, R.W.; Mount, M.E.; Witte, M.C.

1987-02-01

439