These are representative sample records from Science.gov related to your search topic.
For comprehensive and current results, perform a real-time search at Science.gov.
1

Linguistic methodology for the analysis of aviation accidents  

NASA Technical Reports Server (NTRS)

A linguistic method for the analysis of small group discourse was developed, and its use on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or other linguistic concepts such as speech act and topic; it tests hypotheses that establish the significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.

Goguen, J. A.; Linde, C.

1983-01-01

2

Risk Estimation Methodology for Launch Accidents.  

SciTech Connect

As compact and lightweight power sources with reliable, long lives, Radioisotope Power Systems (RPSs) have made space missions to explore the solar system possible. Because hazardous material can be released during a launch accident, the potential health risk of an accident must be quantified so that appropriate launch approval decisions can be made. One part of the risk estimation involves modeling the response of the RPS to potential accident environments. Because deterministic modeling of the full RPS response to the dynamic variables involved is impractical, the evaluation is performed stochastically with a Monte Carlo simulation. The potential consequences can be determined by modeling the transport of the hazardous material in the environment and through human biological pathways. The consequence analysis results are summed and weighted by appropriate likelihood values to give a collection of probabilistic results for the estimation of the potential health risk. This information is used to guide RPS designs, spacecraft designs, mission architecture, or launch procedures to potentially reduce the risk, as well as to inform decision makers of the potential health risks resulting from the use of RPSs for space missions.

Clayton, Daniel James; Lipinski, Ronald J.; Bechtel, Ryan D.

2014-02-01
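
The likelihood-weighted Monte Carlo scheme described above can be illustrated with a minimal sketch. Everything here is invented for illustration (scenario names, likelihoods, the exponential release-fraction model, and the linear consequence function); it shows only the structure of the calculation: sample the RPS response per accident environment, compute consequences, then weight by likelihood and sum.

```python
# A sketch with invented scenarios and consequence model; structure only.
import random

SCENARIOS = [  # hypothetical accident environments with per-launch likelihoods
    {"name": "on-pad explosion",   "likelihood": 1e-3, "release_mean": 0.02},
    {"name": "ascent breakup",     "likelihood": 5e-4, "release_mean": 0.05},
    {"name": "reentry from orbit", "likelihood": 1e-4, "release_mean": 0.10},
]

def consequence(release_fraction):
    """Toy consequence model: population dose (person-Sv) per unit release."""
    return 4.0e3 * release_fraction          # assumed linear dose coefficient

def monte_carlo_risk(n_samples=100_000, seed=1):
    rng = random.Random(seed)
    risk = 0.0
    for sc in SCENARIOS:
        total = 0.0
        for _ in range(n_samples):
            # Sample the stochastic RPS response to this accident environment.
            frac = min(1.0, rng.expovariate(1.0 / sc["release_mean"]))
            total += consequence(frac)
        # Weight the scenario's mean consequence by its likelihood and sum.
        risk += sc["likelihood"] * (total / n_samples)
    return risk                               # expected person-Sv per launch

print(f"likelihood-weighted risk: {monte_carlo_risk():.3e} person-Sv/launch")
```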

3

Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis  

Microsoft Academic Search

A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the

C. Mueller; J. Roglans-Ribas; S. Folga; A. Huttenga; R. Jackson; W. TenBrook; J. Russell

1994-01-01

4

Applying STAMP in Accident Analysis  

NASA Technical Reports Server (NTRS)

Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

2003-01-01

5

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.

1931-01-01

6

Assessment methodology for confidence in safety margin for large break loss of coolant accident sequences  

Microsoft Academic Search

Deterministic Safety Analysis and Probabilistic Safety Assessment (PSA) analyses are used to assess the Nuclear Power Plant (NPP) safety. The conventional deterministic analysis is conservative. The best estimate plus uncertainty analysis (BEPU) is increasingly being used for deterministic calculation in NPPs. The PSA methodology integrates information about the postulated accident, plant design, operating practices, component reliability and human behavior. The

Mahendra Prasad; R. S. Rao; S. K. Gupta

2011-01-01

7

Accident Tolerant Fuel Analysis  

SciTech Connect

Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and evaluate margin recovery strategies.

Curtis Smith; Heather Chichester; Jesse Johns; Melissa Teague; Michael Tonks; Robert Youngblood

2014-09-01

8

ACCIDENT TOLERANT FUEL ANALYSIS  

SciTech Connect

Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and evaluate margin recovery strategies.

Smith, Curtis [Idaho National Laboratory]; Chichester, Heather [Idaho National Laboratory]; Johns, Jesse [Texas A&M University]; Teague, Melissa [Idaho National Laboratory]; Tonks, Michael [Idaho National Laboratory]; Youngblood, Robert [Idaho National Laboratory]

2014-09-01

9

Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group  

SciTech Connect

The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.

Brereton, S.; Shinn, J. [Lawrence Livermore National Lab., CA (United States)]; Hesse, D. [Battelle Columbus Labs., OH (United States)]; Kaninich, D. [Westinghouse Savannah River Co., Aiken, SC (United States)]; Lazaro, M. [Argonne National Lab., IL (United States)]; Mubayi, V. [Brookhaven National Lab., Upton, NY (United States)]

1997-08-01

10

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

1929-01-01

11

Methodology for fitting and updating predictive accident models with trend.  

PubMed

Reliable predictive accident models (PAMs) (also referred to as Safety Performance Functions (SPFs)) have a variety of important uses in traffic safety research and practice. They are used to help identify sites in need of remedial treatment, in the design of transport schemes to assess safety implications, and to estimate the effectiveness of remedial treatments. The PAMs currently in use in the UK are now quite old; the data used in their development was gathered up to 30 years ago. Many changes have occurred over that period in road and vehicle design, in road safety campaigns and legislation, and the national accident rate has fallen substantially. It seems unlikely that these ageing models can be relied upon to provide accurate and reliable predictions of accident frequencies on the roads today. This paper addresses a number of methodological issues that arise in seeking practical and efficient ways to update PAMs, whether by re-calibration or by re-fitting. Models for accidents on rural single carriageway roads have been chosen to illustrate these issues, including the choice of distributional assumption for overdispersion, the choice of goodness of fit measures, questions of independence between observations in different years, and between links on the same scheme, the estimation of trends in the models, the uncertainty of predictions, as well as considerations about the most efficient and convenient ways to fit the required models. PMID:23612560

Connors, Richard D; Maher, Mike; Wood, Alan; Mountain, Linda; Ropkins, Karl

2013-07-01
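
A minimal sketch of the modeling task discussed above: fitting a negative binomial (overdispersed) predictive accident model with flow, length, and trend terms. The simulated data, the SPF functional form, and the use of statsmodels are all assumptions for illustration; this does not reproduce the authors' UK models.

```python
# A sketch assuming statsmodels/numpy; data are simulated, not real UK road data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Illustrative covariates: traffic flow (AADT), link length (km), study year.
aadt = rng.uniform(2_000, 20_000, n)
length = rng.uniform(0.5, 5.0, n)
year = rng.integers(0, 10, n)          # years since start of study period

# Assumed SPF form: mu = exp(b0 + b1*ln(AADT) + b2*ln(length) + trend*year).
mu = np.exp(-7.0 + 0.8 * np.log(aadt) + 1.0 * np.log(length) - 0.03 * year)
alpha = 0.6                            # negative binomial overdispersion
y = rng.negative_binomial(1 / alpha, 1 / (1 + alpha * mu))

# Fit the negative binomial GLM with log link (alpha fixed here for brevity;
# in practice it is estimated, e.g. by maximum likelihood).
X = sm.add_constant(np.column_stack([np.log(aadt), np.log(length), year]))
fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=alpha)).fit()
print(fit.params)   # intercept, flow exponent, length exponent, annual trend
```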

12

Behavior Analysis: Methodological Foundations.  

ERIC Educational Resources Information Center

Behavior analysis provides a unique way of coming to understand intrapersonal and interpersonal communication behaviors, and focuses on control techniques available to a speaker and counter-control techniques available to a listener. "Time-series methodology" is a convenient term because it subsumes under one label a variety of baseline or…

Owen, James L.

13

A systemic analysis of the Edge Hill railway accident.  

PubMed

The Edge Hill railway accident occurred on Sunday 9 May 1999 in Liverpool, England. An Engineers' scrap train struck a plant quality supervisor. This paper presents the results of a systemic analysis of the accident. The methodology has been to compare the features of the Edge Hill accident with the structural organization (i.e. systems 1-5) of a Systemic Safety Management System (SSMS) model, which has been constructed by employing the concepts of systems. A number of systemic failures have come to light. The findings are related to causal factors of failure of systems 1-5 as well as missing channels of communication amongst those involved in the maintenance work. It is hoped that this systemic analysis will help to identify 'learning points', which are relevant for preventing accidents; especially accidents involving track-side workers. PMID:19819361

Santos-Reyes, Jaime; Beard, Alan N

2009-11-01

14

Accident progression event tree analysis for postulated severe accidents at N Reactor  

SciTech Connect

A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.

Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M. (Sandia National Labs., Albuquerque, NM (USA)); Medford, G.T. (Science Applications International Corp., Albuquerque, NM (USA))

1990-06-01
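
The core mechanic of an accident progression event tree is to multiply conditional branch probabilities along each path from core damage onward, then bin the paths into release categories for the source term model. A toy sketch, with invented questions, probabilities, and binning rules (not N Reactor values):

```python
# A toy APET; in a real tree, branch probabilities are conditional on the
# branches taken at earlier questions. All values here are invented.
from itertools import product

QUESTIONS = [
    ("vessel breach",       {"yes": 0.4, "no": 0.6}),
    ("sprays available",    {"work": 0.9, "fail": 0.1}),
    ("confinement failure", {"early": 0.05, "late": 0.15, "none": 0.80}),
]

def paths():
    """Yield (branch choices, path probability) for every combination."""
    names = [name for name, _ in QUESTIONS]
    for combo in product(*(branches.items() for _, branches in QUESTIONS)):
        prob = 1.0
        for _, p in combo:
            prob *= p
        yield dict(zip(names, (b for b, _ in combo))), prob

# Bin each path into a coarse release category for the source term model.
bins = {}
for path, prob in paths():
    if path["confinement failure"] == "none":
        category = "negligible release"
    elif path["vessel breach"] == "yes" and path["confinement failure"] == "early":
        category = "large early release"
    else:
        category = "delayed or mitigated release"
    bins[category] = bins.get(category, 0.0) + prob

for category, p in sorted(bins.items(), key=lambda kv: -kv[1]):
    print(f"{category:28s} {p:.3f}")
```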

15

Single pilot IFR accident data analysis  

NASA Technical Reports Server (NTRS)

The aircraft accident data recorded by the National Transportation Safety Board (NTSB) for 1964-1979 were analyzed to determine what problems exist in the general aviation (GA) single pilot instrument flight rule (SPIFR) environment. A previous study conducted in 1978 for the years 1964-1975 provided a basis for comparison. This effort was generally limited to SPIFR pilot error landing phase accidents, but includes some SPIFR takeoff and enroute accident analysis as well as some dual pilot IFR accident analysis for comparison. Analysis was performed for 554 accidents, of which 39% (216) occurred during the years 1976-1979.

Harris, D. F.

1983-01-01

16

Severe accident analysis using dynamic accident progression event trees  

Microsoft Academic Search

At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time consuming, difficult to reproduce, and can be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs: in conventional event tree techniques, the sequence of events is pre-determined in a fixed order

Aram P. Hakobyan

2006-01-01

17

Aircraft Loss-of-Control Accident Analysis  

NASA Technical Reports Server (NTRS)

Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

Belcastro, Christine M.; Foster, John V.

2010-01-01

18

A systems approach to food accident analysis  

E-print Network

Foodborne illnesses lead to 3000 deaths per year in the United States. Some industries, such as aviation, have made great strides in increasing safety through careful accident analysis leading to changes in industry practices. ...

Helferich, John D

2011-01-01

19

A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences  

SciTech Connect

This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs.

Budnitz, R.J.; Lambert, H.E.; Apostolakis, G.; and others

1998-04-01
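
Seismic precursor analyses of this kind rely on component fragility curves, conventionally lognormal: P(fail | a) = Φ(ln(a/A_m)/β), where A_m is the median ground-acceleration capacity and β the composite log-standard deviation. A sketch with placeholder values (not entries from the report's data base):

```python
# Lognormal fragility curve with placeholder capacity values.
from math import log
from statistics import NormalDist

def p_fail(a, a_median, beta):
    """Probability of component failure at peak ground acceleration a (g)."""
    return NormalDist().cdf(log(a / a_median) / beta)

# Illustrative component: median capacity 0.9 g, composite variability 0.45.
for a in (0.1, 0.3, 0.5, 0.9, 1.5):
    print(f"PGA {a:.1f} g -> P(fail) = {p_fail(a, 0.9, 0.45):.3f}")
```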

20

Probabilistic Accident Progression Analysis with application to a LMFBR design  

SciTech Connect

A method for probabilistic analysis of accident sequences in nuclear power plant systems referred to as ''Probabilistic Accident Progression Analysis'' (PAPA) is described. Distinctive features of PAPA include: (1) definition and analysis of initiator-dependent accident sequences on the component level; (2) a new fault-tree simplification technique; (3) a new technique for assessment of the effect of uncertainties in the failure probabilities on the probabilistic ranking of accident sequences; (4) techniques for quantification of dependent failures of similar components, including an iterative technique for high-population components. The methodology is applied to the Shutdown Heat Removal System (SHRS) of the Clinch River Breeder Reactor Plant during its short-term forced circulation phase of operation. Dependent failures are shown to make the highest contribution to the system unavailabilities for all of the initiators that are considered. The probability of failure of the SHRS in short-term forced circulation per year is estimated at 2.6 × 10⁻². Major contributors to this probability are the initiators loss of main feedwater system, loss of offsite power, and normal shutdown.

Jamali, K.M.

1982-01-01

21

An analysis of aircraft accidents involving fires  

NASA Technical Reports Server (NTRS)

All U. S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

1975-01-01

22

Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods  

SciTech Connect

The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

2000-07-31
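
The SPAR family of HRA methods quantifies a human error probability (HEP) by scaling a nominal HEP with performance shaping factors (PSFs), with an adjustment that keeps the result below 1.0 when several negative PSFs combine. The sketch below uses the formula from the later SPAR-H report (NUREG/CR-6883), not the 1994 ASP method itself, and the task and PSF values are invented.

```python
def hep(nominal_hep, psf_multipliers):
    """Nominal HEP scaled by the composite PSF, with the SPAR-H adjustment
    that bounds the result below 1.0 when several negative PSFs combine."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    if composite <= 1.0:          # neutral/beneficial context: plain scaling
        return nominal_hep * composite
    return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)

# Invented diagnosis task: nominal HEP 1e-2; PSFs for barely adequate time
# (10x), high stress (2x), and good experience (0.5x).
print(f"HEP = {hep(1e-2, [10, 2, 0.5]):.3e}")
```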

23

Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods  

SciTech Connect

The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

2000-08-01

24

HTGR severe accident sequence analysis  

SciTech Connect

Thermal-hydraulic, fission product transport, and atmospheric dispersion calculations are presented for hypothetical severe accident release paths at the Fort St. Vrain (FSV) high temperature gas cooled reactor (HTGR). Off-site radiation exposures are calculated for assumed release of 100% of the 24 hour post-shutdown core xenon and krypton inventory and 5.5% of the iodine inventory. The results show conditions under which dose avoidance measures would be desirable and demonstrate the importance of specific release characteristics such as effective release height. 7 tables.

Harrington, R.M.; Ball, S.J.; Kornegay, F.C.

1982-01-01

25

Anthropotechnological analysis of industrial accidents in Brazil.  

PubMed Central

The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worth while if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

Binder, M. C.; de Almeida, I. M.; Monteau, M.

1999-01-01

26

Risk analysis methodology survey  

NASA Technical Reports Server (NTRS)

NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

Batson, Robert G.

1987-01-01

27

Single pilot IFR accident data analysis  

NASA Technical Reports Server (NTRS)

The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in trends and cause-effect relationships reported in the earlier study. The increasing numbers have been tied to measures of activity to produce accident rates which in turn were analyzed in terms of change. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of involved pilots. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions and their unique attributes delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

Harris, D. F.; Morrisete, J. A.

1982-01-01

28

Accident patterns for construction-related workers: a cluster analysis  

NASA Astrophysics Data System (ADS)

The construction industry has been identified as one of the most hazardous industries. The risk of construction-related workers is far greater than that in a manufacturing-based industry. However, some steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, k-means clustering methodology is employed in specifying the factors related to different worker types and in identifying the patterns of industrial occupational accidents. Accident reports during the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that the cluster analysis can indicate some patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related workers. The findings provide a direction for more effective inspection strategies and injury prevention programs.

Liao, Chia-Wen; Tyan, Yaw-Yauan

2011-12-01
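
A minimal version of the clustering step, assuming scikit-learn and a toy numeric encoding of accident records; the features, group structure, and values are invented, not the study's Taiwanese inspection variables:

```python
# A sketch assuming scikit-learn; features and group structure are invented.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Toy numeric encoding of accident records: [age, experience_yr, fall_height_m,
# hour_of_day]; a real study would also one-hot encode categorical factors.
X = np.vstack([
    rng.normal([25, 2, 8, 10],  [3, 1, 2, 2], (40, 4)),  # young, inexperienced
    rng.normal([45, 15, 3, 14], [4, 4, 1, 2], (40, 4)),  # older, experienced
    rng.normal([35, 8, 12, 16], [5, 3, 3, 1], (40, 4)),  # mid-career, high falls
])

Xs = StandardScaler().fit_transform(X)        # k-means is scale sensitive
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(Xs)

for k in range(3):
    members = X[km.labels_ == k]
    print(f"cluster {k}: n={len(members)}, "
          f"mean [age, exp, height, hour] = {np.round(members.mean(axis=0), 1)}")
```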

29

Accident patterns for construction-related workers: a cluster analysis  

NASA Astrophysics Data System (ADS)

The construction industry has been identified as one of the most hazardous industries. The risk of construction-related workers is far greater than that in a manufacturing-based industry. However, some steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, k-means clustering methodology is employed in specifying the factors related to different worker types and in identifying the patterns of industrial occupational accidents. Accident reports during the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that the cluster analysis can indicate some patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related workers. The findings provide a direction for more effective inspection strategies and injury prevention programs.

Liao, Chia-Wen; Tyan, Yaw-Yauan

2012-01-01

30

Comparison of techniques for accident scenario analysis in hazardous systems  

Microsoft Academic Search

In this paper, three accident scenario analysis techniques are presented and compared regarding their efficiency versus the resources they demand. The complexity of modern industrial systems has prompted the development of accident analysis techniques that should thoroughly investigate accidents. The idea of criteria classification to fulfill this requirement has been proposed by other researchers and is examined here too. The comparison

Z. S. Nivolianitou; V. N. Leopoulos; M. Konstantinidou

2004-01-01

31

Recent Methodology in Ginseng Analysis  

PubMed Central

As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

2012-01-01

32

Comparing the Identification of Recommendations by Different Accident Investigators Using a Common Methodology  

NASA Technical Reports Server (NTRS)

Accident reports play a key role in the safety of complex systems. These reports present the recommendations that are intended to help avoid any recurrence of past failures. However, the value of these findings depends upon the causal analysis that helps to identify the reasons why an accident occurred. Various techniques have been developed to help investigators distinguish root causes from contributory factors and contextual information. This paper presents the results from a study into the individual differences that can arise when a group of investigators independently apply the same technique to identify the causes of an accident. This work is important if we are to increase the consistency and coherence of investigations following major accidents.

Johnson, Chris W.; Oltedal, H. A.; Holloway, C. M.

2012-01-01

33

Canister Storage Building (CSB) Design Basis Accident Analysis Documentation  

SciTech Connect

This document provides the detailed accident analysis to support ''HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A,'' ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

CROWE, R.D.

1999-09-09

34

Canister Storage Building (CSB) Design Basis Accident Analysis Documentation  

SciTech Connect

This document provided the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

CROWE, R.D.; PIEPHO, M.G.

2000-03-23

35

Canister storage building design basis accident analysis documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

KOPELIC, S.D.

1999-02-25

36

RAILROAD ACCIDENT RATES FOR USE IN TRANSPORTATION RISK ANALYSIS  

Microsoft Academic Search

Annual safety statistics published by FRA provide train accident counts for various groupings, such as railroad, accident type, cause, track type and class, train length, and speed. However, hazardous materials transportation risk analysis often requires more detailed accident rate statistics for specific combinations of these groupings. The statistics that are presented enable more precise determination of the probability that Class

Robert T Anderson; Christopher P. L. Barkan

2004-01-01

37

Preliminary risk analysis (PRA) is a methodology used in critical systems safety studies. It is primarily used at

E-print Network

Preliminary risk analysis (PRA) is a methodology used in critical systems safety studies (e.g., to estimate the severity of the accident). The preliminary risk analysis has been largely used in several industrial fields. Keywords: preliminary risk analysis (PRA); risk; potential accident; feared events; Automatic Train Control.

Paris-Sud XI, Université de

38

Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, ''Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR).'' All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

PIEPHO, M.G.

1999-10-20

39

Coupled thermal analysis applied to the study of the rod ejection accident  

SciTech Connect

An advanced methodology for the assessment of fuel-rod thermal margins under RIA conditions has been developed by AREVA NP SAS. With the emergence of RIA analytical criteria, the study of the Rod Ejection Accident (REA) would normally require the analysis of each fuel rod, slice by slice, over the whole core. Up to now the strategy used to overcome this difficulty has been to perform separate analyses of sampled fuel pins with conservative hypotheses for thermal properties and boundary conditions. In the advanced methodology, the evaluation model for the REA integrates the node-average fuel and coolant properties calculation for neutron feedback purposes as well as the peak fuel and coolant time-dependent properties for criteria checking. The calculation grid for peak fuel and coolant properties can be specified from the assembly pitch down to the cell pitch. The comparative analysis of methodologies shows that the coupled methodology makes it possible to reduce the excessive conservatism of the uncoupled approach. (authors)

Gonnet, M. [AREVA NP, TOUR AREVA - 1 Place Jean MILLIER, 92084 Paris La Defense Cedex (France)

2012-07-01

40

Bayesian Network-Based Road Traffic Accident Causality Analysis  

Microsoft Academic Search

Traffic accident causality analysis is an important aspect of the traffic safety research field. Based on data survey and statistical analysis, a Bayesian network for traffic accident causality analysis was developed. The structure and parameters of the Bayesian network were learned with the K2 algorithm and Bayesian parameter estimation, respectively. With the Junction Tree algorithm, the effect of road cross-section on

Xu Hongguo; Zhang Huiyong; Zong Fang

2010-01-01
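
The pipeline sketched in the abstract, building a Bayesian network and querying it for the effect of a road factor on accidents, might look as follows with pgmpy (an assumed library choice). For brevity the structure and conditional probability tables are specified by hand, whereas the paper learned them from survey data with K2 and Bayesian estimation; every variable name and number here is invented.

```python
# A sketch assuming pgmpy; names, structure, and all CPT values are invented.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import BeliefPropagation

# Hand-specified structure; the paper instead learned it from data with K2.
model = BayesianNetwork([("CrossSection", "Speeding"),
                         ("CrossSection", "Accident"),
                         ("Speeding", "Accident")])

cpd_cs = TabularCPD("CrossSection", 2, [[0.7], [0.3]])  # 0=divided, 1=undivided
cpd_sp = TabularCPD("Speeding", 2,
                    [[0.8, 0.6],    # P(no speeding | cross-section type)
                     [0.2, 0.4]],
                    evidence=["CrossSection"], evidence_card=[2])
cpd_ac = TabularCPD("Accident", 2,
                    [[0.99, 0.95, 0.97, 0.90],   # P(no accident | CS, Sp)
                     [0.01, 0.05, 0.03, 0.10]],
                    evidence=["CrossSection", "Speeding"], evidence_card=[2, 2])
model.add_cpds(cpd_cs, cpd_sp, cpd_ac)
assert model.check_model()

# Belief propagation builds a junction tree, as in the paper's inference step.
infer = BeliefPropagation(model)
print(infer.query(["Accident"], evidence={"CrossSection": 1}))
```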

41

PERSPECTIVES ON A DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS  

SciTech Connect

Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.

Lowrie, Jonathan; Thoman, David; Keller, Austin

2008-07-30
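
The "simple dose calculation" mentioned above is, in this kind of analysis, essentially a product of source term, atmospheric dilution (χ/Q), breathing rate, and an inhalation dose conversion factor, so the choice of breathing rate and DCF scales the result linearly. A sketch with placeholder numbers (the two breathing rates are common reference-man defaults; everything else is illustrative):

```python
def inhalation_dose_sv(source_term_bq, chi_over_q, breathing_rate, dcf):
    """Dose (Sv) = ST (Bq) * chi/Q (s/m^3) * BR (m^3/s) * DCF (Sv/Bq)."""
    return source_term_bq * chi_over_q * breathing_rate * dcf

ST = 1.0e9        # Bq released; placeholder
CHI_Q = 5.0e-4    # s/m^3 plume dilution at the receptor; placeholder
DCF = 1.0e-7      # Sv/Bq for the nuclide mix; placeholder

# Reference-man breathing rates: light activity vs. resting.
for label, br in [("light activity", 3.3e-4), ("resting", 1.25e-4)]:
    dose = inhalation_dose_sv(ST, CHI_Q, br, DCF)
    print(f"{label:14s}: {dose:.2e} Sv")
```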

42

Hazmat transport: a methodological framework for the risk analysis of marshalling yards.  

PubMed

A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case-study. The results showed that "non-accident-induced" leaks in marshalling yards make an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable role of these fixed installations in the overall risk associated with "hazmat" transportation. PMID:17418942

Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino

2007-08-17
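
The frequency-assessment step couples HazOp-identified failure modes with fault trees. A minimal quantification of a toy two-gate tree for a "non-accident-induced" leak top event, assuming independent basic events with invented probabilities:

```python
def p_or(*probs):
    """OR gate, independent basic events: 1 - prod(1 - p_i)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def p_and(*probs):
    """AND gate, independent basic events: prod(p_i)."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Invented tree: TOP = valve leak OR (gasket degraded AND inspection missed).
valve_leak = 1.0e-4          # per operation; placeholder
gasket_degraded = 5.0e-3     # placeholder
inspection_missed = 1.0e-1   # placeholder

top = p_or(valve_leak, p_and(gasket_degraded, inspection_missed))
print(f"P(non-accident-induced leak per operation) = {top:.3e}")
```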

43

TMI-2 accident: core heat-up analysis  

SciTech Connect

This report summarizes the NSAC study of reactor core thermal conditions during the accident at Three Mile Island, Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events, plant data, post-accident measurements, and interpretation or indirect use of instrument responses to accident conditions.

Ardron, K.H.; Cain, D.G.

1981-01-01

44

Aircraft accidents : method of analysis  

NASA Technical Reports Server (NTRS)

This report is a revision of NACA-TR-357. It was prepared by the Committee on Aircraft Accidents. The purpose of this report is to provide a basis for the classification and comparison of aircraft accidents, both civil and military.

1937-01-01

45

Application of Evidential Networks in quantitative analysis of railway accidents  

E-print Network

Currently, a high percentage of accidents in railway systems are attributed to human factors. Human reliability data are very difficult to quantify; thus, qualitative methods are often used in railway system

Paris-Sud XI, Université de

46

NASA's Accident Precursor Analysis Process and the International Space Station  

NASA Technical Reports Server (NTRS)

This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

Groen, Frank; Lutomski, Michael

2010-01-01

47

NRC's environmental analysis of nuclear accidents: is it adequate. Final report  

Microsoft Academic Search

The report evaluates the adequacy of accident analyses in environmental impact statements for nuclear power plants. It reviews the regulations and policies governing nuclear accident analyses in EISs, surveys the accident analyses in 149 EISs in 10 years, assesses the legal and scientific foundations of NRC's accident analysis policy, discusses the legal and pragmatic reasons for fuller EIS accident analysis,

E. Entwisle; D. Wexler

1980-01-01

48

An analysis of pilot error-related aircraft accidents  

NASA Technical Reports Server (NTRS)

A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: Critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable. Cluster analysis, which was an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses. Pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

1974-01-01

49

Evaluation methodology for AUV energy systems analysis  

Microsoft Academic Search

An evaluation methodology is developed and applied to the analysis of heat engines, primary and secondary batteries, and fuel cells in an effort to evaluate energy systems for autonomous underwater vehicles (AUVs). The evaluation methodology reduced the number of independent variables to four, while the balance of 11 variables were shown to be dependent on the former. This generic method

William H. Kumm

1990-01-01

50

Realistic Small and Intermediate-Break Loss-of-Coolant Accident Analysis Using WCOBRA/TRAC  

Microsoft Academic Search

Since the 1988 Appendix K Rulemaking change, there has been significant interest in the development of codes and methodologies for 'best-estimate' analysis of loss-of-coolant accidents (LOCAs). Most development has been directed toward large-break (LB) LOCAs (LBLOCAs), since for most pressurized water reactors (PWRs), the LBLOCA generates the limiting peak cladding temperature (PCT). As plants age, are uprated, and continue to

Stephen M. Bajorek; Nikolay Petkov; Katsuhiro Ohkawa; Robert M. Kemper; Arthur P. Ginsberg

2001-01-01

51

Rat sperm motility analysis: methodologic considerations  

EPA Science Inventory

The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

52

Analysis of Credible Accidents for Argonaut Reactors  

SciTech Connect

Five areas of potential accidents have been evaluated for the Argonaut-UTR reactors:
• insertion of excess reactivity
• catastrophic rearrangement of the core
• explosive chemical reaction
• graphite fire
• fuel-handling accident
A nuclear excursion resulting from the rapid insertion of the maximum available excess reactivity would produce only 12 MWs, which is insufficient to cause fuel melting even with conservative assumptions. Although precise structural rearrangement of the core would create a potential hazard, it is simply not credible to assume that such an arrangement would result from the forces of an earthquake or other catastrophic event. Even damage to the fuel from falling debris or other objects is unlikely given the normal reactor structure. An explosion from a metal-water reaction could not occur because there is no credible source of sufficient energy to initiate the reaction. A graphite fire could conceivably create some damage to the reactor but not enough to melt any fuel or initiate a metal-water reaction. The only credible accident involving offsite doses was determined to be a fuel-handling accident which, given highly conservative assumptions, would produce a whole-body dose equivalent of 2 rem from noble gas immersion and a lifetime dose equivalent commitment to the thyroid of 43 rem from radioiodines.

Hawley, S. C.; Kathern, R. L.; Robkin, M. A.

1981-04-01

53

Accident sequence analysis for a BWR (Boiling Water Reactor) during low power and shutdown operations  

SciTech Connect

Most previous Probabilistic Risk Assessments have excluded consideration of accidents initiated in low power and shutdown modes of operation. A study of the risk associated with operation in low power and shutdown is being performed at Sandia National Laboratories for a US Boiling Water Reactor (BWR). This paper describes the proposed methodology for the analysis of the risk associated with the operation of a BWR during low power and shutdown modes and presents preliminary information resulting from the application of the methodology. 2 refs., 2 tabs.

Whitehead, D.W.; Hake, T.M.

1990-01-01

54

Analysis of tritium mission FMEF/FAA fuel handling accidents  

SciTech Connect

The Fuels Material Examination Facility/Fuel Assembly Area is proposed to be used for fabrication of mixed oxide fuel to support the Fast Flux Test Facility (FFTF) tritium/medical isotope mission. The plutonium isotope mix for the new mission is different from that analyzed in the FMEF safety analysis report. A reanalysis was performed of three representative accidents for the revised plutonium mix to determine the impact on the safety analysis. Current versions of the computer codes and meteorology data files were used for the analysis. The revised accidents were a criticality, an explosion in a glovebox, and a tornado. The analysis concluded that risk guidelines were met with the revised plutonium mix.

Van Keuren, J.C.

1997-11-18

55

PASSIVE SYSTEM ACCIDENT SCENARIO ANALYSIS BY SIMULATION  

E-print Network

Keywords: passive Residual Heat Removal systems (RHRs); High Temperature Gas-Cooled Reactor - Pebble Bed Modular (HTR-PM); scenario analysis; Monte Carlo simulation. A thermal-hydraulic (T-H) model for scenario analysis of the removal of decay heat from the reactor core after shut-down has been implemented

Paris-Sud XI, Université de

56

Analysis Methodology for Industrial Load Profiles  

E-print Network

A methodology is provided for evaluating the impact of various demand-side management (DSM) options on industrial customers. The basic approach uses customer metered load profile data as a basis for the customer load shape. DSM technologies are represented as load shapes and are used as a basis for altering the customer's existing...

Reddoch, T. W.

57

The uses of evidence in accident analysis by professionally versus scientifically trained investigators  

Microsoft Academic Search

The purpose of the study was to analyze how experts having extensive practical experience in accident analysis (professional experts) deal with evidence while analyzing traffic accidents in comparison with participants who had experience in accident analysis as well as graduate scientific education (scientific experts). The study was conducted by giving scientifically and professionally educated accident investigators a set of authentic

K. Hakkarainen; D. R. Olson

2009-01-01

58

Foucault's Analysis of Power's Methodologies  

E-print Network

"that sovereign vanishing-point, indefinitely distant but constituent." 4 Throughout The Order of Things, what is decisive in Foucault's analysis of the centripetal movement of sovereignty is that the essential structure and privilege of sovereignty... construction of sexuality renders sex an instrument of power's design (150). Only by radically questioning the presupposition of sex as an "anchorage point that supports the manifestations of sexuality," Foucault writes, can we ever reconstruct such a...

Scott, Gary Alan

59

Hanford Waste Tank Bump Accident and Consequence Analysis  

SciTech Connect

This report provides a new evaluation of the Hanford tank bump accident analysis and consequences for incorporation into the Authorization Basis. The analysis scope is for the safe storage of waste in its current configuration in single-shell and double-shell tanks.

BRATZEL, D.R.

2000-06-20

60

Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1  

SciTech Connect

NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.

Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Murfin, W.B. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States); Hora, S.C. [Hawaii Univ., Hilo, HI (United States)

1993-12-01
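
Idea (3) above, Monte Carlo with an efficient sampling procedure to propagate uncertainties through the chained analyses, can be sketched with Latin hypercube sampling from scipy; the two uncertain inputs and the stand-in risk model are invented, and only the stratified-sampling mechanics carry over.

```python
# A sketch assuming scipy >= 1.7 (scipy.stats.qmc); all numbers illustrative.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=2, seed=7)
u = sampler.random(n=200)                 # stratified points on the unit square

# Map to input distributions: log-uniform release fraction in [1e-4, 1e-1],
# uniform consequence factor in [0.5, 2.0].
release = 10.0 ** qmc.scale(u[:, [0]], [-4.0], [-1.0]).ravel()
factor = qmc.scale(u[:, [1]], [0.5], [2.0]).ravel()

# Stand-in for the chained progression / source term / consequence models.
risk = 3.0e4 * release * factor           # person-rem; illustrative

print(f"mean = {risk.mean():.2f}  "
      f"5th = {np.percentile(risk, 5):.2f}  "
      f"95th = {np.percentile(risk, 95):.2f}")
```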

61

Waste form characterization and its relationship to transportation accident analysis  

SciTech Connect

The response of potential waste forms should be determined for extreme transportation environments that must be postulated for environmental impact analysis and also for hypothetical accident conditions to which packagings and contents must be subjected for licensing purposes. The best approach may be to test materials up to and beyond their failure point; such an approach would establish failure thresholds. Specification of what denotes failure would be defined by existing or proposed regulations or dictated by requirements developed from accident analysis. Responses to physical and thermal insults are the most important for licensing or analysis and need to be thoroughly characterized. Others in need of characterization might be responses to extreme chemical environments and to intense and prolonged radiation exposure. A complete characterization of waste-form responses would be desirable for environments that are considered extreme for transportation accidents but which may be typical for processing or disposal environments. In addition, the characterizations that are performed must be completed in laboratory environments which can be readily correlated to accident environments and must be meaningfully conveyed to a transportation impact analyst. As an example, leaching data as commonly presented are not usable to the analyst and are obtained under conditions that are not directly applicable to conditions of most transportation accidents. Transportation analysts are in need of data useful for calculating environmental impacts and for licensing of packagings. Future waste form development programs and associated decisions should consider the needs of transportation analysts.

Wilmot, E. L.; McClure, J. D.

1980-01-01

62

Accident Sequence Evaluation Program: Human reliability analysis procedure  

SciTech Connect

This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples) and more detailed definitions of some of the terms. 42 refs.

Swain, A.D.

1987-02-01

63

Probabilistic methods for accident-progression analysis  

SciTech Connect

Probabilistic methods that can be used as a basis for deterministic calculations of transients or accidents in nuclear power plants are described. They include obtaining initiator-dependent sequences on the component level and related analyses, propagation of primary event uncertainties in the ranking of sequences, and detailed treatment of dependent failures. The results are shown for protected transients in the short-term forced-circulation phase of decay heat removal in the Clinch River Breeder Reactor. Higher values of unavailabilities are obtained than in previous works as a result of more detailed common cause/mode failure modeling. The unavailability of decay heat removal by forced circulation for the loss of off-site power and loss of main feedwater system initiators is estimated at 4 x 10^-3/yr and 9 x 10^-3/yr, respectively. 15 refs., 1 fig., 2 tabs.

Jamali, K. M.

1981-01-01

64

MELCOR accident analysis for ARIES-ACT  

SciTech Connect

We model a loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with LiPb as coolant and breeder, a helium cooled steel structural ring and tungsten divertors, a thin-walled, helium cooled vacuum vessel, and a room temperature water-cooled shield outside the vacuum vessel. The water heat transfer system is designed to remove heat by natural circulation during a LOFA. The MELCOR model uses time-dependent decay heats for each component determined by 1-D modeling. The MELCOR model shows that, despite periodic boiling of the water coolant, structures are kept adequately cool by the passive safety system.

Paul W. Humrickhouse; Brad J. Merrill

2012-08-01

65

TMI-2 accident: core heat-up analysis. A supplement  

SciTech Connect

Following the accident at Three Mile Island, Unit 2, NSAC mounted an analytical program to develop a chronology of what happened in the core during the period when damage occurred. The central effort and key results of this analytical work are described in NSAC-24, TMI-2 Accident Core Heatup Analysis. Several supporting studies contributed to this central effort. These are presented in this supplement. Part I describes a single pin analysis that was made using the FRAP-T5 code. This analysis provided input to the core damage assessment central effort. Part II describes a thermal hydraulic analysis of the core during the accident using the BOIL 2 code. The BOIL 2 analysis of TMI-2 core was performed to provide an independent check on the results of the main core damage assessment effort. Part III provides the as-built design and material characteristics of the TMI-2 core. This supplement will be of greatest interest to analysts who are studying the TMI-2 accident or are investigating how other cores would behave during a boil-down event.

Not Available

1981-06-01

66

The Severe Accident Analysis Program for the Savannah River nuclear production reactors  

Microsoft Academic Search

Severe accident phenomena pertinent to the heavy-water-moderated production reactors of the US Department of Energy are being studied in the Severe Accident Analysis Program (SAAP) at the Savannah River Site. The SAAP has sought to define the behavior of the Savannah River reactors in accident scenarios involving significant fuel melting. The goal of the program is to make possible accident

Hyder

2009-01-01

67

Fault seal analysis: Methodology and case studies  

Microsoft Academic Search

Fault seal can arise from reservoir/non-reservoir juxtaposition or by development of fault rock of high entry-pressure. The methodology for evaluating these possibilities uses detailed seismic mapping and well analysis. A "first-order" seal analysis involves identifying reservoir juxtaposition areas over the fault surface, using the mapped horizons and a refined reservoir stratigraphy defined by isochores at the fault surface.

M. E. Badley; B. Freeman; D. T. Needham

1996-01-01

68

Analysis of the 1957-1958 Soviet Nuclear Accident  

Microsoft Academic Search

The presence of an extensive environmental contamination zone in Cheliabinsk Province of the Soviet Union, associated with an accident in the winter of 1957 to 1958 involving the atmospheric release of fission wastes, appears to have been confirmed, primarily by an analysis of the Soviet radioecology literature. The contamination zone is estimated to contain 10^5 to 10^6 curies of strontium-90

John R. Trabalka; L. Dean Eyman; Stanley I. Auerbach

1980-01-01

69

Fractal analysis: methodologies for biomedical researchers.  

PubMed

Fractal analysis has become a popular method in all branches of scientific investigation, including biology and medicine. Although there is a growing interest in the application of fractal analysis in the biological sciences, questions about the methodology of fractal analysis have partly restricted its wider and comprehensible application. It is a notable fact that fractal analysis is derived from fractal geometry, but there are some unresolved issues that need to be addressed. In this respect, we discuss several related underlying principles for fractal analysis and establish the meaningful relationship between fractal analysis and fractal geometry. Since some concepts in fractal analysis are determined descriptively and/or qualitatively, this paper provides their exact mathematical definitions or explanations. Another aim of this study is to show that nowadays fractal analysis is an independent mathematical and experimental method based on Mandelbrot's fractal geometry, traditional Euclidean geometry and Richardson's coastline method. PMID:23757956
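
As context, a minimal sketch of one standard estimator in this family, the box-counting dimension of a binary image (an illustration of the general method only, not the authors' code):

import numpy as np

def box_count(img, size):
    # Count boxes of side `size` containing at least one foreground pixel.
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i+size, j:j+size].any():
                count += 1
    return count

def fractal_dimension(img, sizes=(2, 4, 8, 16, 32)):
    # D is estimated as the negative slope of log N(s) versus log s.
    counts = [box_count(img, s) for s in sizes]
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

img = np.ones((64, 64), dtype=bool)  # a filled square: estimate approaches 2
print(fractal_dimension(img))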

Ristanović, Dušan; Milošević, Nebojša T

2012-01-01

70

Three dimensional effects in analysis of PWR steam line break accident  

E-print Network

A steam line break accident is one of the possible severe abnormal transients in a pressurized water reactor. It is required to present an analysis of a steam line break accident in the Final Safety Analysis Report (FSAR) ...

Tsai, Chon-Kwo

71

The role of safety analysis in accident prevention.  

PubMed

The need for safety analysis has grown in the fields of the nuclear industry, civil and military aviation and space technology, where the potential for accidents with far-reaching consequences for employees, the public and the environment is most apparent. Later, the use of safety analysis spread widely to other industrial branches. General systems theory, accident theories and scientific management represent domains that have influenced the development of safety analysis. These relations are briefly presented, and the common methods employed in safety analysis are described and structured according to the aim of the search and to the search strategy. A framework for the evaluation of the coverage of the search procedures employed in different methods of safety analysis is presented. The framework is then used in a heuristic and in an empirical evaluation of hazard and operability study (HAZOP), work safety analysis (WSA), action error analysis (AEA) and management oversight and risk tree (MORT). Finally, some recommendations on the use of safety analysis for preventing accidents are presented. PMID:3337767

Suokas, J

1988-02-01

72

Cold Vacuum Drying facility design basis accident analysis documentation  

SciTech Connect

This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

CROWE, R.D.

2000-08-08

73

Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident  

NASA Technical Reports Server (NTRS)

Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined that the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep, and that the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as by the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; also, the captain testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data support the hypothesis that fatigue was a factor that affected the crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that the fatigue had an impact on specific actions involved in the occurrence of the accident.

Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

1994-01-01

74

Mass Spectrometry Methodology in Lipid Analysis  

PubMed Central

Lipidomics is an emerging field in which the structures, functions and dynamic changes of lipids in cells, tissues or body fluids are investigated. Due to the vital roles of lipids in human physiological and pathological processes, lipidomics is attracting more and more attention. However, because of the diversity and complexity of lipids, lipid analysis is still full of challenges. The recent development of methods for lipid extraction and analysis, combined with bioinformatics technology, has greatly advanced the study of lipidomics. Among these methods, mass spectrometry (MS) is the most important technology for lipid analysis. In this review, the MS-based methodology for lipid analysis is introduced. It is believed that, along with the rapid development of MS and its further application to lipid analysis, more functional lipids will be identified as biomarkers and therapeutic targets and used in the study of the mechanisms of disease. PMID:24921707

Li, Lin; Han, Juanjuan; Wang, Zhenpeng; Liu, Jian’an; Wei, Jinchao; Xiong, Shaoxiang; Zhao, Zhenwen

2014-01-01

75

Reactor accident consequence analysis code (MACCS)  

Microsoft Academic Search

Sandia National Laboratories has been involved in the performance of risk assessments for the US Nuclear Regulatory Commission for more than a decade. As part of this effort, Sandia developed the reactor consequence analysis codes, CRAC2, and more recently, MACCS. CRAC2 is an improved version of CRAC, which was used in the Reactor Safety Study (also known as WASH-1400). MACCS

Hong-Nian Jow; J. L. Sprung; D. I. Chanin; J. C. Helton; J. A. Rollstin

1990-01-01

76

Calculation Notes for Subsurface Leak Resulting in Pool, TWRS FSAR Accident Analysis  

SciTech Connect

This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Subsurface Leaks Resulting in Pool.

Hall, B.W.

1996-09-25

77

Calculation notes for surface leak resulting in pool, TWRS FSAR accident analysis  

SciTech Connect

This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Surface Leaks Resulting in Pool.

Hall, B.W.

1996-09-25

78

Integrated severe accident containment analysis with the CONTAIN computer code  

Microsoft Academic Search

Analysis of physical and radiological conditions inside the containment building during a severe (core-melt) nuclear reactor accident requires quantitative evaluation of numerous highly disparate yet coupled phenomenologies. These include two-phase thermodynamics and thermal-hydraulics, aerosol physics, fission product phenomena, core-concrete interactions, the formation and combustion of flammable gases, and the performance of engineered safety features. In the past, this complexity has meant

K. D. Bergeron; D. C. Williams; P. E. Rexroth; J. L. Tills

1985-01-01

79

Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs  

NASA Technical Reports Server (NTRS)

A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

Tuomela, C. H.; Brennan, M. F.

1980-01-01

80

Review of sixty two risk analysis methodologies of industrial plants  

E-print Network

Numerous methodologies have been developed to carry out a risk analysis of an industrial plant. This paper reviews sixty-two such methodologies and classifies them (including a hierarchization step); the classification is used to describe each risk analysis methodology in detail.

Paris-Sud XI, Université de

81

RAMS (Risk Analysis - Modular System) methodology  

SciTech Connect

The Risk Analysis - Modular System (RAMS) was developed to serve as a broad-scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be presented in rolled-up form to support high-level strategy decisions.

Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

1996-10-01

82

Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology  

NASA Astrophysics Data System (ADS)

With advancement in technology, new and sophisticated vehicle models are available and their numbers are increasing day by day. A traffic accident has multi-faceted characteristics associated with it. In India, 93% of crashes occur due to human-induced factors (wholly or partly). For proper traffic accident analysis, GIS technology has become an indispensable tool. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote the location, type, and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, the headquarters of Ajmer district, Rajasthan, has been selected as the study area. According to police records, 1531 accidents occurred during 2009-2013. The maximum number of accidents occurred in 2009 and the maximum number of deaths in 2013. Cars, jeeps, autos, pickups, and tempos are mostly responsible for accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. Although road safety is a critical issue, it is handled in an ad hoc manner. This study demonstrates the application of GIS for developing an efficient database on road accidents, taking Ajmer city as a case study. If such databases are developed for other cities, a proper analysis of accidents can be undertaken and suitable management strategies for traffic regulation can be successfully proposed.
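
As an illustration of the kind of query such a geo-referenced accident database supports, a minimal pandas sketch (the file name and column names below are hypothetical, not from the study):

import pandas as pd

df = pd.read_csv("ajmer_accidents_2009_2013.csv")     # hypothetical accident table
by_year = df.groupby("year").size()                   # yearly accident counts
evening = df[(df["hour"] >= 16) & (df["hour"] < 22)]  # the 4 PM - 10 PM window
print(by_year)
print(len(evening) / len(df))                         # share of evening accidents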

Bhalla, P.; Tripathi, S.; Palria, S.

2014-12-01

83

Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.  

PubMed

Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method, based on the concepts of task and accident mechanisms, for an initial risk assessment that takes into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. Using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified using the variables included in the European Statistics of Accidents at Work methodology. As the main result, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low; the only accident mechanisms with medium risk are accidents when lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling in tasks involving movement. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended. PMID:25179119
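
A minimal sketch of a semi-quantitative likelihood-severity screening of the kind described (the scales and thresholds here are illustrative assumptions, not the paper's calibration):

def risk_level(likelihood, severity, low=4, high=9):
    # Combine 1-5 likelihood and 1-5 severity scores into a risk class.
    score = likelihood * severity
    if score < low:
        return "low"
    return "medium" if score < high else "high"

print(risk_level(likelihood=3, severity=2))  # -> "medium"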

Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

2014-09-01

84

NASA Accident Precursor Analysis Handbook, Version 1.0  

NASA Technical Reports Server (NTRS)

Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur". At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators". These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take the steps necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate accident precursors by evaluating anomaly occurrences for their system safety implications and, through both analytical and deliberative methods used to project to other circumstances, identifying those that portend more serious consequences to come if effective corrective action is not taken. APA builds upon existing safety analysis processes currently in practice within NASA, leveraging their results to provide an improved understanding of overall system risk. As such, APA represents an important dimension of safety evaluation; as operational experience is acquired, precursor information is generated such that it can be fed back into system safety analyses to risk-inform safety improvements. Importantly, APA utilizes anomaly data to predict risk, whereas standard reliability and PRA approaches utilize failure data, which are often limited and rare.

Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

2011-01-01

85

Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis  

SciTech Connect

The severe accident at the Fukushima Daiichi nuclear plants illustrates the need for continuous improvement through developing and implementing technologies that contribute to safe, reliable and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy-UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front end of the fuel cycle, on reactor operation and on the back end of the fuel cycle are succinctly described, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and may be updated as the designs converge on their respective final versions.

Gilles Youinou; R. Sonat Sen

2013-09-01

86

Comprehensive Analysis of Two Downburst-Related Aircraft Accidents  

NASA Technical Reports Server (NTRS)

Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components, F_1 and F_2, representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F_1 was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy, from both an analytical and a pilot's standpoint, was to hold a constant nose-up pitch attitude while operating at maximum engine thrust.
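
For reference, a commonly used definition of the wind-shear performance factor consistent with the decomposition described (stated here as background; the paper's exact notation is not shown in this abstract) is

F = F_1 + F_2 = \frac{\dot{W}_x}{g} - \frac{w}{V},

where \dot{W}_x is the rate of change of the horizontal wind component along the flight path, w is the vertical wind velocity (negative in a downdraft), g is the gravitational acceleration, and V is the airspeed; positive F indicates a performance-degrading wind environment.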

Shen, J.; Parks, E. K.; Bach, R. E.

1996-01-01

87

A DISCIPLINED APPROACH TO ACCIDENT ANALYSIS DEVELOPMENT AND CONTROL SELECTION  

SciTech Connect

The development and use of a Safety Input Review Committee (SIRC) process promotes consistent and disciplined Accident Analysis (AA) development to ensure that it accurately reflects facility design and operation; and that the credited controls are effective and implementable. Lessons learned from past efforts were reviewed and factored into the development of this new process. The implementation of the SIRC process has eliminated many of the problems previously encountered during Safety Basis (SB) document development. This process has been subsequently adopted for use by several Savannah River Site (SRS) facilities with similar results and expanded to support other analysis activities.

Ortner, T; Mukesh Gupta, M

2007-04-13

88

Combining task analysis and fault tree analysis for accident and incident analysis: A case study from Bulgaria  

Microsoft Academic Search

Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand human behaviour in incident occurrence, we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter

Doytchin E. Doytchev; Gerd Szwillus

2009-01-01

89

An Integrated Accident & Consequence Analysis Approach for Accidental Releases through Multiple Leak Paths  

SciTech Connect

This paper presents a consequence analysis for a postulated fire accident in a building containing plutonium when the resulting outside release is partly through the ventilation/filtration system and partly through other pathways such as building access doorways. When analyzing an accident scenario involving the release of radioactive powders inside a building, various pathways for the release to the outside environment can exist. This study is presented to guide the analyst on how the multiple building leak path factors (combinations of filtered and unfiltered releases) can be evaluated in an integrated manner, starting with the source term calculation and proceeding through the receptor consequence determination. The analysis is performed in a two-step process. The first step of the analysis is to calculate the leak path factor, which represents the fraction of respirable radioactive powder made airborne that leaves the building through the various pathways. The computer code of choice for this determination is MELCOR. The second step is to model the transport and dispersion of powder material released to the atmosphere and to estimate the resulting dose received by the downwind receptors of interest. The MACCS computer code is chosen for this part of the analysis. This work can be used as a model for performing analyses for systems similar in nature where releases can propagate to the outside environment via filtered and unfiltered pathways. The methodology provides guidance to analysts outlining the essential steps needed to perform a sound and defensible consequence analysis.
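
As background for the source term step, DOE facility safety analyses commonly use the five-factor formula (given here as standard context, not as this paper's exact equation):

ST = MAR \times DR \times ARF \times RF \times LPF,

where MAR is the material at risk, DR the damage ratio, ARF the airborne release fraction, RF the respirable fraction, and LPF the leak path factor. With several release pathways, the effective leak path factor becomes a release-weighted sum, LPF = \sum_i f_i \, LPF_i, with f_i the fraction of the airborne material leaving through pathway i.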

POLIZZI, LM

2004-04-28

90

Use of Human Factors Analysis for Wildland Fire Accident Investigations  

Microsoft Academic Search

Accident investigators at any level are challenged with identifying causal factors and making preventative recommendations. This task can be particularly complicated considering that 70-80% of accidents are associated with human error. Due to complexities of the wildland fire environment, this is especially challenging when investigating a wildland fire-related accident. Upon reviewing past accident investigations within the United States Federal wildland

Michelle Ryerson; Chuck Whitlock

2005-01-01

91

BESAFE II: Accident safety analysis code for MFE reactor designs  

NASA Astrophysics Data System (ADS)

The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from an economic and from an environmental and safety standpoint. It is the latter which is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns equate to a design's behavior during the worst-case accident scenario, the loss-of-coolant accident (LOCA). In this dissertation, we examine the behavior of MFE devices during a LOCA and how this behavior relates to the safety characteristics of the machine, in particular the acute, whole-body, early dose. In doing so, we have produced an accident safety code, BESAFE II, now available to the fusion reactor design community. The Appendix constitutes the User's Manual for BESAFE II. The theory behind early dose calculations, including the mobilization of activation products, is presented in Chapter 2. Since mobilization of activation products is a strong function of temperature, it becomes necessary to calculate the thermal response of a design during a LOCA in order to determine the fraction of the activation products which are mobilized and thus become the source for the dose. The code BESAFE II is designed to determine the temperature history of each region of a design and the resulting mobilization of activation products at each point in time during the LOCA. The BESAFE II methodology is discussed in Chapter 4, followed by demonstrations of its use for two reference design cases: a PCA-Li tokamak and a SiC-He tokamak. Of these two cases, it is shown that the SiC-He tokamak is the better design from an accident safety standpoint. It is also found that doses derived from temperature-dependent mobilization data differ from those predicted using set mobilization categories such as those that involve Piet fractions. This demonstrates the need for more experimental data on fusion materials. The possibility of future improvements and modifications to BESAFE II is discussed in Chapter 6; for example, additional environmental indices such as a waste disposal index could be added. The biggest improvement to BESAFE II would be an increase in the database of activation product mobilization for a larger spectrum of fusion reactor materials. Our ultimate goal is for BESAFE II to become part of a systems design program which would include economic factors and allow both safety and the cost of electricity to influence design.
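
A minimal sketch of how a temperature-dependent mobilization fraction can be integrated over a LOCA temperature history (a first-order Arrhenius-type model with made-up constants, shown only to illustrate the idea; it is not the BESAFE II model):

import numpy as np

def mobilized_fraction(times, temps, A=1e3, Ea=2.0e5, R=8.314):
    # First-order release: f = 1 - exp(-integral of k(T(t)) dt),
    # with k(T) = A * exp(-Ea / (R*T)).
    k = A * np.exp(-Ea / (R * np.asarray(temps)))
    return 1.0 - np.exp(-np.trapz(k, times))

t = np.linspace(0.0, 3600.0, 200)               # a 1-hour LOCA transient, s
T = 600.0 + 400.0 * np.sin(np.pi * t / 7200.0)  # assumed heat-up toward ~1000 K
print(mobilized_fraction(t, T))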

Sevigny, Lawrence Michael

92

A general methodology for population analysis  

NASA Astrophysics Data System (ADS)

For a given population with N - current and M - maximum number of entities, modeled by a Birth-Death Process (BDP) with size M+1, we introduce a utilization parameter (the ratio of the primary birth and death rates in that BDP), which physically determines the (equilibrium) macrostates of the population, and an information parameter, which has an interpretation as population information stiffness. The BDP modeling the population is in state n, n=0,1,…,M, if N=n. In the presence of these two key metrics, applying the continuity law, equilibrium balance equations concerning the probability distribution p_n, n=0,1,…,M, of the quantity N, p_n=Prob{N=n}, in equilibrium, and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; thereto, by definition, population entropy is the uncertainty related to the population. In this approach, and this is its essential contribution, the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information elastic, antielastic or inelastic regime. In an information linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology. Namely, if one supposes a population with infinite size, most of the key quantities and results for populations with finite size that emerge in this methodology vanish.
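
For orientation, the simplest instance of such equilibrium balance equations (a constant-rate finite birth-death chain; the paper's model is more general) reads

\lambda p_{n-1} = \mu p_n \quad\Rightarrow\quad p_n = p_0 \rho^n, \qquad \rho = \lambda/\mu, \qquad p_0 = \left(\sum_{n=0}^{M} \rho^n\right)^{-1},

and the population entropy is the corresponding Shannon entropy H = -\sum_{n=0}^{M} p_n \ln p_n.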

Lazov, Petar; Lazov, Igor

2014-12-01

93

Predicting System Accidents with Model Analysis During Hybrid Simulation  

NASA Technical Reports Server (NTRS)

Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures , faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

Malin, Jane T.; Fleming, Land D.; Throop, David R.

2002-01-01

94

Statistical analysis of sudden chemical leakage accidents reported in China between 2006 and 2011.  

PubMed

According to data from authoritative sources, 1,400 sudden leakage accidents that occurred in China from 2006 to 2011 were investigated; of these, 666 accidents with no or little damage were used to abstract statistical characteristics. The research results were as follows: (1) Time fluctuation: the yearly number of sudden leakage accidents decreased from 2006 to 2010, with a slight increase in 2011. Sudden leakage accidents occur mainly in summer, and more than half of the accidents occur from May to September. (2) Regional distribution: the accidents are highly concentrated in the coastal area, where accidents result from small and medium-sized enterprises more readily than from larger ones. (3) Pollutants: hazardous chemicals account for up to 95% of sudden leakage accidents. (4) Steps: transportation represents almost half of the accidents, followed by production, usage, storage, and discard. (5) Pollution and casualties: such accidents readily cause environmental pollution and casualties. (6) Causes: more than half of the cases were caused by human factors, followed by management failures and equipment failure. However, sudden chemical leakage may also be caused by high temperature, rain, wet roads, and terrain. (7) Principal component analysis: five factors are extracted by the principal component analysis, including pollution, casualties, regional distribution, steps, and month. Based on this analysis of the accidents, their characteristics, causes, and damages can be investigated, and advice for prevention and rescue can be derived. PMID:24407779
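
A minimal sketch of the principal component step (the 666-by-7 feature matrix below is a random placeholder standing in for the coded accident variables; it is not the study's data):

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.random.rand(666, 7)             # placeholder: 666 accidents x 7 coded variables
Z = StandardScaler().fit_transform(X)  # standardize before PCA
pca = PCA(n_components=5)              # the study extracts five factors
scores = pca.fit_transform(Z)
print(pca.explained_variance_ratio_)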

Li, Yang; Ping, Hua; Ma, Zhi-Hong; Pan, Li-Gang

2014-04-01

95

[Epidemiologic analysis of occupational accidents using data of the Swiss National Accident Insurance].  

PubMed

The present study aims to relate occupational accidents, as recorded by SAIF for 1974, to several risk factors, among them the size of industrial enterprises. The results seem to demonstrate that very small enterprises (fewer than ten employees) produce the highest rate of occupational accidents and accumulate risk factors. Nevertheless, before going much further in commenting on the results, the authors emphasize the limitations of the available statistical material. A further limitation in planning an ergonomic prevention programme on the basis of those records relates to the parameters chosen to explain the genesis of occupational accidents. PMID:7136290

Gressot, M; Rey, P

1982-09-01

96

Thermohydraulic and Safety Analysis for CARR Under Station Blackout Accident  

SciTech Connect

A thermohydraulic and safety analysis code (TSACC) has been developed in Fortran 90 to evaluate the transient thermohydraulic behavior and safety characteristics of the China Advanced Research Reactor (CARR) under a Station Blackout Accident (SBA). For the development of TSACC, a series of corresponding mathematical and physical models were considered. A point reactor neutron kinetics model was adopted for solving the reactor power. All possible flow and heat transfer conditions under a station blackout accident were considered and the optional models were supplied. The usual Finite Difference Method (FDM) was abandoned and a new model was adopted to evaluate the temperature field of the core plate-type fuel element. A new simple and convenient equation was proposed for resolving the transient behavior of the main pump instead of the complicated four-quadrant model. The Gear method and the Adams method were adopted alternately for a better solution of the stiff differential equations describing the dynamic behavior of the CARR. The computational results of TSACC showed a sufficient safety margin for CARR under SBA. For the purpose of Verification and Validation (V and V), the simulated results of TSACC were compared with those of RELAP5/Mod3. The V and V result indicated good agreement between the results of the two codes. Owing to the adoption of modular programming techniques, this analysis code is expected to be applicable to other reactors by simply modifying the corresponding function modules. (authors)
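
A minimal sketch of point reactor kinetics solved with a stiff (Gear-type/BDF) integrator, as the abstract describes (one delayed-neutron group and illustrative parameter values; TSACC itself is a Fortran 90 code):

import numpy as np
from scipy.integrate import solve_ivp

beta, Lam, lam = 0.0065, 1e-4, 0.08  # delayed fraction, generation time (s), precursor decay (1/s)

def kinetics(t, y, rho):
    n, c = y                          # neutron density, precursor concentration
    dn = (rho - beta) / Lam * n + lam * c
    dc = beta / Lam * n - lam * c
    return [dn, dc]

y0 = [1.0, beta / (Lam * lam)]        # steady state normalized to n = 1
sol = solve_ivp(kinetics, (0.0, 10.0), y0, args=(-0.005,), method="BDF")  # negative reactivity step
print(sol.y[0, -1])                   # relative power 10 s after the step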

Wenxi Tian; Suizheng Qiu; Guanghui Su; Dounan Jia [Xi'an Jiaotong University, 28 Xianning Road, Xi'an 710049 (China)]; Xingmin Liu [China Institute of Atomic Energy]

2006-07-01

97

Offsite radiological consequence analysis for the bounding aircraft crash accident  

SciTech Connect

The purpose of this calculation note is to quantitatively analyze a bounding aircraft crash accident for comparison to DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', Appendix A, Evaluation Guideline of 25 rem. The potential for an aircraft impacting a facility was evaluated using the approach given in DOE-STD-3014-96, ''Accident Analysis for Aircraft Crash into Hazardous Facilities''. The following aircraft crash frequencies were determined for the Tank Farms in RPP-11736, ''Assessment Of Aircraft Crash Frequency For The Hanford Site 200 Area Tank Farms'': (1) The total aircraft crash frequency is ''extremely unlikely.'' (2) The general aviation crash frequency is ''extremely unlikely.'' (3) The helicopter crash frequency is ''beyond extremely unlikely.'' (4) For the Hanford Site 200 Areas, the crash frequency for other aircraft types, commercial or military, into each above-ground facility or any other type of underground facility is ''beyond extremely unlikely.'' As the potential for an aircraft crash into the 200 Area tank farms is more frequent than ''beyond extremely unlikely,'' consequence analysis of the aircraft crash is required.

OBERG, B.D.

2003-03-22

98

Decontamination analysis of the NUWAX-83 accident site using DECON  

SciTech Connect

This report presents an analysis of the site restoration options for the NUWAX-83 site, at which an exercise was conducted involving a simulated nuclear weapons accident. This analysis was performed using a computer program developed by Pacific Northwest Laboratory. The computer program, called DECON, was designed to assist personnel engaged in the planning of decontamination activities. The many features of DECON that are used in this report demonstrate its potential usefulness as a site restoration planning tool. Strategies that are analyzed with DECON include: (1) employing a Quick-Vac option, under which selected surfaces are vacuumed before they can be rained on; (2) protecting surfaces against precipitation; (3) prohibiting specific operations on selected surfaces; (4) requiring specific methods to be used on selected surfaces; (5) evaluating the trade-off between cleanup standards and decontamination costs; and (6) varying the cleanup standards according to expected exposure to surfaces.

Tawil, J.J.

1983-11-01

99

Methodology for Passive Analysis of a University Internet Link

E-print Network

Passive monitoring of Internet links can efficiently provide valuable data on a wide variety of network characteristics; performance and research questions are addressed using this passive measurement methodology. Keywords: Network Performance Monitoring.

California at San Diego, University of

100

Toward a Methodology of Stakeholder Analysis.  

ERIC Educational Resources Information Center

Proposes a methodology for the comprehensive identification of stakeholders and their changing definitions and roles. The methodology links stakeholders to system tasks, management activities, and the best moments for interventions. Examples are taken from education, but have applications elsewhere. The method is used to analyze educational reform…

Welsh, Thomas; McGinn, Noel

1997-01-01

101

An analysis of evacuation options for nuclear accidents  

SciTech Connect

In this report we consider the threat posed by the accidental release of radionuclides from a nuclear power plant. The objective is to establish relationships between radiation dose and the cost of evacuation under a wide variety of conditions. The dose can almost always be reduced by evacuating the population from a larger area. However, extending the evacuation zone outward will cause evacuation costs to increase. The purpose of this analysis was to provide the Environmental Protection Agency (EPA) a data base for evaluating whether implementation costs and risks averted could be used to justify evacuation at lower doses. The procedures used and results of these analyses are being made available as background information for use by others. We develop cost/dose relationships for 54 scenarios that are based upon the severity of the reactor accident, meteorological conditions during the release of radionuclides into the environment, and the angular width of the evacuation zone. The 54 scenarios are derived from combinations of three accident severity levels, six meteorological conditions and evacuation zone widths of 70°, 90°, and 180°.
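
The 54 scenarios are the Cartesian product of the three scenario dimensions, which a few lines make explicit (the six stability-class labels are assumed for illustration):

from itertools import product

severities = ["low", "medium", "high"]  # three accident severity levels
meteo = ["A", "B", "C", "D", "E", "F"]  # six meteorological conditions (labels assumed)
widths_deg = [70, 90, 180]              # evacuation zone widths, degrees
scenarios = list(product(severities, meteo, widths_deg))
print(len(scenarios))                   # 3 * 6 * 3 = 54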

Tawil, J.J.; Strenge, D.L.; Schultz, R.W. [Battelle Memorial Inst., Richland, WA (United States)

1987-11-01

102

An Accident Precursor Analysis Process Tailored for NASA Space Systems  

NASA Technical Reports Server (NTRS)

Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

2010-01-01

103

Accidents and injuries in competitive Enduro motorcyclists: a prospective analysis.  

PubMed

The Erzberg Rodeo Enduro motorcycle race was analyzed over three consecutive years to assess the risk of sustaining an accident, to determine the kind and site of injured body regions, and to correlate the incidence of accidents with the location on the race track, the hypothesis being that most accidents happen in the first and the last third. In this prospective field study, questionnaires were used to record fallen riders. Demographic data, the third of the race track on which the accident happened, and details of the accident itself, such as its mechanism, were noted. Each injured body region was recorded separately and rated according to the abbreviated injury scale. Two thousand nine hundred and twenty-three athletes started in the Erzberg Rodeo over the 3 years; 6% of them had an accident and 94% were non-professionals. Overall, the average abbreviated injury scale score was 2.8. More than 80% of all recorded injuries were superficial. Most accidents happened in a curve due to the front tire sliding, affecting mostly the arm and leg. Overall, 67% of all accidents happened on day 1 of the race and 41% of all accidents happened in the first third of the race track. Competitive Enduro motorcyclists have a high risk of sustaining an accident but, in comparison, a low risk of sustaining a severe injury. Curves are the predominant site of accidents. PMID:19183956

Sabeti-Aschraf, M; Serek, M; Pachtner, T; Geisler, M; Auner, K; Machinek, M; Funovics, Philipp; Goll, A; Schmidt, M

2009-06-01

104

An accident-severity analysis for a uniform-spacing headway policy  

Microsoft Academic Search

Future automatic highway systems should operate at high capacities (≥3600 vehicles/lane/h) over a range of highway speeds (13-26.8 m/s). Under such conditions it would be impossible to eliminate accidents. Here, a methodology to ascertain the severity of one especially critical accident, multivehicle collisions resulting from the emergency braking of a platoon of automatically controlled, closely spaced vehicles, is presented. This

J. Glimm; ROBERT E. FENTON

1980-01-01

105

An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)  

NASA Technical Reports Server (NTRS)

A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots and those flown by professional pilots. Private pilots, flying low-cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that are a result of exacting missions or use of specialized equipment. For both groups, judgement error is more likely to lead to a fatal accident than are other types of causes. Several approaches to improving the rotorcraft accident rate are recommended. These mostly address improving the training of new pilots and the safety awareness of private pilots.

Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

2002-01-01

106

Analysis of Waste Leak and Toxic Chemical Release Accidents from Waste Feed Delivery (WFD) Diluent System  

SciTech Connect

Radiological and toxicological consequences are calculated for 4 postulated accidents involving the Waste Feed Delivery (WFD) diluent addition systems. Consequences for the onsite and offsite receptor are calculated. This analysis contains technical information used to determine the accident consequences for the River Protection Project (RPP) Final Safety Analysis Report (FSAR).

WILLIAMS, J.C.

2000-09-15

107

Aircraft Accident Prevention: Loss-of-Control Analysis  

NASA Technical Reports Server (NTRS)

The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.

Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

2009-01-01

108

Radionuclide Analysis on Bamboos following the Fukushima Nuclear Accident  

PubMed Central

In response to contamination from the recent Fukushima nuclear accident, we conducted radionuclide analysis on bamboos sampled from six sites within a 25 to 980 km radius of the Fukushima Daiichi nuclear power plant. Maximum activity concentrations of radiocesium 134Cs and 137Cs in samples from Fukushima city, 65 km away from the Fukushima Daiichi plant, were in excess of 71 and 79 kBq/kg, dry weight (DW), respectively. In Kashiwa city, 195 km away from the Fukushima Daiichi, the sample concentrations were in excess of 3.4 and 4.3 kBq/kg DW, respectively. In Toyohashi city, 440 km away from the Fukushima Daiichi, the concentrations were below the measurable limits of up to 4.5 Bq/kg DW. In the radiocesium contaminated samples, the radiocesium activity was higher in mature and fallen leaves than in young leaves, branches and culms. PMID:22496858

Higaki, Takumi; Higaki, Shogo; Hirota, Masahiro; Akita, Kae; Hasezawa, Seiichiro

2012-01-01

109

Siting MSW landfills with a spatial multiple criteria analysis methodology  

Microsoft Academic Search

The present work describes a spatial methodology which comprises several methods from different scientific fields such as multiple criteria analysis, geographic information systems, spatial analysis and spatial statistics. The final goal of the methodology is to evaluate the suitability of the study region in order to optimally site a landfill. The initial step is the formation of the multiple criteria

Themistoklis D. Kontos; Dimitrios P. Komilis; Constantinos P. Halvadakis

2005-01-01

110

The accident analysis of mobile mine machinery in Indian opencast coal mines.  

PubMed

This paper presents an analysis of large mining machinery-related accidents in Indian opencast coal mines. The trends in coal production, the share of mining methods in production, machinery deployment in opencast mines, the size and population of machinery, accidents due to machinery, and the types and causes of accidents have been analysed for the years 1995 to 2008. The scrutiny of accidents during this period reveals that the most common responsible factors are machine reversal, haul road design, human fault, operator's fault, machine fault, visibility and dump design. Considering the types of machines, namely dumpers, excavators, dozers and loaders together, the maximum number of fatal accidents was caused by operator's faults and human faults jointly during the period from 1995 to 2008. The novel finding of this analysis is that large machines with state-of-the-art safety systems did not reduce fatal accidents in Indian opencast coal mines. PMID:23324038

Kumar, R; Ghosh, A K

2014-01-01

111

Recent advances in analysis of PWR containment bypass accidents  

Microsoft Academic Search

The Reactor Safety Study identified and quantified the contribution to off-site radiological risks of accident sequences at pressurized water reactors (PWRs) in which fission products may be released by bypassing the containment building. These so-called bypass accidents were also referred to as interfacing-systems loss-of-coolant accidents (LOCAs) or Event V sequences due to the postulated failure of

E. A. Warman; J. E. Metcalf; M. L. Donahue

1991-01-01

112

Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident  

NASA Astrophysics Data System (ADS)

The Loss-of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), especially in the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown process was completed, cooling, at a much smaller level than in normal operation, is needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, it will cause the reactor fuel and other core temperatures to increase and can lead to reactor core meltdown and release of radioactive material to the environment, as in the case of the Fukushima Dai-ichi nuclear accident. In this study, numerical simulation has been performed to calculate the pressure composition, water level and temperature distribution in the reactor during this accident. Two coolant regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valve (SRV) system. The average mass flow of steam to the IC system in this event was 10 kg/s, which could keep the reactor core from becoming uncovered for about 3.2 hours, with the core fully uncovered 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the Safety Relief Valves (SRV). The average mass flow of coolant corresponding to this event was 20 kg/s, which could keep the reactor core from becoming uncovered for about 73 hours, with the core fully uncovered 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the Reactor Core Isolation Cooling (RCIC) system, the High Pressure Coolant Injection (HPCI) system and the Safety Relief Valves (SRV). The average mass flow of water corresponding to this event was 15 kg/s, which could keep the reactor core from becoming uncovered for about 37 hours, with the core fully uncovered 40 hours later.
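
As a rough cross-check of the quoted steam flows, a one-line energy balance (all numbers below are order-of-magnitude assumptions for illustration, not the paper's simulation inputs):

h_fg = 1.5e6        # latent heat of vaporization near BWR pressure, J/kg (approx.)
m_water = 1.0e5     # water inventory above the core, kg (assumed)
q_decay = 30e6      # decay heat, roughly 1% of ~3 GW thermal, W (assumed)

boil_off_rate = q_decay / h_fg        # steam generation, kg/s
t_uncover = m_water / boil_off_rate   # seconds until the inventory boils away
print(boil_off_rate, t_uncover / 3600.0)  # ~20 kg/s and ~1.4 h for these assumptions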

Su'ud, Zaki; Anshari, Rio

2012-06-01

113

Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident  

SciTech Connect

The Loss-of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), specifically in the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling at a much smaller level than in normal operation is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, reactor fuel and core temperatures will rise, which can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulations were performed to calculate the pressure, water level and temperature distributions in the reactors during the accident. Two coolant regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average mass flow of steam to the IC system in this event was 10 kg/s, which kept the reactor core covered for about 3.2 hours, with full uncovery at about 4.7 hours. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow in this event was 20 kg/s, which kept the core covered for about 73 hours, with full uncovery at about 75 hours. Three coolant regulating systems were operational on reactor unit 3: the Reactor Core Isolation Cooling (RCIC) system, the High Pressure Coolant Injection (HPCI) system and the Safety Relief Valves (SRV). The average water mass flow in this event was 15 kg/s, which kept the core covered for about 37 hours, with full uncovery at about 40 hours.

Su'ud, Zaki; Anshari, Rio [Nuclear and Biophysics Research Group, Dept. of Physics, Bandung Institute of Technology, Jl. Ganesha 10, Bandung, 40132 (Indonesia)]

2012-06-06
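
The uncovery timings in the two records above come down to a balance between decay heat and the latent heat that a given steam or coolant mass flow can carry away. The following minimal Python sketch illustrates that balance; the pre-scram power, operating time, and latent heat of vaporization are assumed placeholder values, not figures from the paper.

```python
import math

P0 = 1380e6    # assumed pre-scram thermal power [W] (hypothetical value)
T_op = 3.15e7  # assumed prior operating time [s] (~1 year, hypothetical)
h_fg = 1.5e6   # assumed latent heat of vaporization at reactor pressure [J/kg]

def decay_power(t):
    """Way-Wigner style approximation of decay heat t seconds after scram."""
    return 0.066 * P0 * (t ** -0.2 - (t + T_op) ** -0.2)

for hours in (1.0, 3.2, 37.0, 73.0):
    t = hours * 3600.0
    q = decay_power(t)
    m_dot = q / h_fg  # steam flow needed to remove decay heat by boiling alone
    print(f"t = {hours:5.1f} h: decay heat ~ {q / 1e6:5.1f} MW, "
          f"required steam flow ~ {m_dot:4.1f} kg/s")
```

Under these assumptions the required flow an hour after scram is of the same order as the 10-20 kg/s flows quoted in the records, which is the essence of the core-uncovery timing estimates.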

114

PWR integrated safety analysis methodology using multi-level coupling algorithm  

NASA Astrophysics Data System (ADS)

Coupled three-dimensional (3D) neutronics/thermal-hydraulic (T-H) system codes give a unique opportunity for realistic modeling of plant transients and design basis accidents (DBA) occurring in light water reactors (LWR). Examples of such DBAs are the rod ejection accident (REA) and the main steam line break (MSLB), which constitute the bounding safety problems for pressurized water reactors (PWR). These accidents involve asymmetric 3D spatial neutronic and T-H effects during the course of the transients. The thermal margins (the peak fuel temperature and the departure from nucleate boiling ratio (DNBR)) are the measures of safety in a particular transient and need to be evaluated as accurately as possible. Modern coupled 3D neutronics/T-H codes estimate the safety margins coarsely on an assembly level, i.e. for an average fuel pin. More accurate prediction of the safety margins requires evaluation of the transient fuel rod response involving locally coupled neutronics/T-H calculations. The proposed approach is to perform an on-line hot-channel safety analysis not for the whole core but for a selected local region, for example the fuel assembly with the highest power loading. This approach becomes feasible if an on-line algorithm capable of extracting the necessary input data for a sub-channel module is available. The necessary input data include the detailed pin-power distributions and the T-H boundary conditions for each sub-channel in the considered problem. Therefore, two challenges are faced in the development of a refined methodology for evaluating local safety parameters. One is the development of an efficient transient pin-power reconstruction algorithm with consistent cross-section modeling. The second is the development of a multi-level coupling algorithm for the exchange of T-H boundary and feedback data between the sub-channel module and the main 3D neutron kinetics/T-H system code, which already uses one level of coupling between the 3D neutronics and core thermal-hydraulics models. The major accomplishment of the thesis is the development of an integrated PWR safety analysis methodology with locally refined safety evaluations. This involved the introduction of an improved method capable of efficiently restoring the fine pin-power distribution with a high degree of accuracy. In order to apply the methodology to evaluating the safety margins on a pin level, a refined on-line hot-channel model was developed that accounts for cross-flow effects. Finally, this methodology was applied in best-estimate safety analysis to calculate more accurately the thermal safety margins arising during a design basis accident in a PWR.

Ziabletsev, Dmitri Nickolaevich
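
The departure from nucleate boiling ratio discussed in the preceding record is the ratio of critical to local heat flux, minimized along the hot channel. A minimal sketch of that search follows; the chopped-cosine axial flux shape and the constant critical heat flux are hypothetical stand-ins for the correlations (e.g. of the W-3 type) that an actual sub-channel code would evaluate.

```python
import math

H = 3.66       # assumed heated length [m]
q_avg = 6.0e5  # assumed average heat flux in the hot channel [W/m^2]
q_chf = 2.0e6  # assumed constant critical heat flux [W/m^2]; real analyses
               # use correlations that vary with local conditions

def local_flux(z):
    """Chopped-cosine axial heat-flux profile (a common textbook shape)."""
    return q_avg * (math.pi / 2.0) * math.sin(math.pi * z / H)

# March up the channel and record the minimum DNBR = q_chf / q_local.
mdnbr = min(q_chf / local_flux(H * (i + 0.5) / 100.0) for i in range(100))
print(f"minimum DNBR along the hot channel ~ {mdnbr:.2f}")
```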

115

Method for risk analysis of nuclear reactor accidents  

Microsoft Academic Search

A method is developed for deriving a set of equations relating the public risk in potential nuclear reactor accidents to the basic variables, such as population distributions and radioactive releases, which determine the consequences of the accidents. The equations can be used to determine the risk for different values of the basic variables without the need of complex computer programs

M. Maekawa; N. C. Rasmussen; W. E. Jr. Vesely

1978-01-01

116

300-Area accident analysis for Emergency Planning Zones  

SciTech Connect

The Department of Energy has requested SRL assistance in developing offsite Emergency Planning Zones (EPZs) for the Savannah River Plant, based on projected dose consequences of atmospheric releases of radioactivity from potential credible accidents in the SRP operating areas. This memorandum presents the assessment of the offsite doses via the plume exposure pathway from the 300-Area potential accidents. 8 refs., 3 tabs.

Pillinger, W.L.

1983-06-27
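
Plume-exposure-pathway screening of the kind summarized above is often illustrated with a Gaussian plume model. The sketch below computes ground-level centerline air concentrations at several downwind distances; the release rate, effective height, wind speed, and power-law dispersion coefficients are all assumed values, and the conversion to dose is omitted.

```python
import math

Q = 1.0e9  # assumed release rate [Bq/s] (hypothetical)
u = 3.0    # assumed wind speed [m/s]
H = 30.0   # assumed effective release height [m]

def sigma_yz(x):
    """Rough power-law dispersion coefficients [m] (hypothetical fit)."""
    return 0.08 * x ** 0.9, 0.06 * x ** 0.85

def ground_conc(x):
    """Ground-level centerline concentration [Bq/m^3] at downwind distance
    x [m], including the ground-reflection term."""
    sy, sz = sigma_yz(x)
    return Q / (2.0 * math.pi * sy * sz * u) * 2.0 * math.exp(-H ** 2 / (2.0 * sz ** 2))

for x in (1000.0, 5000.0, 10000.0):
    print(f"x = {x / 1000:4.1f} km: air concentration ~ {ground_conc(x):.3e} Bq/m^3")
```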

117

The Methodology of Search Log Analysis  

E-print Network

; Wang, Berry, & Yang, 2003). Web search engine companies use search logs, and a three-stage process is proposed for analyzing a searching episode between a Web search engine and users searching for information on that Web search engine, in order to facilitate the use of search log analysis as a research methodology.

Jansen, James

118

Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.  

SciTech Connect

This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and to identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the experienced user base and the experimental validation base were decaying away quickly.

Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache – CEA, France)

2011-06-01

119

Zonal prices analysis supported by a data mining based methodology  

Microsoft Academic Search

A methodology based on data mining techniques to support the analysis of zonal prices in real transmission networks is proposed in this paper. This methodology uses clustering algorithms to group the buses into typical classes, each containing a set of buses with similar LMP values. Two different clustering algorithms have been used to determine the LMP clusters: the two-step

Judite Ferreira; Sérgio Ramos; Zita Vale; J. P. Soares

2010-01-01
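
As a toy illustration of the clustering step described above, the sketch below groups hypothetical buses by the similarity of their hourly locational marginal prices (LMPs). The data are randomly generated three-zone profiles, and a plain k-means routine stands in for the two-step and other algorithms the paper compares.

```python
import numpy as np

rng = np.random.default_rng(0)
# 60 hypothetical buses x 24 hourly LMP values, forming three loose price zones
lmp = np.vstack([rng.normal(mu, 2.0, size=(20, 24)) for mu in (30.0, 45.0, 60.0)])

def kmeans(X, k, iters=50):
    """Bare-bones Lloyd's algorithm: assign to nearest center, recompute means."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.vstack([X[labels == j].mean(axis=0) if np.any(labels == j)
                             else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(lmp, 3)
for j in range(3):
    print(f"class {j}: {np.sum(labels == j)} buses, mean LMP {centers[j].mean():.1f}")
```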

120

The GO-FLOW methodology  

SciTech Connect

A reliability analysis using the GO-FLOW methodology is given for the emergency core cooling system (ECCS) of a marine reactor experiencing either a collision or a grounding accident. The analysis is an example of a phased mission problem, and the system is a relatively large system with 90 components. An overview of the GO-FLOW methodology, a description of the ECCS, and the analysis procedure are given. Time-dependent mission unreliabilities under three accident conditions are obtained by one GO-FLOW chart with one computer run. The GO-FLOW methodology has proved to be a useful tool for probabilistic safety assessments of actual systems.

Matsuoka, T.; Kobayashi, M.; Takemura, K.

1989-03-01
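
GO-FLOW propagates success probabilities through charts of operators and signal lines; reproducing that machinery is beyond a snippet. The sketch below illustrates only the underlying phased-mission arithmetic that such tools automate, combining exponential component success probabilities differently in each phase, with made-up failure rates for a two-train cooling example.

```python
import math

lam_pump, lam_valve = 1.0e-4, 5.0e-5  # assumed failure rates [1/h] (made up)

def p_run(lam, t):
    """Probability a component is still working at time t (exponential model)."""
    return math.exp(-lam * t)

for t in (10.0, 100.0, 1000.0):
    # Phase 1 (injection): two redundant pumps, success if at least one runs
    p1 = 1.0 - (1.0 - p_run(lam_pump, t)) ** 2
    # Phase 2 (recirculation): one pump train in series with an isolation valve
    p2 = p_run(lam_pump, t) * p_run(lam_valve, t)
    print(f"t = {t:6.0f} h: phase-1 success {p1:.5f}, phase-2 success {p2:.5f}")
```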

121

Nonresponse analysis and adjustment in a mail survey on car accidents.  

PubMed

Statistical accident data play an important role in traffic safety development involving the road system, vehicle design, and driver education. Vehicle manufacturers use data from accident mail surveys as an integral part of the product development process. Low response rates have, however, led to concerns about whether estimates from a mail survey can be trusted as a source for making strategic decisions. The main objective of this paper was to investigate nonresponse bias in a mail survey addressing driver behaviour in accident situations. Insurance data, available for both respondents and nonrespondents, were used to analyze, as well as adjust for, nonresponse. Response propensity was investigated using descriptive statistics and logistic regression analyses. The survey data were then weighted using inverse propensity weights. Two specific examples of survey estimates are addressed, namely driver vigilance and driver distraction just before the accident. The results reveal that driver age and accident type were the most influential variables for nonresponse weighting. Driver gender and the size of the town where the driver resides also had some influence, but not for all survey variables investigated. The main conclusion of this paper is that nonresponse weighting can increase confidence in accident data collected by a mail survey, especially when response rates are low. Weighting has a moderate influence on this survey, but a larger influence may be expected if it is applied to a more diverse driver population. The development of auxiliary data collection can further improve accident mail survey methodology in the future. PMID:22664706

Tivesten, Emma; Jonsson, Sofia; Jakobsson, Lotta; Norin, Hans

2012-09-01
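
A minimal sketch of the adjustment described above: fit a response-propensity model on auxiliary variables known for both respondents and nonrespondents (here simulated stand-ins for insurance-register variables such as driver age and accident severity), then weight respondents by inverse propensity. All data, coefficients, and variable names are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
age = rng.uniform(18.0, 80.0, n)       # auxiliary variable (simulated)
severe = rng.integers(0, 2, n)         # accident severity 0/1 (simulated)
# Assumed mechanism: older drivers and severe accidents respond more often
p_resp = 1.0 / (1.0 + np.exp(-(-1.0 + 0.03 * age + 0.5 * severe)))
responded = rng.random(n) < p_resp
distracted = (rng.random(n) < 0.3 - 0.002 * age).astype(float)  # survey variable

X = np.column_stack([age, severe])
model = LogisticRegression().fit(X, responded)
w = 1.0 / model.predict_proba(X[responded])[:, 1]  # inverse propensity weights

print(f"naive respondent mean: {distracted[responded].mean():.3f}")
print(f"propensity-weighted mean: {np.average(distracted[responded], weights=w):.3f}")
print(f"full-population value: {distracted.mean():.3f}")
```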

122

Analysis of Construction Accidents in Turkey and Responsible Parties  

PubMed Central

Construction is one of the world’s biggest industries and includes jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports that were submitted to criminal and labour courts. These reports come from all regions of the country and cover the period 1972–2008. Accidents were classified by the consequence of the incident, the time and main causes of the accident, construction type, occupation of the victim, activity at the time of the accident, and the party responsible for the accident. Falls (54.1%), being struck by thrown or falling objects (12.9%), structural collapses (9.9%) and electrocutions (7.5%) occupy the first four places. Accidents were most likely between 15:00 and 17:00 (22.6%), between 10:00 and 12:00 (18.7%), and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and the acts of negligence that typically lead to accidents. Nearly two thirds of the faulty and negligent acts were carried out by employers; employees were responsible for almost one third of all cases. PMID:24077446

GÜRCANLI, G. Emre; MÜNGEN, Uğur

2013-01-01

123

Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1  

SciTech Connect

Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

Oztunali, O.I.; Roles, G.W.

1986-01-01

124

Analysis of Loss-of-Coolant Accidents in the NBSR  

SciTech Connect

This report documents calculations of the fuel cladding temperature during loss-of-coolant accidents in the NBSR. The probability of a pipe failure is small and procedures exist to minimize the loss of water and assure emergency cooling water flows into the reactor core during such an event. Analysis in the past has shown that the emergency cooling water would provide adequate cooling if the water filled the flow channels within the fuel elements. The present analysis is to determine if there is adequate cooling if the water drains from the flow channels. Based on photographs of how the emergency water flows into the fuel elements from the distribution pan, it can be assumed that this water does not distribute uniformly across the flow channels but rather results in a liquid film flowing downward on the inside of one of the side plates in each fuel element and only wets the edges of the fuel plates. An analysis of guillotine breaks shows the cladding temperature remains below the blister temperature in fuel plates in the upper section of the fuel element. In the lower section, the fuel plates are also cooled by water outside the element that is present due to the hold-up pan and temperatures are lower than in the upper section. For small breaks, the simulation results show that the fuel elements are always cooled on the outside even in the upper section and the cladding temperature cannot be higher than the blister temperature. The above results are predicated on assumptions that are examined in the study to see their influence on fuel temperature.

Baek J. S.; Cheng L.; Diamond, D.

2014-05-23

125

Protein MAS NMR methodology and structural analysis of protein assemblies  

E-print Network

Methodological developments and applications of solid-state magic-angle spinning nuclear magnetic resonance (MAS NMR) spectroscopy, with particular emphasis on the analysis of protein structure, are described in this thesis. ...

Bayro, Marvin J

2010-01-01

126

DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)  

SciTech Connect

This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work, primarily for enhanced geothermal systems (EGS), sponsored by the U.S. Department of Energy Geothermal Technologies Program.

Young, K. R.; Augustine, C.; Anderson, A.

2010-02-01

127

Modeling control room crews for accident sequence analysis  

E-print Network

This report describes a systems-based operating crew model designed to simulate the behavior of a nuclear power plant control room crew during an accident scenario. This model can lead to an improved treatment of potential ...

Huang, Y. (Yuhao)

1991-01-01

128

Risk analysis using a hybrid Bayesian-approximate reasoning methodology.  

SciTech Connect

Analysts are sometimes asked to make frequency estimates for specific accidents in which the accident frequency is determined primarily by safety controls. Under these conditions, frequency estimates rely on considerable expert belief in determining how the controls affect the accident frequency. To evaluate and document beliefs about control effectiveness, we have modified a traditional Bayesian approach by using approximate reasoning (AR) to develop prior distributions. Our method produces accident frequency estimates that separately express the probabilistic results produced in the Bayesian analysis and possibilistic results that reflect uncertainty about the prior estimates. Based on our experience using traditional methods, we feel that the AR approach better documents beliefs about the effectiveness of controls than if the beliefs are buried in Bayesian prior distributions. We have performed numerous expert elicitations in which probabilistic information was sought from subject matter experts not trained in probability. We find it much easier to elicit the linguistic variables and fuzzy set membership values used in AR than to obtain prior probability distributions directly from these experts, because the AR approach better captures their beliefs and better expresses their uncertainties.

Bott, T. F. (Terrence F.); Eisenhawer, S. W. (Stephen W.)

2001-01-01
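
The Bayesian half of the hybrid approach above is frequently a conjugate update of an accident-frequency prior against observed operating experience. A minimal gamma-Poisson sketch follows; the prior parameters stand in for what an AR-based elicitation might produce, and the event count and exposure are invented.

```python
# Gamma(alpha, beta) prior on frequency [events per hour]; Poisson likelihood.
alpha0, beta0 = 0.5, 2000.0   # assumed AR-informed prior (mean 2.5e-4 per hour)
events, hours = 1, 10000.0    # assumed operating experience

alpha, beta = alpha0 + events, beta0 + hours  # conjugate posterior update
print(f"prior mean frequency:     {alpha0 / beta0:.2e} per hour")
print(f"posterior mean frequency: {alpha / beta:.2e} per hour")
```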

129

Shuttle TPS thermal performance and analysis methodology  

NASA Technical Reports Server (NTRS)

Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

1983-01-01

130

Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis  

NASA Technical Reports Server (NTRS)

NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the VTP from the aircraft. Additionally, analysis results indicated that failure initiated at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads that were at a minimum 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appeared to perform in a manner consistent with its design and certification, and failure is attributed to VTP loads greater than expected.

Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

2005-01-01

131

Finite element analysis applied to dentoalveolar trauma: methodology description.  

PubMed

Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated. PMID:21991463

da Silva, B R; Moreira Neto, J J S; da Silva, F I; de Aguiar, A S W

2011-01-01

132

[Accidents between motorcycles: analysis of cases that occurred in the state of Paraná between July 2010 and June 2011].  

PubMed

Statistics on accidents between two motorcycles have been overlooked within the vast number of traffic accidents in Brazil, though they deserve closer analysis. This study sought to conduct an epidemiological analysis of accidents between two motorcycles, compared with other accidents, based on data from the state of Paraná. Information from the Fire Department site was collected for a period of one year (July 2010 to June 2011), covering the number and type of accident, day of the week, time, number of victims, gender, age and severity of injuries. Accidents involving two motorcycles represented 3.4% of traffic accidents and 6.2% of accidents involving motorcycles; the victims of these accidents accounted respectively for 4.4% of victims of traffic accidents and 8.5% of victims of motorcycle accidents. Accidents occurring on Saturdays and involving males aged between 20 and 29 were the most common. Among the ten most populated cities in the state, some revealed high rates of accidents between two motorcycles, which appears to be related to the total number of motorcycles in the cities concerned. Thus, constant analysis of these indices is essential, together with the implementation of measures to ensure safer highway traffic. PMID:23670451

Golias, Andrey Rogério Campos; Caetano, Rosângela

2013-05-01

133

ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS  

SciTech Connect

This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, ''K Basins Safety Analysis Report'' and Revision 4 of HNF-SD-SNF-TSR-001, ''Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins''. These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', for calculating onsite and offsite consequences. The accident analysis summary is given in Table ES-1 of the report. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

WILLIAMS, J.C.

2003-11-15

134

ATWS at Browns Ferry Unit One - accident sequence analysis  

SciTech Connect

This study describes the predicted response of Unit One at the Browns Ferry Nuclear Plant to a postulated complete failure to scram following a transient occurrence that has caused closure of all Main Steam Isolation Valves (MSIVs). This hypothetical event constitutes the most severe example of the type of accident classified as Anticipated Transient Without Scram (ATWS). Without the automatic control rod insertion provided by scram, the void coefficient of reactivity and the mechanisms by which voids are formed in the moderator/coolant play a dominant role in the progression of the accident. Actions taken by the operator greatly influence the quantity of voids in the coolant and the effect is analyzed in this report. The progression of the accident sequence under existing and under recommended procedures is discussed. For the extremely unlikely cases in which equipment failure and wrongful operator actions might lead to severe core damage, the sequence of emergency action levels and the associated timing of events are presented.

Harrington, R.M.; Hodge, S.A.

1984-07-01

135

Analysis and methodology for aeronautical systems technology program planning  

NASA Technical Reports Server (NTRS)

A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.

White, M. J.; Gershkoff, I.; Lamkin, S.

1983-01-01

136

Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence  

NASA Technical Reports Server (NTRS)

Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

2004-01-01

137

Radiochemical Analysis Methodology for uranium Depletion Measurements  

SciTech Connect

This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

Scatena-Wachel DE

2007-01-09

138

RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS  

EPA Science Inventory

The objective of these studies was to optimize conditions for computer assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...

139

Global-local methodologies and their application to nonlinear analysis  

NASA Technical Reports Server (NTRS)

An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.

Noor, Ahmed K.

1989-01-01

140

The Gaia Methodology for Agent-Oriented Analysis and Design  

Microsoft Academic Search

This article presents Gaia: a methodology for agent-oriented analysis and design. The Gaia methodology is both general, in that it is applicable to a wide range of multi-agent systems, and comprehensive, in that it deals with both the macro-level (societal) and the micro-level (agent) aspects of systems. Gaia is founded on the view of a multi-agent system as a

Michael Wooldridge; Nicholas R. Jennings; David Kinny

2000-01-01

141

A methodology for probabilistic fault displacement hazard analysis (PFDHA)  

USGS Publications Warehouse

We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.

Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H., III; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

2003-01-01
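
The first approach above mirrors probabilistic seismic hazard analysis: the annual rate of exceeding a displacement d sums, over sources, the earthquake rate times the conditional probability that surface displacement exceeds d. The sketch below evaluates that sum for two hypothetical sources under an assumed lognormal displacement model; every number is invented for illustration.

```python
import math

def norm_sf(x):
    """Standard normal survival function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# (annual event rate, P(rupture reaches the site), median displacement [m], log-std)
sources = [(1.0e-3, 0.3, 0.5, 0.7),
           (5.0e-5, 0.6, 2.0, 0.6)]

def rate_exceed(d):
    """Annual rate of fault displacement exceeding d metres at the site."""
    return sum(nu * p_rup * norm_sf(math.log(d / med) / sigma)
               for nu, p_rup, med, sigma in sources)

for d in (0.1, 0.5, 1.0):
    print(f"d = {d:4.1f} m: annual exceedance rate ~ {rate_exceed(d):.2e}")
```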

142

Sensitivity analysis of radionuclides atmospheric dispersion following the Fukushima accident  

NASA Astrophysics Data System (ADS)

Atmospheric dispersion models are used in response to accidental releases with two purposes: minimising population exposure during the accident, and complementing field measurements for the assessment of short- and long-term environmental and sanitary impacts. The predictions of these models are subject to considerable uncertainties of various origins. Notably, input data, such as meteorological fields or estimates of emitted quantities as a function of time, are highly uncertain. The case studied here is the atmospheric release of radionuclides following the Fukushima Daiichi disaster. The model used in this study is Polyphemus/Polair3D, from which IRSN's operational long-distance atmospheric dispersion model ldX derives. A sensitivity analysis was conducted in order to estimate the relative importance of a set of identified uncertainty sources. The complexity of this task was increased by four characteristics shared by most environmental models: high-dimensional inputs; correlated inputs or inputs with complex structures; high-dimensional output; and a multiplicity of purposes that require sophisticated and non-systematic post-processing of the output. The sensitivities of a set of outputs were estimated with the Morris screening method. The input ranking was highly dependent on the considered output. Yet a few variables, such as the horizontal diffusion coefficient or cloud thickness, were found to have a weak influence on most outputs and could be discarded from further studies. The sensitivity analysis procedure was also applied to indicators of model performance computed on a set of gamma dose rate observations. This original approach is of particular interest since observations could later be used to calibrate the probability distributions of the input variables. Indeed, only the variables that are influential on performance scores are likely to allow for calibration. An indicator based on time matching of emission peaks was developed in order to complement classical statistical scores, which were dominated by deposit dose rates and almost insensitive to lower-atmosphere dose rates. The substantial sensitivity of these performance indicators is auspicious for future calibration attempts and indicates that the simple perturbations used here may be sufficient to represent an essential part of the overall uncertainty.

Girard, Sylvain; Korsakissok, Irène; Mallet, Vivien

2014-05-01
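
The Morris method used above ranks inputs by elementary effects: finite-difference sensitivities sampled at many random base points and summarized by mu* (overall influence) and sigma (nonlinearity and interactions). A minimal radial one-at-a-time variant on a toy function follows; the function, dimension, and sample counts are arbitrary placeholders for the dispersion model.

```python
import numpy as np

def model(x):
    """Toy stand-in for the dispersion model: three active inputs, one weak."""
    return x[0] ** 2 + 2.0 * x[1] + x[1] * x[2] + 0.01 * x[3]

def elementary_effects(f, dim, r=100, delta=0.25, seed=0):
    """Radial one-at-a-time estimate of Morris elementary effects."""
    rng = np.random.default_rng(seed)
    ee = np.zeros((r, dim))
    for k in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=dim)  # random base point
        fx = f(x)
        for i in range(dim):
            xp = x.copy()
            xp[i] += delta                           # perturb one input only
            ee[k, i] = (f(xp) - fx) / delta
    return np.abs(ee).mean(axis=0), ee.std(axis=0)   # mu*, sigma

mu_star, sigma = elementary_effects(model, 4)
for i in range(4):
    print(f"input {i}: mu* = {mu_star[i]:.3f}, sigma = {sigma[i]:.3f}")
```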

143

Revisiting Methodological Issues in Transcript Analysis: Negotiated Coding and Reliability  

ERIC Educational Resources Information Center

Transcript analysis is an important methodology to study asynchronous online educational discourse. The purpose of this study is to revisit reliability and validity issues associated with transcript analysis. The goal is to provide researchers with guidance in coding transcripts. For validity reasons, it is suggested that the first step is to…

Garrison, D. R.; Cleveland-Innes, M.; Koole, Marguerite; Kappelman, James

2006-01-01

144

CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL  

SciTech Connect

Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and at cask-loading-specific conditions could be performed to demonstrate that release is within the allowable leak rates of the cask.

Vinson, D.

2010-07-11
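
The allowable release rates referenced above are commonly taken, per ANSI N14.5 and 10 CFR 71.51, as roughly A2 x 1e-6 per hour for normal transport and A2 per week for hypothetical accident conditions. The sketch below converts those limits into allowable volumetric leak rates for an assumed releasable activity concentration; the concentration is a made-up value, not the Al-SNF-specific source term developed in the report.

```python
A2_per_m3 = 0.05  # assumed releasable activity concentration [A2/m^3] (made up)

R_normal = 1.0e-6 / 3600.0        # allowable release rate [A2/s], normal transport
R_accident = 1.0 / (7 * 86400.0)  # allowable release rate [A2/s], accident

L_normal = R_normal / A2_per_m3       # allowable volumetric leak rate [m^3/s]
L_accident = R_accident / A2_per_m3
print(f"allowable leak rate, normal:   {L_normal:.2e} m^3/s")
print(f"allowable leak rate, accident: {L_accident:.2e} m^3/s")
```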

145

INVITED EDITORIAL: Uncertainties in probabilistic nuclear accident consequence analysis  

Microsoft Academic Search

National Radiological Protection Board, Chilton, Didcot, Oxon OX11 0RQ, UK. For all nuclear installations there is a small probability of an accident occurring which could lead to a release of radionuclides into the environment, despite the design intent to build the nuclear plant in such a way as to reduce that possibility to a low level. It is therefore important

M. P. Little

1998-01-01

146

Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion  

SciTech Connect

Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

2012-09-30

147

Analysis of containment performance and radiological consequences under severe accident conditions for the Advanced Neutron Source Reactor at the Oak Ridge National Laboratory  

SciTech Connect

A severe accident study was conducted to evaluate conservatively scoped source terms and radiological consequences to support the Advanced Neutron Source (ANS) Conceptual Safety Analysis Report (CSAR). Three different types of severe accident scenarios were postulated with a view to evaluating conservatively scoped source terms. The first scenario evaluates maximum possible steaming loads and the associated radionuclide transport, whereas the second is geared towards evaluating conservative containment loads from releases of radionuclide vapors and aerosols with the associated generation of combustible gases. The third scenario follows the prescriptions given by the 10 CFR 100 guidelines. It was included in the CSAR to demonstrate the site-suitability characteristics of the ANS. Various containment configurations were considered for the study of the thermal-hydraulic and radiological behaviors of the ANS containment. Severe accident mitigative design features, such as the use of rupture disks, were accounted for. This report describes the postulated severe accident scenarios, the methodology for analysis, modeling assumptions, the modeling of several severe accident phenomena, and the evaluation of the resulting source terms and radiological consequences.

Kim, S.H.; Taleyarkhan, R.P.

1994-01-01

148

THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT  

SciTech Connect

Surplus plutonium-bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for a long term of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.

Gupta, N.

2011-02-14
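
A crude way to see the effect of the longer, hotter facility fire described above is a lumped-capacitance transient in which the package heats toward the fire temperature by radiation and convection. Every geometric and material value below is hypothetical (the actual work used detailed finite-element models of the 9975), and the 86 figure is read as minutes.

```python
sigma = 5.67e-8               # Stefan-Boltzmann constant [W/m^2-K^4]
eps, h = 0.8, 10.0            # assumed emissivity and convective coefficient
A, m, cp = 2.0, 200.0, 500.0  # assumed area [m^2], mass [kg], specific heat [J/kg-K]
T_fire = (1500.0 - 32.0) * 5.0 / 9.0 + 273.15  # 1500 F fire, in kelvin

T = 300.0                     # assumed initial package temperature [K]
dt = 1.0
for _ in range(86 * 60):      # 86-minute facility fire, 1 s time steps
    q = eps * sigma * A * (T_fire ** 4 - T ** 4) + h * A * (T_fire - T)
    T += q * dt / (m * cp)    # explicit Euler update of the lumped temperature
print(f"lumped package temperature after 86 min ~ {T - 273.15:.0f} C")
```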

149

A Global Sensitivity Analysis Methodology for Multi-physics Applications  

SciTech Connect

Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once the most sensitive parameters are identified, research effort should be directed toward reducing their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be applied recursively to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details of each step are given using simple examples. Numerical results on large-scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.

Tong, C H; Graziani, F R

2007-02-02
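
After the screening step described above, the quantitative stage commonly estimates variance-based sensitivity indices. The sketch below applies the Saltelli pick-freeze Monte Carlo estimator of first-order Sobol indices to a toy function; it is a generic illustration, not the PSUADE implementation.

```python
import numpy as np

def f(X):
    """Toy response: input 0 dominates, input 1 matters, input 2 barely does."""
    return X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.01 * X[:, 2]

rng = np.random.default_rng(2)
n, d = 200_000, 3
A, B = rng.random((n, d)), rng.random((n, d))
fA, fB = f(A), f(B)
var = np.concatenate([fA, fB]).var()

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                      # replace only column i with B's
    Si = np.mean(fB * (f(ABi) - fA)) / var   # pick-freeze first-order estimator
    print(f"first-order Sobol index S{i} ~ {Si:.3f}")
```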

150

Scram discharge volume break studies accident sequence analysis  

SciTech Connect

This paper is a summary of a report describing the predicted response of Unit 1 at the Tennessee Valley Authority (TVA) Browns Ferry Nuclear Plant to a hypothetical small break loss of coolant accident (SBLOCA) outside of containment. The accident studied would be initiated by a break in the scram discharge volume (SDV) piping when it is pressurized to full reactor vessel pressure as a normal consequence of a reactor scram. If the scram could be reset, the scram outlet valves would close to isolate the SDV and the piping break from the reactor vessel. However, reset is possible only if the conditions that caused the scram have cleared; it has been assumed in this study that the scram signal remains in effect over a long period of time.

Harrington, R.M.; Hodge, S.A.

1982-01-01

151

NMR methodologies in the analysis of blueberries.  

PubMed

An NMR analytical protocol based on complementary high- and low-field measurements is proposed for blueberry characterization. Untargeted NMR metabolite profiling of blueberry aqueous and organic extracts, as well as targeted NMR analysis focused on anthocyanins and other phenols, are reported. Bligh-Dyer and microwave-assisted extractions were carried out and compared, showing a better recovery of the lipidic fraction in the case of the microwave procedure. Water-soluble metabolites belonging to different classes such as sugars, amino acids, organic acids, and phenolic compounds, as well as metabolites soluble in organic solvent such as triglycerides, sterols, and fatty acids, were identified. Five anthocyanins (malvidin-3-glucoside, malvidin-3-galactoside, delphinidin-3-glucoside, delphinidin-3-galactoside, and petunidin-3-glucoside) and 3-O-α-L-rhamnopyranosyl quercetin were identified in the solid phase extract. The water status of fresh and withered blueberries was monitored by portable NMR and fast-field-cycling NMR. (1)H depth profiles, T2 transverse relaxation times and dispersion profiles were found to be sensitive to withering. PMID:24668393

Capitani, Donatella; Sobolev, Anatoly P; Delfini, Maurizio; Vista, Silvia; Antiochia, Riccarda; Proietti, Noemi; Bubici, Salvatore; Ferrante, Gianni; Carradori, Simone; De Salvador, Flavio Roberto; Mannina, Luisa

2014-06-01

152

Recent advances in analysis of PWR containment bypass accidents  

SciTech Connect

The Reactor Safety Study identified and quantified the contribution to off-site radiological risks of accident sequences at pressurized water reactors (PWRs) in which fission products may be released by bypassing the containment building. These so-called bypass accidents were also referred to as interfacing systems loss-of-coolant accidents (LOCAs) or Event V sequences due to the postulated failure of valves separating the high-pressure reactor coolant system (RCS) from low-pressure piping located outside containment. Containment bypass sequence risks constitute a large fraction of the total pressurized water reactor (PWR) risk in NUREG-1150, in large part because estimates of competing risks from early containment failures have been greatly reduced since WASH-1400. Rigorous analyses of both steam generator tube rupture (SGTR) and V sequence bypass sequences result in reductions in fission product release to such an extent that in-containment sequences are expected to dominate PWR risks at levels substantially lower than reported in NUREG-1150. It is important that these findings be confirmed by other investigators, particularly in light of the NRC's ongoing study of the frequency of occurrence of interfacing systems LOCAs, based on extensive investigations at operating plants. Progress in this latter effort should be matched by progress in the knowledge and understanding of the progression of bypass sequences, once initiated.

Warman, E.A.; Metcalf, J.E.; Donahue, M.L. (Stone and Webster Engineering Corp., Boston, MA (United States))

1991-01-01

153

‘Doing’ health policy analysis: methodological and conceptual reflections and challenges  

PubMed Central

The case for undertaking policy analysis has been made by a number of scholars and practitioners. However, there has been much less attention given to how to do policy analysis, what research designs, theories or methods best inform policy analysis. This paper begins by looking at the health policy environment, and some of the challenges to researching this highly complex phenomenon. It focuses on research in middle and low income countries, drawing on some of the frameworks and theories, methodologies and designs that can be used in health policy analysis, giving examples from recent studies. The implications of case studies and of temporality in research design are explored. Attention is drawn to the roles of the policy researcher and the importance of reflexivity and researcher positionality in the research process. The final section explores ways of advancing the field of health policy analysis with recommendations on theory, methodology and researcher reflexivity. PMID:18701552

Walt, Gill; Shiffman, Jeremy; Schneider, Helen; Murray, Susan F; Brugha, Ruairi; Gilson, Lucy

2008-01-01

154

Advanced Power Plant Development and Analysis Methodologies  

SciTech Connect

Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

2006-06-30

155

The Data Analysis In Epidemiology: A Methodology To Reduce Complexity  

NASA Astrophysics Data System (ADS)

The paper represents a trial of defining and testing a systematic methodology to study medical phenomena in which large sets of variables of an essentially qualitative kind are involved. The methodology consists of the organized application of the various methods of data analysis and multivariate statistics to epidemiology, in order to analyze phenomena that are very complex in the number and variety of their variables, to evaluate the influence of the environment, and to understand the evolution of the phenomenon. The data analysis techniques play the role both of tools for synthesizing and validating indexes and of methods for formulating models of disease propagation, by defining the prognostic function. The methodology is applied to define an epidemiological model of essential arterial hypertension.

Esposito, F.

1982-11-01

156

GO-FLOW; A new reliability analysis methodology  

SciTech Connect

In this paper a new reliability analysis methodology, GO-FLOW, is presented. Detailed explanations and two examples of GO-FLOW analysis are given. GO-FLOW is a success-oriented system analysis technique. The modeling technique produces the GO-FLOW chart, which is composed of operators and signal lines and represents a function of the system. The examples of analysis show the applicability of the GO-FLOW method to a phased mission problem and to a time-dependent unavailability analysis.

Matsuoka, T.; Kobayashi, M. (Ship Research Inst., Tokyo (Japan))

1988-01-01

157

A Methodological Analysis of Self-Control in Applied Settings.  

ERIC Educational Resources Information Center

A methodological analysis presents 26 self-control investigations (self-instructions, self-determination of contingencies, self-evaluation, self-goal-setting, and self-determination and/or administration of external reinforcement) conducted in classroom settings with learning and/or behaviorally handicapped students. Studies compare subjects and…

Abion, Fred M.

1983-01-01

158

Development of The Purdue Cognitive Job Analysis Methodology  

Microsoft Academic Search

The objective of this article is to develop a cognitive job and task analysis methodology that not only analyzes jobs and tasks, but also provides a mechanism for improving cognitive job and task performance. Two phases were used to achieve this objective. The 1st phase developed a human-centered cognitive performance (HCCP) model based on human information processing. To quantitatively

June Wei; Gavriel Salvendy

2000-01-01

159

Interpersonal Dynamics in a Simulated Prison: A Methodological Analysis  

ERIC Educational Resources Information Center

A critical overview is presented of the Stanford Prison Experiment, conducted by Zimbardo and his coinvestigators in which they attempted a structural analysis of the problems of imprisonment. Key assumptions are questioned, primarily on methodological grounds, which casts doubts on the plausibility of the experimenters' final causal inferences.…

Banuazizi, Ali; Movahedi, Siamak

1975-01-01

160

Hazardous materials transportation: a risk-analysis-based routing methodology  

Microsoft Academic Search

This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed by nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After short
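
A minimal sketch of the graph formulation the abstract describes: arcs carry a risk cost, and the best route minimizes accumulated risk. The toy network, node names, and costs below are hypothetical; a real application would derive arc costs from frequency and consequence analysis.

```python
# A minimal sketch (assumptions: toy graph, additive per-arc risk costs) of
# selecting a minimum-risk route on a network of nodes and arcs.
import heapq

def min_risk_route(graph, source, target):
    """graph: {node: [(neighbor, risk_cost), ...]}; returns (risk, path)."""
    queue = [(0.0, source, [source])]
    seen = set()
    while queue:
        risk, node, path = heapq.heappop(queue)
        if node == target:
            return risk, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, cost in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (risk + cost, nxt, path + [nxt]))
    return float("inf"), []

roads = {"depot": [("A", 2.1), ("B", 0.7)],
         "A": [("plant", 0.5)],
         "B": [("plant", 2.5)]}
print(min_risk_route(roads, "depot", "plant"))  # -> (2.6, ['depot', 'A', 'plant'])
```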

P. Leonelli; S. Bonvicini; G. Spadoni

2000-01-01

161

Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations  

SciTech Connect

This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

1996-12-01

162

Proposed methodology for completion of disruptive scenario analysis  

SciTech Connect

This report presents the methodology to complete an assessment of post-closure performance under disruptive scenario conditions for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening disruption scenarios, and for then assessing the consequences of the screened scenarios. The results of the disruption scenario analysis and the nominal-case performance assessment are combined to comprehensively determine system performance for evaluation of compliance with post-closure performance criteria. 49 refs., 24 figs., 1 tab.

Davis, J.D.

1984-01-01

163

Radioactivity analysis following the Fukushima Dai-ichi nuclear accident.  

PubMed

A total of 118 samples were analyzed using HPGe γ-spectrometry. (131)I, (134)Cs, (137)Cs and (136)Cs were detected in aerosol air samples that were collected 22 days after the accident, with values of 1720 µBq m(-3), 247 µBq m(-3), 289 µBq m(-3) and 23 µBq m(-3), respectively. (131)I was detected in rainwater and soil samples and was also measurable in vegetables collected between April 2 and 13, 2011, with values ranging from 0.55 Bq kg(-1) to 2.68 Bq kg(-1). No (131)I was detected in milk, drinking water, seawater or marine biota samples. PMID:23685724

Tuo, Fei; Xu, Cuihua; Zhang, Jing; Zhou, Qiang; Li, Wenhong; Zhao, Li; Zhang, Qing; Zhang, Jianfeng; Su, Xu

2013-08-01

164

A Comparative Analysis of Methodologies for Database Schema Integration  

E-print Network

One of the fundamental principles of the database approach is that a database allows a nonredundant, unified representation of all data managed in an organization. This is achieved only when methodologies are available to support integration across organizational and application boundaries. Methodologies for database design usually perform the design activity by separately producing several schemas, representing parts of the application, which are subsequently merged. Database schema integration is the activity of integrating the schemas of existing or proposed databases into a global, unified schema. The aim of the paper is to provide first a unifying framework for the problem of schema integration, then a comparative review of the work done thus far in this area. Such a framework, with the associated analysis of the existing approaches, provides a basis for identifying strengths and weaknesses of individual methodologies, as well as general guidelines for future improvements and extensions.

C. Batini; M. Lenzerini; S. B. Navathe

1986-01-01

165

Methodological Variability Using Electronic Nose Technology For Headspace Analysis  

NASA Astrophysics Data System (ADS)

Since the idea of electronic noses was first published, numerous electronic nose (e-nose) developments and applications have been used in analyzing solid, liquid and gaseous samples in the food and automotive industries and for medical purposes. However, little is known about the methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, the use of different filters, and changes in mass flow rates is described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

Knobloch, Henri; Turner, Claire; Spooner, Andrew; Chambers, Mark

2009-05-01

166

Two methodologies for optical analysis of contaminated engine lubricants  

NASA Astrophysics Data System (ADS)

The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, have been intensively utilized in the engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase the drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant-object optical appearance, where the a priori known periodic structure of the object is distorted by the contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray-scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient, are newly proposed. Variations in the contaminant concentration and the use of different contaminants lead to changes in these parameters, measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectra, transfer function, coherence function, etc.) are used for the analysis of combined object-lubricant images. Both proposed methodologies utilize the comparison of measured parameters and calculated object shape-based and statistical characteristics for fresh and contaminated lubricants. The developed methodologies are verified experimentally, showing an ability to distinguish lubricants with 0%, 3%, 7% and 10% water and coolant contamination. This demonstrates the potential applicability of the developed methodologies for on-line measurement, monitoring and control of the engine lubricant condition.
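
As a rough illustration of the statistical branch of the analysis, the sketch below compares a known periodic object signal with a distorted version of it using a normalized cross-correlation. The 1-D signals and noise model are synthetic stand-ins for the combined lubricant-object images, not the paper's data.

```python
# A minimal sketch: compare a known periodic "object" with its appearance
# through a contaminating film using normalized cross-correlation.
import numpy as np

x = np.linspace(0.0, 8.0 * np.pi, 512)
reference = np.sign(np.sin(x))                          # known periodic object
rng = np.random.default_rng(7)
distorted = reference + 0.6 * rng.normal(size=x.size)   # film-induced distortion

def normalized_xcorr(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full") / len(a)

peak = normalized_xcorr(reference, distorted).max()
print(f"peak normalized cross-correlation: {peak:.2f}")  # drops as contamination rises
```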

Aghayan, Hamid; Bordatchev, Evgueni; Yang, Jun

2012-01-01

167

Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.  

PubMed

Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

2006-09-01

168

Assessment of groundwater contamination resulting from a major accident in land nuclear power plants (LNPP), I: Concepts and methodology  

NASA Astrophysics Data System (ADS)

Hydrological site suitability is examined on the basis of potential groundwater pollution associated with major hypothetical accidents of reasonable probability. A Loss of Coolant Accident (LOCA) is considered here as the Maximum Design Basis Event in nuclear power plants. Two alternative nuclide paths resulting in groundwater contamination are considered: (a) core penetration through the basement, possibly bringing a major part of the nuclide inventory of the reactor into direct contact with underlying groundwater, or alternatively (b) major nuclide releases to the atmosphere, resulting in their wide spread as fallout, thus endangering the exploitability of underlying aquifers over large areas. These are commonly referred to as point-source and diffuse-source contamination, respectively. Contamination analyses related to the point-source scenario are derived from known analytical solutions of the convection-dispersion differential equation for absorbable and decaying species.
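
For the point-source scenario, a representative closed-form solution of this type is the classical result for one-dimensional convection-dispersion with a continuous source C₀ at x = 0, shown here for a conservative tracer; the absorbable, decaying species treated in the paper add retardation and first-order decay terms to this form.

```latex
\frac{\partial C}{\partial t} = D\,\frac{\partial^2 C}{\partial x^2} - v\,\frac{\partial C}{\partial x},
\qquad
C(x,t) = \frac{C_0}{2}\left[\operatorname{erfc}\!\left(\frac{x - v t}{2\sqrt{D t}}\right)
+ \exp\!\left(\frac{v x}{D}\right)\operatorname{erfc}\!\left(\frac{x + v t}{2\sqrt{D t}}\right)\right]
```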

Mercado, Abraham

1989-12-01

169

Risk Analysis Methodology for Kistler's K-1 Reusable Launch Vehicle  

NASA Astrophysics Data System (ADS)

Missile risk analysis methodologies were originally developed in the 1940s as the military experimented with intercontinental ballistic missile (ICBM) technology. As the range of these missiles increased, it became apparent that some means of assessing the risk posed to neighboring populations was necessary to gauge the relative safety of a given test. There were many unknowns at the time, and technology was unpredictable at best. Risk analysis itself was in its infancy. Uncertainties in technology and methodology led to an ongoing bias toward conservative assumptions to adequately bound the problem. This methodology ultimately became the Casualty Expectation Analysis that is used to license Expendable Launch Vehicles (ELVs). A different risk analysis approach was adopted by the commercial aviation industry in the 1950s. At the time, commercial aviation technology was more firmly in hand than ICBM technology. Consequently commercial aviation risk analysis focused more closely on the hardware characteristics. Over the years, this approach has enabled the advantages of technological and safety advances in commercial aviation hardware to manifest themselves in greater capabilities and opportunities. The Boeing 777, for example, received approval for trans-oceanic operations "out of the box," where all previous aircraft were required, at the very least, to demonstrate operations over thousands of hours before being granted such approval. This "out of the box" approval is likely to become standard for all subsequent designs. In short, the commercial aircraft approach to risk analysis created a more flexible environment for industry evolution and growth. In contrast, the continued use of the Casualty Expectation Analysis by the launch industry is likely to hinder industry maturation. It likely will cause any safety and reliability gains incorporated into RLV design to be masked by the conservative assumptions made to "bound the problem." Consequently, for the launch industry to mature, a different approach to RLV risk analysis must be adopted. This paper presents such a methodology for Kistler's K-1 reusable launch vehicle, developing an approach to risk analysis that represents an amalgamation of the two approaches. This methodology provides flexibility to the launch industry that will enable the regulatory environment to more efficiently accommodate new technologies and approaches. The paper also derives an appropriate assessment threshold that is the equivalent of the currently accepted 30-in-a-million casualty expectation.
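
For reference, the casualty expectation criterion mentioned at the end of the abstract is conventionally computed along the following lines (a generic, hedged form; the symbols are illustrative rather than Kistler's):

```latex
E_c \;=\; \sum_{i} P_i \sum_{k} \rho_k \, A_{c,ik} \;\le\; 30 \times 10^{-6}
```

where P_i is the probability of failure mode i, ρ_k is the population density of region k, and A_{c,ik} is the casualty area produced by mode i in region k.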

Birkeland, Paul W.

2002-01-01

170

RELAP5 Application to Accident Analysis of the NIST Research Reactor  

SciTech Connect

Detailed safety analyses have been performed for the 20 MW D{sub 2}O moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent behavior of the primary system is determined with a RELAP5 transient analysis model that includes the reactor vessel, the pump, the heat exchanger, the fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. Post-processing of the simulation results has been conducted to evaluate the minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and that no damage to the fuel will occur, because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.
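
The CHFR post-processing step reduces to finding the smallest ratio of correlated critical heat flux to local heat flux along the channel, as in the sketch below. The correlation shown is a placeholder, not the Sudo-Kaminaga form used in the paper, and the profile values are illustrative.

```python
# A minimal post-processing sketch (not the NBSR model): the minimum critical
# heat flux ratio (MCHFR) is the smallest margin between a CHF correlation and
# the local heat flux along the channel. chf_correlation() is a placeholder
# standing in for the Sudo-Kaminaga correlation, whose full form is omitted.
import numpy as np

def chf_correlation(mass_flux, subcooling):
    # Placeholder monotone fit; substitute the actual correlation terms here.
    return 1.0e6 + 2.0e3 * mass_flux + 5.0e3 * subcooling  # W/m^2

def min_chfr(local_heat_flux, mass_flux, subcooling):
    q_crit = chf_correlation(mass_flux, subcooling)
    return np.min(q_crit / local_heat_flux)

z = np.linspace(0.0, 0.6, 25)                     # axial nodes (m)
q_local = 8.0e5 * np.sin(np.pi * z / 0.6) + 1e4   # chopped-cosine-like profile
print(f"MCHFR = {min_chfr(q_local, 3000.0, 20.0):.2f}")
```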

Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.; Diamond, D.

2012-03-18

171

Atmospheric transport analysis used in hazard screening methodology  

SciTech Connect

Simple, but conservative, atmospheric transport models are used in the initial stages of a hazard screening methodology to determine a preliminary hazard rank. The hazard rank is one indicator of the additional effort, if any, that must be applied to determine if a system is safe. Simple methods avoid prolonged calculations at this early stage when details of potential accidents may be poorly defined. The models are used to simulate the consequences resulting from accidental releases of toxic substances. Instantaneous and constant-rate releases are considered. If a release takes place within a relatively small enclosure, the close-in transport is approximated by assuming the airborne material is instantaneously mixed with the volume of air within this enclosure. For all other situations and large distances, the transport is estimated with simple atmospheric dispersion models using published values of dispersion coefficients for large distances, and values based on turbulent diffusion theory for close-in distances. Consequences are assessed by defining exposure levels that are equivalent to negligible, reversible, and irreversible health effects. The hazard rank is related to the number and location of people within each category of health effects.
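
A minimal sketch of the kind of simple, conservative estimate described: a Gaussian plume for a constant-rate, ground-level release with total ground reflection. The dispersion-coefficient power-law fits are hypothetical placeholders for the published values the abstract cites.

```python
# A minimal Gaussian plume sketch for a continuous ground-level release.
import numpy as np

def gaussian_plume(Q, u, y, z, sigma_y, sigma_z):
    """Concentration (kg/m^3) at crosswind offset y, height z.
    Q: release rate (kg/s); u: wind speed (m/s); ground reflection included."""
    return (Q / (2.0 * np.pi * u * sigma_y * sigma_z)
            * np.exp(-y**2 / (2.0 * sigma_y**2))
            * 2.0 * np.exp(-z**2 / (2.0 * sigma_z**2)))

x = 500.0                       # downwind distance (m)
sigma_y = 0.08 * x**0.9         # hypothetical stability-class fit
sigma_z = 0.06 * x**0.85
chi = gaussian_plume(Q=0.1, u=2.0, y=0.0, z=0.0,
                     sigma_y=sigma_y, sigma_z=sigma_z)
print(f"Centerline concentration at {x:.0f} m: {chi:.3e} kg/m^3")
```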

Bloom, S.G.

1992-06-29

172

Frontal Impact Analysis of Human Skull for Accident Reconstruction  

E-print Network

Abstract: Road traffic accidents, falls and assault cases are the most frequently cited causes of head impact. Predictive indices of human head impact have been developed since the 1960s to help the investigation of human head injury. Approaches using the finite element method can provide interesting tools for forensic scientists and car crash evaluations to evaluate various human head injury mechanisms. In this study, a full frontal car collision is reconstructed using the finite element method to predict skull fractures. Details from the traffic and collision reports are extracted to create the computational environment. Results show that the frontal impact of the human skull involved contact between the skull and the windshield. Fractures are predicted at the respective velocity and initiated at the impact contact area.

173

Content Analysis—A Methodological Primer for Gender Research  

Microsoft Academic Search

This article is intended to serve as a primer on methodological standards for gender scholars pursuing content analytic research. The scientific underpinnings of the method are explored, including the roles of theory, past research, population definition, objectivity/intersubjectivity, reliability, validity, generalizability, and replicability. Both human coding and computer coding are considered. The typical process of human-coded content analysis is reviewed, including

Kimberly A. Neuendorf

2011-01-01

174

Fluid-structure interaction analysis of a hypothetical core disruptive accident in LMFBRs  

Microsoft Academic Search

To ensure safety, it is necessary to assess the integrity of the reactor vessel of a liquid-metal fast breeder reactor (LMFBR) under a hypothetical core disruptive accident (HCDA). Several important problems for a fluid-structure interaction analysis of an HCDA are discussed in the present paper. Various loading models of the HCDA are compared, and the polytropic process of ideal gas (PPIG) law is recommended.

Chuang Liu; Xiong Zhang; Ming-Wan Lu

2005-01-01

175

Natural Language Processing (NLP) tools for the analysis of incident and accident reports  

E-print Network

In this project, we use NLP methods to facilitate experience feedback in the field of civil aviation safety. In this paper, we present how NLP methods based on the extraction of textual information from the Air France ASR

Paris-Sud XI, Université de

176

Parametric analysis of the failure of a VVER-440 vessel during a serious accident  

Microsoft Academic Search

Analysis of the integrity of the reactor vessel under accident conditions with partial or complete destruction of the core is a key aspect of validating the safety of a reactor facility with VVER. When cooling is lost for a prolonged period of time, movement and accumulation of fused fragments of the core at the bottom of the vessel formation of

V. D. Loktionov

2007-01-01

177

An Efficient Analysis Methodology for Fluted-Core Composite Structures  

NASA Technical Reports Server (NTRS)

The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture behavior that would normally require solid elements to capture the detailed mechanical response of the structure. The shell thicknesses and offsets using this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until this model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.

Oremont, Leonard; Schultz, Marc R.

2012-01-01

178

Analysis of Reactivity Induced Accident for Control Rods Ejection with Loss of Cooling  

E-print Network

Understanding the time-dependent behavior of the neutron population in a nuclear reactor, in response to either a planned or unplanned change in the reactor conditions, is of great importance to the safe and reliable operation of the reactor. In the present work, the point kinetics equations are solved numerically using the stiffness confinement method (SCM). The solution is applied to the kinetics equations in the presence of different types of reactivities and is compared with different analytical solutions. This method is also used to analyze reactivity-induced accidents in two reactors. The first reactor is fueled by uranium and the second is fueled by plutonium. This analysis presents the effect of negative temperature feedback in counteracting the positive reactivity added by ejected control rods, limiting the consequences of a control rod ejection accident and damage to the reactor. Both the power and temperature pulses following the reactivity-initiated accidents are calculated. The results are compared with previous works and...
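
A minimal sketch of the point kinetics system being solved, with one delayed-neutron group, a step reactivity insertion, and a linear temperature feedback. The paper uses the stiffness confinement method, whereas this sketch simply hands the stiff system to an implicit ODE solver; all parameter values are illustrative.

```python
# One-delayed-group point kinetics with temperature feedback (illustrative).
from scipy.integrate import solve_ivp

beta, Lam, lam = 0.0065, 1.0e-4, 0.08   # delayed fraction, generation time, precursor decay
alpha_T, K = -5.0e-5, 0.05              # feedback coefficient, heating constant
rho_ext = 0.003                         # step reactivity insertion (absolute units)

def kinetics(t, y):
    n, c, T = y                         # relative power, precursor density, temperature rise
    rho = rho_ext + alpha_T * T         # net reactivity including feedback
    dn = (rho - beta) / Lam * n + lam * c
    dc = beta / Lam * n - lam * c
    dT = K * n                          # adiabatic heat-up, arbitrary units
    return [dn, dc, dT]

y0 = [1.0, beta / (Lam * lam), 0.0]     # precursors in equilibrium at n = 1
sol = solve_ivp(kinetics, (0.0, 10.0), y0, method="Radau", rtol=1e-8)
print(f"peak power: {sol.y[0].max():.1f}, final temperature rise: {sol.y[2, -1]:.1f}")
```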

Saad, Hend Mohammed El Sayed; Wahab, Moustafa Aziz Abd El

2013-01-01

179

Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices  

SciTech Connect

This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

Not Available

1988-12-15

180

Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 1: Sections 1-9  

SciTech Connect

This report documents the methodology, computational framework, and results of facility accident analyses performed for the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. The methodology is in compliance with the most recent guidance from DOE. It considers the spectrum of accident sequences that could occur in activities covered by the WM PEIS and uses a graded approach emphasizing the risk-dominant scenarios to facilitate discrimination among the various WM PEIS alternatives. Although it allows reasonable estimates of the risk impacts associated with each alternative, the main goal of the accident analysis methodology is to allow reliable estimates of the relative risks among the alternatives. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J. [and others

1995-04-01

181

Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing  

NASA Technical Reports Server (NTRS)

Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

2014-01-01

182

ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES.  

SciTech Connect

Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model embedment effects when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at the Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than would normally be allowed by standard practice.

XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C. (BNL); GRAVES, H. (US NRC).

2005-07-01

183

Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components  

NASA Technical Reports Server (NTRS)

The NASA Aircraft Structural Integrity (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack tip opening angle (CTOA) fracture criterion to characterize the fracture behavior and a materially and geometrically nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thicknesses of 0.04 inches to 0.09 inches. The critical CTOA and the corresponding plane-strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane-stress analysis were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch diameter pressurized cylinders, and full-scale curved stiffened panels subjected to internal pressure and mechanical loads.

Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.

2000-01-01

184

An integrated risk analysis methodology in a multidisciplinary design environment  

NASA Astrophysics Data System (ADS)

Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed by including and managing uncertainties are more robust and less prone to poor operations as a result of parameter variability. The quantification, analysis and mitigation of uncertainties are challenging tasks, as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data are based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered for a multidisciplinary, systems-level risk analysis. This research synthesizes an integrated approach for developing a risk analysis method. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with a Latin Hypercube Sampling - Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules include the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties, or response optimization. This final module also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary environment consists of the interface between configuration and sizing analysis outputs and aerodynamic parameter computations. Uncertainties are analyzed for both the simulation tools and their associated input parameters. Uncertainties are then propagated across the design environment, and a robust design optimization is performed over the range of a critical input parameter. The results of this research indicate that including uncertainties in design processes may require modification of design constraints previously considered acceptable in deterministic analyses.
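
The core propagation step can be sketched as follows: build a Latin Hypercube design, map it through input ranges, and push the samples through a system model to obtain output statistics. The variable names, ranges, and the model function below are hypothetical placeholders, not those of the Langley study.

```python
# A minimal sketch of Latin Hypercube Sampling feeding Monte Carlo propagation.
import numpy as np

def latin_hypercube(n_samples, n_vars, rng):
    """One stratified sample per equal-probability bin, per variable."""
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])             # decouple the strata across variables
    return u                             # uniform(0,1) LHS design

rng = np.random.default_rng(42)
u = latin_hypercube(1000, 2, rng)
drag_coeff = 0.30 + 0.05 * u[:, 0]       # hypothetical input ranges
dry_mass = 21000.0 + 2000.0 * u[:, 1]

payload = 5000.0 - 8000.0 * drag_coeff - 0.05 * dry_mass   # placeholder model
print(f"payload: mean={payload.mean():.0f}, 5th pct={np.percentile(payload, 5):.0f}")
```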

Hampton, Katrina Renee

185

Development of test methodology for dynamic mechanical analysis instrumentation  

NASA Technical Reports Server (NTRS)

Dynamic mechanical analysis instrumentation was used for the development of specific test methodology in the determination of engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperature in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiberglass-supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

Allen, V. R.

1982-01-01

186

Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines  

PubMed Central

Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

Baka, Aikaterini D.; Uzunoglu, Nikolaos K.

2014-01-01

187

Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines.  

PubMed

Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

Baka, Aikaterini D; Uzunoglu, Nikolaos K

2014-09-01

188

SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident  

NASA Astrophysics Data System (ADS)

On March 11th, 2011, a high-magnitude earthquake and consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in duration and extent. After scram started at all power stations affected by the earthquake, diesel generators began operation as designed, until tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to a station blackout condition in units 1 to 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules to account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools, and we demonstrate how this work can be beneficial to nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study of the alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.

Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

2014-06-01

189

76 FR 30139 - Federal Need Analysis Methodology for the 2012-2013 Award Year  

Federal Register 2010, 2011, 2012, 2013, 2014

...DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2012-2013 Award Year AGENCY...revision of the Federal Need Analysis Methodology for the 2012-2013 award year...84.379]. Federal Need Analysis Methodology for the 2012-2013 award year;...

2011-05-24

190

Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
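
A minimal sketch of the sampling-based sensitivity machinery named in the abstract: rank-transform the sampled inputs and the output, then regress ranks on ranks so that strong (possibly nonlinear but monotone) contributors stand out. The inputs and response below are synthetic stand-ins, not MACCS variables.

```python
# Rank regression on a sampled design, in the spirit of the MACCS studies.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X = rng.random((200, 3))                          # stand-in sampled inputs
y = 5 * X[:, 0] + 0.5 * X[:, 1] ** 3 + 0.1 * rng.random(200)   # toy response

Xr = np.apply_along_axis(stats.rankdata, 0, X)    # rank-transform each input
yr = stats.rankdata(y)
A = np.column_stack([np.ones(len(yr)), Xr])
coef, *_ = np.linalg.lstsq(A, yr, rcond=None)     # rank regression
for j, b in enumerate(coef[1:]):
    print(f"input {j}: rank-regression coefficient ~ {b:.2f}")
```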

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D.; Rollstin, J.A. [GRAM, Inc., Albuquerque, NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

191

A Posteriori Analysis for Hydrodynamic Simulations Using Adjoint Methodologies  

SciTech Connect

This report contains results of analysis done during an FY08 feasibility study investigating the use of adjoint methodologies for a posteriori error estimation for hydrodynamics simulations. We developed an approach to adjoint analysis for these systems through use of modified equations and viscosity solutions. Targeting first the 1D Burgers equation, we include a verification of the adjoint operator for the modified equation for the Lax-Friedrichs scheme, then derivations of an a posteriori error analysis for a finite difference scheme and a discontinuous Galerkin scheme applied to this problem. We include some numerical results showing the use of the error estimate. Lastly, we develop a computable a posteriori error estimate for the MAC scheme applied to stationary Navier-Stokes.
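
The generic identity behind such adjoint-based estimates can be stated schematically as follows (a hedged sketch, not the report's exact derivation): the error in an output functional J is approximated by weighting the computed residual with the solution φ of the adjoint problem,

```latex
J(u) - J(u_h) \;\approx\; \big(R(u_h),\,\phi\big),
\qquad
L^{*}\phi = \left.\frac{\partial J}{\partial u}\right|_{u_h}
```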

Woodward, C S; Estep, D; Sandelin, J; Wang, H

2009-02-26

192

Defect analysis methodology for contact hole grapho epitaxy DSA  

NASA Astrophysics Data System (ADS)

Next-generation lithography technology is required to meet the needs of advanced design nodes. Directed Self-Assembly (DSA) is gaining momentum as an alternative or complementary technology to EUV lithography. We investigate defectivity in 2x-nm patterning of contacts, for contact hole assembly of 25 nm or less, by grapho-epitaxy DSA technology with guide patterns printed using immersion ArF negative tone development. This paper discusses the development of an analysis methodology for DSA with optical wafer inspection, based on defect source identification, sampling, and filtering methods, supporting the process development efficiency of DSA processes and tools.

Harukawa, Ryota; Aoki, Masami; Cross, Andrew; Nagaswami, Venkat; Kawakami, Shinichiro; Yamauchi, Takashi; Tomita, Tadatoshi; Nagahara, Seiji; Muramatsu, Makoto; Kitano, Takahiro

2014-04-01

193

SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES  

SciTech Connect

Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

Coutts, D

2007-04-17

194

77 FR 31600 - Federal Need Analysis Methodology for the 2013-2014 Award Year: Federal Pell Grant, Federal...  

Federal Register 2010, 2011, 2012, 2013, 2014

...DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2013-2014 Award Year: Federal...statutory ``Federal Need Analysis Methodology'' to determine a student's expected...tables used in the Federal Need Analysis Methodology EFC calculations. Section 478 of...

2012-05-29

195

[Analysis of radiation-hygienic and medical consequences of the Chernobyl accident].  

PubMed

More than 25 years have passed since the day of the Chernobyl accident in 1986. Fourteen subjects of the Russian Federation, with a total area of more than 50 thousand km2, where 1.5 million people now reside, were exposed to radioactive contamination. Currently, a system of comprehensive evaluation of the radiation doses of the population affected by the Chernobyl accident, including 11 guidance documents, has been created. Methodical provision is made for work on the assessment of average annual, accumulated and predicted radiation doses of the population and its critical groups, as well as doses to the thyroid gland. The relevance of the analysis of the consequences of the Chernobyl accident is demonstrated by the events in Japan at the Fukushima-1 nuclear power plant. In 2011-2012, comprehensive maritime expeditions were carried out under the auspices of the Russian Geographical Society, with the participation of relevant ministries and agencies and leading academic institutions in Russia. In 2012, work was carried out on radiation protection of the population from the potential transboundary impact of the accident at the Japanese nuclear power plant Fukushima-1. The results provide a basis for a favorable outlook for the radiation environment in the Russian Far East and on the Pacific coast of Russia. PMID:24340594

Onishchenko, G G

2013-01-01

196

Meta-analysis of the effect of road safety campaigns on accidents.  

PubMed

A meta-analysis of 67 studies evaluating the effect of road safety campaigns on accidents is reported. A total of 119 results were extracted from the studies, which were reported in 12 different countries between 1975 and 2007. After allowing for publication bias and heterogeneity of effects, the weighted average effect of road safety campaigns is a 9% reduction in accidents (with 95% confidence that the weighted average is between -12 and -6%). To account for the variability of effects measured across studies, data were collected to characterise aspects of the campaign and evaluation design associated with each effect, and analysed to identify a model of seven campaign factors for testing by meta-regression. The model was tested using both fixed and random effect meta-regression, and dependency among effects was accounted for by aggregation. These analyses suggest positive associations between accident reduction and the use of personal communication or roadside media as part of a campaign delivery strategy. Campaigns with a drink-driving theme were also associated with greater accident reductions, while some of the analyses suggested that accompanying enforcement and short campaign duration (less than one month) are beneficial. Overall the results are consistent with the idea that campaigns can be more effective in the short term if the message is delivered with personal communication in a way that is proximal in space and time to the behaviour targeted by the campaign. PMID:21376920
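
The pooling step in such a meta-analysis is typically a random-effects weighted average; below is a minimal DerSimonian-Laird sketch in which the three study effects and standard errors are fabricated for illustration, not taken from the 67 studies.

```python
# Random-effects pooling of study effects (DerSimonian-Laird estimator).
import numpy as np

def random_effects_pool(effects, se):
    w = 1.0 / se**2                                  # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)           # heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)    # between-study variance
    w_star = 1.0 / (se**2 + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    return pooled, np.sqrt(1.0 / np.sum(w_star))

# log accident-rate ratios, e.g. log(0.91) ~ a 9% reduction
effects = np.array([-0.09, -0.15, -0.02])
se = np.array([0.04, 0.06, 0.05])
pooled, pooled_se = random_effects_pool(effects, se)
print(f"pooled effect {pooled:.3f} (95% CI ±{1.96 * pooled_se:.3f})")
```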

Phillips, Ross Owen; Ulleberg, Pål; Vaa, Truls

2011-05-01

197

Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H  

SciTech Connect

This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

Mueller, C.; Nabelssi, B.; Roglans-Ribas, J. [and others

1995-04-01

198

Segment clustering methodology for unsupervised Holter recordings analysis  

NASA Astrophysics Data System (ADS)

Cardiac arrhythmia analysis on Holter recordings is an important issue in clinical settings; however, it implicitly involves attending to other problems related to the large amount of unlabelled data, which entails a high computational cost. In this work an unsupervised methodology based on a segment framework is presented, which consists of dividing the raw data into a balanced number of segments in order to identify fiducial points, then characterizing and clustering the heartbeats in each segment separately. The resulting clusters are merged or split according to an assumed criterion of homogeneity. This framework compensates for the high computational cost of Holter analysis, making its implementation possible for future real-time applications. The performance of the method is measured over the records from the MIT/BIH arrhythmia database and achieves high values of sensitivity and specificity, taking advantage of database labels, for the broad range of heartbeat types recommended by the AAMI.
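
A minimal sketch of the segment framework, under the assumption of synthetic per-beat feature vectors: cluster each segment independently, then merge centroids that fall within a homogeneity threshold. A real pipeline would first detect fiducial points and extract morphological features per heartbeat.

```python
# Segment-wise clustering with a centroid-merge step (illustrative data).
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(1)
beats = rng.normal(size=(6000, 8))                # stand-in per-beat features
segments = np.array_split(beats, 6)               # balanced segments

centroids = []
for seg in segments:
    c, _ = kmeans2(seg, k=3, seed=1, minit="++")  # cluster each segment alone
    centroids.extend(c)

# Merge step: collapse centroids closer than a homogeneity threshold.
merged = []
for c in centroids:
    if all(np.linalg.norm(c - m) > 1.0 for m in merged):
        merged.append(c)
print(f"{len(centroids)} per-segment clusters -> {len(merged)} global classes")
```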

Rodríguez-Sotelo, Jose Luis; Peluffo-Ordoñez, Diego; Castellanos Dominguez, German

2015-01-01

199

Methodology assessment and recommendations for the Mars science laboratory launch safety analysis.  

SciTech Connect

The Department of Energy has assigned to Sandia National Laboratories the responsibility of producing a Safety Analysis Report (SAR) for the plutonium-dioxide fueled Multi-Mission Radioisotope Thermoelectric Generator (MMRTG) proposed to be used in the Mars Science Laboratory (MSL) mission. The National Aeronautics and Space Administration (NASA) is anticipating a launch in fall of 2009, and the SAR will play a critical role in the launch approval process. As in past safety evaluations of MMRTG missions, a wide range of potential accident conditions differing widely in probability and severity must be considered, and the resulting risk to the public will be presented in the form of probability distribution functions of health effects in terms of latent cancer fatalities. The basic descriptions of accident cases will be provided by NASA in the MSL SAR Databook for the mission, and on the basis of these descriptions, Sandia will apply a variety of sophisticated computational simulation tools to evaluate the potential release of plutonium dioxide, its transport to human populations, and the consequent health effects. The first step in carrying out this project is to evaluate the existing computational analysis tools (computer codes) for suitability to the analysis and, when appropriate, to identify areas where modifications or improvements are warranted. The overall calculation of health risks can be divided into three levels of analysis. Level A involves detailed simulations of the interactions of the MMRTG or its components with the broad range of insults (e.g., shrapnel, blast waves, fires) posed by the various accident environments. There are a number of candidate codes for this level; they are typically high-resolution computational simulation tools that capture the details of each type of interaction and that can predict damage and plutonium dioxide release for a range of choices of controlling parameters. Level B utilizes these detailed results to study many thousands of possible event sequences and to build up a statistical representation of the releases for each accident case. A code to carry out this process will have to be developed or adapted from previous MMRTG missions. Finally, Level C translates the release (or "source term") information from Level B into public risk by applying models for atmospheric transport and the health consequences of exposure to the released plutonium dioxide. A number of candidate codes for this level of analysis are available. This report surveys the range of available codes and tools for each of these levels and makes recommendations for which choices are best for the MSL mission. It also identifies areas where improvements to the codes are needed. In some cases a second tier of codes may be identified to provide supporting or clarifying insight about particular issues. The main focus of the methodology assessment is to identify a suite of computational tools that can produce a high-quality SAR that can be successfully reviewed by external bodies (such as the Interagency Nuclear Safety Review Panel) on the schedule established by NASA and DOE.

Sturgis, Beverly Rainwater; Metzinger, Kurt Evan; Powers, Dana Auburn; Atcitty, Christopher B.; Robinson, David B; Hewson, John C.; Bixler, Nathan E.; Dodson, Brian W.; Potter, Donald L.; Kelly, John E.; MacLean, Heather J.; Bergeron, Kenneth Donald (Sala & Associates); Bessette, Gregory Carl; Lipinski, Ronald J.

2006-09-01

200

Framatome-ANP France UO{sub 2} fuel fabrication - criticality safety analysis in the light of the 1999 Tokai Mura accident  

SciTech Connect

In France, the 1999 Tokai Mura criticality accident in Japan had a big impact on the nuclear fuel manufacturing community. Moreover, this accident led to a large public discussion about all nuclear facilities. The French Safety Authorities made strong requirements for industry to completely revisit its safety analysis files, mainly those concerning nuclear fuel treatments. The production of the Framatome-ANP French low-enriched (5 w/o) UO{sub 2} fuel fabrication plant (FBFC/Romans) exceeds 1000 metric tons a year. Special attention was given to the emergency evacuation plan that should be followed in case of a criticality accident. If a criticality accident happens, site internal and external radioprotection requirements call for an emergency evacuation plan showing the different routes along which the absorbed doses will be as low as possible for people. The French Safety Authorities also require an update of the old neutron source term, accounting for state-of-the-art methodology. UO{sub 2} blender units contain a large amount of dry powder strictly controlled by moderation; a hypothetical water leakage inside one of these apparatuses is simulated by increasing the water content of the powder. The resulting reactivity insertion is evaluated by several static calculations. The French IRSN/CEA CRISTAL codes are used to perform these static calculations. The kinetic criticality code POWDER simulates the power excursion versus time and determines the consequent total energy source term. MCNP4B performs the source term propagation (including neutrons and gamma) used to determine the isodose curves needed to define the emergency evacuation plan. This paper deals with the approach Framatome-ANP has taken to address the Safety Authorities' demands using the most up-to-date calculation tools and methodology. (authors)

Doucet, M.; Zheng, S. [Framatome-ANP Fuel Technology Service (France); Mouton, J.; Porte, R. [Framatome-ANP Fuel Fabrication Plant - FBFC (France)

2004-07-01

201

Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D. [GRAM, Inc., Albuquerque, NM (United States); McKay, M.D. [Los Alamos National Lab., NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

202

Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model  

SciTech Connect

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Johnson, J.D.; Rollstin, J.A. [Gram, Inc., Albuquerque, NM (United States); Shiver, A.W.; Sprung, J.L. [Sandia National Labs., Albuquerque, NM (United States)

1995-01-01

203

Dislocation Dynamics using Anisotropic Elasticity: Methodology and Analysis  

SciTech Connect

A numerical methodology to incorporate anisotropic elasticity into three-dimensional dislocation dynamics codes has been developed, employing theorems derived by Lothe (1967), Brown (1967), Indenbom and Orlov (1968) and Asaro and Barnett (1976). The formalism is based on the stress field solution for a straight dislocation segment of arbitrary orientation in 3-dimensional space. The general solution is given in a complicated closed integral form. To reduce the computation complexity, look-up tables are used to avoid heavy computations for the evaluation of the angular stress factor ({Sigma}{sub ij}) and its first derivative term ({Sigma}'{sub ij}). The computation methodology and error analysis are discussed in comparison with known closed form solutions for isotropic elasticity. For the case of Mo single crystals, it is shown that the difference between anisotropic and isotropic elastic stress fields can be as high as 15% close to the dislocation line, and decreases significantly far away from it. This suggests that short-range interactions should be evaluated based on anisotropic elasticity, while long-range interactions can be approximated using isotropic elasticity.
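
As a rough illustration of the look-up table device mentioned above (a sketch under assumed forms; the stand-in function below is not the actual anisotropic solution), the angular factor can be tabulated once on a fine grid and interpolated at run time:

```python
# Minimal sketch of the look-up table idea: tabulate an expensive angular
# factor once on a fine grid, then interpolate at run time.
import numpy as np

def sigma_factor(theta):
    # Stand-in for the costly closed-integral evaluation of Sigma_ij(theta).
    return np.cos(2.0 * theta) + 0.1 * np.sin(6.0 * theta)

grid = np.linspace(0.0, np.pi, 2001)   # build the table once
table = sigma_factor(grid)

def sigma_lookup(theta):
    # Fast run-time evaluation by linear interpolation in the table.
    return np.interp(theta, grid, table)

# Error check against direct evaluation, mirroring the error-analysis step.
test = np.random.default_rng(0).uniform(0.0, np.pi, 1000)
err = np.max(np.abs(sigma_lookup(test) - sigma_factor(test)))
print(f"max interpolation error: {err:.2e}")
```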

Rhee, M; Stolken, J S; Bulatov, V V; Diaz de la Rubia, T; Zbib, H M; Hirth, J P

2000-06-15

204

Space Shuttle Columbia Post-Accident Analysis and Investigation  

NASA Technical Reports Server (NTRS)

Although the loss of the Space Shuttle Columbia and its crew was tragic, the circumstances offered a unique opportunity to examine a multitude of components which had experienced one of the harshest environments ever encountered by engineered materials: a breakup at a velocity in excess of Mach 18 and an altitude exceeding 200,000 feet (63 km), resulting in a debris field 645 miles (1,038 km) long and 10 miles (16 km) wide. Various analytical tools were employed to ascertain the sequence of events leading to the disintegration of the Orbiter and to characterize the features of the debris. The testing and analyses all indicated that a breach in a left wing reinforced carbon/carbon composite leading edge panel was the access point for hot gases generated during re-entry to penetrate the structure of the vehicle and compromise the integrity of the materials and components in that area of the Shuttle. The analytical and elemental testing utilized such techniques as X-Ray Diffraction (XRD), Energy Dispersive X-Ray (EDX) dot mapping, Electron Micro Probe Analysis (EMPA), and X-Ray Photoelectron Spectroscopy (XPS) to characterize the deposition of intermetallics adjacent to the suspected location of the plasma breach in the leading edge of the left wing, Fig. 1.

McDanels, Steven J.

2006-01-01

205

Computational methodology for ChIP-seq analysis  

PubMed Central

Chromatin immunoprecipitation coupled with massive parallel sequencing (ChIP-seq) is a powerful technology to identify the genome-wide locations of DNA binding proteins such as transcription factors or modified histones. As more and more experimental laboratories are adopting ChIP-seq to unravel the transcriptional and epigenetic regulatory mechanisms, computational analyses of ChIP-seq also become increasingly comprehensive and sophisticated. In this article, we review current computational methodology for ChIP-seq analysis, recommend useful algorithms and workflows, and introduce quality control measures at different analytical steps. We also discuss how ChIP-seq could be integrated with other types of genomic assays, such as gene expression profiling and genome-wide association studies, to provide a more comprehensive view of gene regulatory mechanisms in important physiological and pathological processes. PMID:25741452

Shin, Hyunjin; Liu, Tao; Duan, Xikun; Zhang, Yong; Liu, X. Shirley

2015-01-01

206

Speech analysis as an index of alcohol intoxication--the Exxon Valdez accident.  

PubMed

As part of its investigation of the EXXON VALDEZ tankship accident and oil spill, the National Transportation Safety Board (NTSB) examined the master's speech for alcohol-related effects. Recorded speech samples were obtained from marine radio communications tapes. The samples were tested for four effects associated with alcohol consumption in the available scientific literature: slowed speech, speech errors, misarticulation of difficult sounds ("slurring"), and audible changes in speech quality. It was found that speech immediately before and after the accident displayed large changes of the sort associated with alcohol consumption. These changes were not readily explained by fatigue, psychological stress, drug effects, or medical problems. Speech analysis appears to be a useful technique to provide secondary evidence of alcohol impairment. PMID:1930083
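
The speech-rate measure lends itself to a very simple calculation. A minimal sketch, with hypothetical syllable counts and durations (not the NTSB's data):

```python
# Minimal sketch of the "slowed speech" measure: compare speech rate
# (syllables per second) in matched pre- and post-accident samples.
pre = {"syllables": 164, "duration_s": 40.0}    # hypothetical baseline sample
post = {"syllables": 131, "duration_s": 40.0}   # hypothetical later sample

rate_pre = pre["syllables"] / pre["duration_s"]
rate_post = post["syllables"] / post["duration_s"]
change = 100.0 * (rate_post - rate_pre) / rate_pre

print(f"pre {rate_pre:.2f} syll/s, post {rate_post:.2f} syll/s "
      f"({change:+.0f}% change)")
```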

Brenner, M; Cash, J R

1991-09-01

207

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1998-04-01

208

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

209

Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands); Grupa, J.B. [Netherlands Energy Research Foundation (Netherlands)

1997-12-01

210

Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11--14)  

SciTech Connect

This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

Whitehead, D. [Sandia National Labs., Albuquerque, NM (United States); Darby, J. [Science and Engineering Associates, Inc., Albuquerque, NM (United States); Yakle, J. [Science Applications International Corp., Albuquerque, NM (United States)] [and others

1994-06-01

211

Analysis of pedestrian accident costs in Sudan using the willingness-to-pay method.  

PubMed

The willingness-to-pay (WTP) with contingent valuation (CV) method has been proven to be a valid tool for the valuation of non-market goods or socio-economic costs of road traffic accidents among communities in developed and developing countries. Research on accident costing tends to estimate the value of statistical life (VOSL) for all road users by providing a principle for the evaluation of road safety interventions in cost-benefit analysis. As in many other developing countries, the economic loss of traffic accidents in Sudan is noticeable; however, analytical research to estimate the magnitude and impact of that loss is lacking. Reports have shown that pedestrians account for more than 40% of the total number of fatalities. In this study, the WTP-CV approach was used to determine the amount of money that pedestrians in Sudan are willing to pay to reduce the risk of their own death. The impact of the socioeconomic factors, risk levels, and walking behaviors of pedestrians on their WTP for fatality risk reduction was also evaluated. Data were collected from two cities (Khartoum and Nyala) using a survey questionnaire that included 1400 respondents. The WTP-CV Payment Card Questionnaire was designed to ensure that Sudanese pedestrians can easily determine the amount of money that would be required to reduce the fatality risk from a pedestrian-related accident. The analysis results show that the estimated VOSL for Sudanese pedestrians ranges from US$0.019 to US$0.101 million. In addition, the willingness-to-pay by Sudanese pedestrians to reduce their fatality risk tends to increase with age, household income, educational level, safety perception, and average time spent on social activities with family and community. PMID:25794921
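
The underlying VOSL arithmetic is simple: the value of statistical life is the mean stated WTP divided by the risk reduction it purchases. A minimal sketch with illustrative figures, not the survey's:

```python
# Minimal sketch of the WTP-to-VOSL step. The numbers are hypothetical;
# the result happens to fall within the range reported in the abstract.
mean_wtp_usd = 15.0        # mean annual WTP for the offered risk reduction
risk_reduction = 3e-4      # reduction in annual fatality risk purchased

vosl = mean_wtp_usd / risk_reduction
print(f"VOSL ~ US${vosl:,.0f}")   # -> US$50,000
```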

Mofadal, Adam I A; Kanitpong, Kunnawee; Jiwattanakulpaisarn, Piyapong

2015-05-01

212

Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)  

SciTech Connect

The purposes of this volume of the LWRHU SAR, the Accident Model Document (AMD), are to: identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

Johnson, E.W.

1988-10-01

213

Verification of fire and explosion accident analysis codes (facility design and preliminary results)  

SciTech Connect

For several years, the US Nuclear Regulatory Commission has sponsored the development of methods for improving capabilities to analyze the effects of postulated accidents in nuclear facilities; the accidents of interest are those that could occur during nuclear materials handling. At the Los Alamos National Laboratory, this program has resulted in three computer codes: FIRAC, EXPAC, and TORAC. These codes are designed to predict the effects of fires, explosions, and tornadoes in nuclear facilities. Particular emphasis is placed on the movement of airborne radioactive material through the gaseous effluent treatment system of a nuclear installation. The design, construction, and calibration of an experimental ventilation system to verify the fire and explosion accident analysis codes are described. The facility features a large industrial heater and several aerosol smoke generators that are used to simulate fires. Both injected thermal energy and aerosol mass can be controlled using this equipment. Explosions are simulated with H/sub 2//O/sub 2/ balloons and small explosive charges. Experimental measurements of temperature, energy, aerosol release rates, smoke concentration, and mass accumulation on HEPA filters can be made. Volumetric flow rate and differential pressures also are monitored. The initial experiments involve varying parameters such as thermal and aerosol rate and ventilation flow rate. FIRAC prediction results are presented. 10 figs.

Gregory, W.S.; Nichols, B.D.; Talbott, D.V.; Smith, P.R.; Fenton, D.L.

1985-01-01

214

Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications  

SciTech Connect

Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

VINCENT, ANDREW

2005-04-25

215

Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident  

SciTech Connect

An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.

Aldrich, D.C.; Blond, R.M.

1980-01-01

216

Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.  

PubMed

The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. PMID:23973170

Underwood, Peter; Waterson, Patrick

2014-07-01

217

ADHD and relative risk of accidents in road traffic: a meta-analysis.  

PubMed

The present meta-analysis is based on 16 studies comprising 32 results. These studies provide sufficient data to estimate relative accident risks of drivers with ADHD. The overall estimate of relative risk for drivers with ADHD is 1.36 (95% CI: 1.18; 1.57) without control for exposure, 1.29 (1.12; 1.49) when correcting for publication bias, and 1.23 (1.04; 1.46) when controlling for exposure. A relative risk (RR) of 1.23 is exactly the same as found for drivers with cardiovascular diseases. The long-lasting assertion that "ADHD-drivers have an almost fourfold risk of accident compared to non-ADHD-drivers", which originated from Barkley et al.'s study of 1993, is rebutted. That estimate was associated with comorbid Oppositional Defiant Disorder (ODD) and/or Conduct Disorder (CD), not with ADHD, but the assertion has incorrectly been maintained for two decades. The present study provides some support for the hypothesis that the relative accident risk of ADHD-drivers with comorbid ODD, CD and/or other conduct problems, is higher than that of ADHD-drivers without these comorbidities. The estimated RRs were 1.86 (1.27; 2.75) in a sample of ADHD-drivers in which a majority had comorbid ODD and/or CD compared to 1.31 (0.96; 1.81) in a sample of ADHD-drivers with no comorbidity. Given that ADHD-drivers most often seem to drive more than controls, and the fact that a majority of the present studies lack information about exposure, it seems more probable that the true RR is lower rather than higher than 1.23. Also the assertion that ADHD-drivers violate traffic laws more often than other drivers should be modified: ADHD-drivers do have more speeding violations, but no more drunk or reckless driving citations than drivers without ADHD. All accident studies included in the meta-analysis fail to acknowledge the distinction between deliberate violations and driving errors. The former are known to be associated with accidents, the latter are not. A hypothesis that ADHD-drivers speed more frequently than controls because it stimulates attention and reaction time is suggested. PMID:24238842
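
A minimal sketch of the inverse-variance pooling that underlies estimates of this kind, with made-up study values rather than the 16 reviewed studies:

```python
# Minimal sketch of fixed-effect meta-analysis: pool log relative risks
# weighted by inverse variance, recovering the SE from each 95% CI.
import math

studies = [(1.5, 1.1, 2.0), (1.2, 0.9, 1.6), (1.4, 1.0, 1.9)]  # (RR, lo, hi)

wsum = wlsum = 0.0
for rr, lo, hi in studies:
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
    w = 1.0 / se**2                                  # inverse-variance weight
    wsum += w
    wlsum += w * math.log(rr)

mean_log = wlsum / wsum
se_pooled = math.sqrt(1.0 / wsum)
print(f"pooled RR = {math.exp(mean_log):.2f} "
      f"(95% CI {math.exp(mean_log - 1.96 * se_pooled):.2f}; "
      f"{math.exp(mean_log + 1.96 * se_pooled):.2f})")
```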

Vaa, Truls

2014-01-01

218

14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures  

Code of Federal Regulations, 2014 CFR

...used in the toxic risk analysis and a description...the process hazards analysis required by paragraph...or the statistical risk management requirements...methodology provided in the Risk Management Plan...Offsite Consequence Analysis Guidance,...

2014-01-01

219

14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures  

Code of Federal Regulations, 2013 CFR

...used in the toxic risk analysis and a description...the process hazards analysis required by paragraph...or the statistical risk management requirements...methodology provided in the Risk Management Plan...Offsite Consequence Analysis Guidance,...

2013-01-01

220

Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document  

SciTech Connect

The purposes of this volume (AMD) are to: identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

Johnson, E.W.

1985-10-01

221

Differences in rural and urban driver-injury severities in accidents involving large-trucks: An exploratory analysis  

Microsoft Academic Search

This study explores the differences between urban and rural driver injuries (both passenger-vehicle and large-truck driver injuries) in accidents that involve large trucks (in excess of 10,000 pounds). Using 4 years of California accident data, and considering four driver-injury severity categories (no injury, complaint of pain, visible injury, and severe/fatal injury), a multinomial logit analysis of the data was conducted.
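
A minimal sketch of the multinomial logit setup on synthetic records; the four severity categories follow the study, but the covariates and data are illustrative assumptions:

```python
# Minimal sketch: fit a multinomial logit of injury severity on accident
# covariates using statsmodels. Data are synthetic, for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "rural": rng.integers(0, 2, n),          # 1 = rural accident location
    "truck_driver": rng.integers(0, 2, n),   # 1 = large-truck driver
    "speed_limit": rng.choice([35, 55, 65], n),
})
sev = rng.choice(["no_injury", "pain", "visible", "severe_fatal"], n,
                 p=[0.5, 0.25, 0.15, 0.1])   # synthetic outcome

X = sm.add_constant(df)
model = sm.MNLogit(pd.Categorical(sev).codes, X).fit(disp=False)
print(model.summary())
```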

Ahmad Khorashadi; Debbie Niemeier; Venky Shankar; Fred Mannering

2005-01-01

222

The Analysis of PWR SBO Accident with RELAP5 Based on Linux  

NASA Astrophysics Data System (ADS)

RELAP5 is a relatively advanced code for transient hydraulic and thermal analysis of light water reactors, and the safety analysis and operating simulation performed with it are significant for the safe operation of nuclear reactor systems. This paper presents a RELAP5 operating mode based on the Linux operating system, which uses Linux's powerful text-processing capabilities to extract the valid data directly from RELAP5 output files, and its scripting capabilities to improve the plotting functions of RELAP5. Operating in Linux, the precision of the calculated results is preserved and the computing time is shortened. In this work, a PWR Station Blackout (SBO) accident was calculated with RELAP5 under both Linux and Windows. Comparison and analysis of the accident response curves of the main parameters, such as reactor power and the average temperature and pressure of the primary loop, show that operating analysis of the nuclear reactor system with RELAP5 based on Linux is safe and reliable.
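
A minimal sketch of the Linux-side post-processing step described above, assuming a hypothetical whitespace-delimited output layout (real RELAP5 output formats vary by code version and input deck):

```python
# Minimal sketch: extract a parameter's time history from a plain-text
# output file and plot it. File name and column layout are hypothetical.
import matplotlib.pyplot as plt

times, pressures = [], []
with open("relap5_output.txt") as f:          # hypothetical file name
    for line in f:
        parts = line.split()
        # Assume data rows carry: time(s), power(W), T_avg(K), pressure(Pa);
        # the isdigit check is a simple heuristic to skip header lines.
        if len(parts) == 4 and parts[0].replace(".", "", 1).isdigit():
            times.append(float(parts[0]))
            pressures.append(float(parts[3]))

plt.plot(times, pressures)
plt.xlabel("time (s)")
plt.ylabel("primary loop pressure (Pa)")
plt.title("SBO transient response extracted from RELAP5 output")
plt.savefig("sbo_pressure.png")
```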

Xia, Zhimin; Zhang, Dafa

223

Methods for Detector Placement and Analysis of Criticality Accident Alarm Systems  

SciTech Connect

Determining the optimum placement to minimize the number of detectors for a criticality accident alarm system (CAAS) in a large manufacturing facility is a complex problem. There is typically a target for the number of detectors that can be used over a given zone of the facility. A study to optimize detector placement typically begins with some initial guess at the placement of the detectors and is followed by either predictive calculations of accidents at specific locations or adjoint calculations based on preferred detector locations. Within an area of a facility, there may be a large number of potential criticality accident sites. For any given placement of the detectors, the list of accident sites can be reduced to a smaller number of locations at which accidents may be difficult for detectors to detect. Developing the initial detector placement and determining the list of difficult accident locations are both based on the practitioner's experience. Simulations following fission particles released from an accident location are called 'forward calculations.' These calculations can be used to answer, for an accident at a specified location, the question 'where would an alarm be triggered?' Conversely, 'adjoint calculations' start at a detector site using the detector response function as a source and essentially run in reverse. These calculations can be used to answer, for a specified detector location, the question 'where would an accident be detected?' If the number of accidents, P, is much less than the number of detectors, Q, then forward simulations may be more convenient and less time-consuming. If Q is large or the detectors are not placed yet, then a mesh tally of dose observed by a detector at any location must be computed over the entire zone. If Q is much less than P, then adjoint calculations may be more efficient. Adjoint calculations employing a mesh tally can be even more advantageous because they do not rely on a list of specific difficult-to-detect accident sites, which may not have included every possible accident location. Analog calculations (no biasing) simply follow particles naturally. For sparse buildings and line-of-sight calculations, analog Monte Carlo (MC) may be adequate. For buildings with internal walls or large amounts of heavy equipment (dense geometry), variance reduction may be required. Calculations employing the CADIS method use a deterministic calculation to create an importance map and a matching biased source distribution that optimize the final MC to quickly calculate one specific tally. Calculations employing the FW-CADIS method use two deterministic calculations (one forward and one adjoint) to create an importance map and a matching biased source distribution that are designed to make the MC calculate a mesh tally with more uniform uncertainties in both high-dose and low-dose areas. Depending on the geometry of the problem, the number of detectors, and the number of accident sites, different approaches to CAAS placement studies can be taken. These are summarized in Table I. SCALE 6.1 contains the MAVRIC sequence, which can be used to perform any of the forward-based approaches outlined in Table I. For analog calculations, MAVRIC simply calls the Monaco MC code. For CADIS and FW-CADIS, MAVRIC uses the Denovo discrete ordinates (SN) deterministic code to generate the importance map and biased source used by Monaco. An adjoint capability is currently being added to Monaco and should be available in the next release of SCALE.
An adjoint-based approach could be performed with Denovo alone, although fine meshes, large amounts of memory, and long computation times may be required to obtain accurate solutions. Coarse-mesh SN simulations could be employed for adjoint-based scoping studies until the adjoint capability in Monaco is complete. CAAS placement studies, especially those dealing with mesh tallies, require some extra utilities to aid in the analysis. Detectors must receive a minimum dose rate in order to alarm; therefore, a simple yes/no plot could be more useful to the analyst.
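
The forward-versus-adjoint choice described above reduces to comparing the number of candidate accident sites P against the number of detectors Q. A minimal sketch of that selection logic, with illustrative thresholds that are judgment calls, not values from the paper:

```python
# Minimal sketch of the strategy choice for a CAAS placement study,
# following the P-versus-Q reasoning in the abstract.
def caas_strategy(p_accident_sites: int, q_detectors: int,
                  detectors_placed: bool) -> str:
    if p_accident_sites <= q_detectors // 4:
        # Few accidents, many detectors: one forward run per accident;
        # a mesh tally covers candidate detector locations if unplaced.
        return "forward" if detectors_placed else "forward + mesh tally"
    if q_detectors <= p_accident_sites // 4:
        # Few detectors, many accident sites: adjoint runs from each
        # detector answer "where would an accident be detected?"
        return "adjoint (mesh tally covers unlisted accident sites)"
    return "either; compare cost of P forward vs Q adjoint runs"

print(caas_strategy(p_accident_sites=50, q_detectors=4,
                    detectors_placed=True))
```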

Peplow, Douglas E. [ORNL]; Wetzel, Larry [Babcock & Wilcox Nuclear Operations Group Inc.]

2012-01-01

224

A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?  

NASA Technical Reports Server (NTRS)

In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports, and compares the method of analysis proposed therein to the methods used today.

Holloway, C. M.; Johnson, C. W.

2007-01-01

225

Why-Because Analysis (WBA): An accident investigation attempts to determine the network of events and circumstances  

E-print Network

An analysis of WBA has been performed, and a procedural guide is available as a series of flowcharts. WBA has been adopted as a company-wide required procedure for the analysis of product defects by Siemens.

Moeller, Ralf

226

Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident  

SciTech Connect

The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

Pasichnyk, I.; Perin, Y.; Velkov, K. [Gesellschaft fuer Anlagen- und Reaktorsicherheit - GRS mbH, Boltzmannstrasse 14, 85748 Garching bei Muenchen (Germany)

2013-07-01

227

Methodology for error analysis and simulation of lidar aerosol measurements  

Microsoft Academic Search

We present a methodology for objective and automated determination of the uncertainty in aerosol measurements made by lidar. The methodology is based on standard error-propagation procedures, a large data base on atmospheric behavior, and considerable experience in processing lidar data. It yields algebraic expressions for probable error as a function of the atmospheric, background lighting, and lidar parameters.
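
A minimal sketch of the standard error-propagation step such a methodology builds on, using a stand-in retrieval function rather than the paper's lidar model:

```python
# Minimal sketch of first-order error propagation: the probable error of
# f(x1..xn) with independent uncertainties is the quadrature sum of the
# partial-derivative terms, here evaluated numerically.
import numpy as np

def propagate(f, x, sigma, h=1e-6):
    """First-order error propagation using numerical partial derivatives."""
    x = np.asarray(x, dtype=float)
    var = 0.0
    for i in range(len(x)):
        dx = np.zeros_like(x)
        dx[i] = h * max(abs(x[i]), 1e-12)   # relative step per variable
        dfdx = (f(x + dx) - f(x - dx)) / (2.0 * dx[i])
        var += (dfdx * sigma[i]) ** 2
    return np.sqrt(var)

# Stand-in quantity: scattering ratio R = total / molecular backscatter.
f = lambda x: x[0] / x[1]
x0 = [1.4e-6, 1.0e-6]      # measured values
sig = [0.1e-6, 0.03e-6]    # their one-sigma uncertainties
print(f"R = {f(np.asarray(x0)):.2f} +/- {propagate(f, x0, sig):.2f}")
```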

Philip B. Russell; Thomas J. Swissler; M. Patrick McCormick

228

An Analysis of "HCR"'s Theoretical and Methodological Evolution.  

ERIC Educational Resources Information Center

Traces theoretical and methodological contributions this journal has made to the field of communication. Finds that the journal has matured both theoretically and methodologically: 59% of the articles in the first 24 years had a theory or model driving the research; the journal remains almost exclusively quantitative; and no single theory or…

Violanti, Michelle T.

1999-01-01

229

Hazardous materials transportation: a risk-analysis-based routing methodology.  

PubMed

This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed by nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After short discussion about risk measures suitable for linear risk sources, the arc capacities are introduced by comparison between the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; then arc costs are defined in order to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining for a specific hazardous substance the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented on the computer code OPTIPATH, are presented. Test results about shipments of ammonia are discussed and finally further research developments are proposed. PMID:10677666
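
A minimal sketch of the minimum-cost-flow formulation using networkx; the network, capacities, and costs below are illustrative, whereas in the methodology arc capacities come from comparing societal/individual risk with criteria and arc costs combine transport and risk-related costs:

```python
# Minimal sketch: route a hazardous-substance flow at minimum cost
# subject to per-arc vehicle capacities.
import networkx as nx

G = nx.DiGraph()
# Negative demand = origin (supply); positive = destination.
G.add_node("plant", demand=-10)       # 10 vehicle-loads to ship
G.add_node("customer", demand=10)

# capacity = max vehicles allowed by the risk criteria on that arc;
# weight = cost per vehicle (transport plus risk-related).
G.add_edge("plant", "junctionA", capacity=6, weight=4)
G.add_edge("plant", "junctionB", capacity=8, weight=6)
G.add_edge("junctionA", "customer", capacity=6, weight=3)
G.add_edge("junctionB", "customer", capacity=8, weight=2)

flow = nx.min_cost_flow(G)
for u, targets in flow.items():
    for v, f in targets.items():
        if f:
            print(f"{u} -> {v}: {f} vehicles")
print("total cost:", nx.cost_of_flow(G, flow))
```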

Leonelli, P; Bonvicini, S; Spadoni, G

2000-01-01

230

A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test  

NASA Astrophysics Data System (ADS)

According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. Reviewing the literature of the offshore drilling industry indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Meanwhile, results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, as an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology in this research focuses on a specific procedure called the Negative Pressure Test (NPT), as the primary method to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and its NPT is discussed. The risk analysis methodology in this dissertation consists of three different approaches, and their integration constitutes the big picture of my whole methodology. The first approach is the comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the discrepancies between the two test procedures. The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the developed conceptual framework and to analyze the impact of different decision making biases on negative pressure test results. Along with the corroborating findings of previous studies, the analysis of the developed conceptual framework indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. It is noteworthy that the captured organizational factors in the introduced conceptual framework are not specific to the scope of the NPT alone. Most of these organizational factors have been identified as common contributing causes not only of other offshore drilling accidents but also of accidents in other oil and gas related operations, as well as high-risk operations in other industries. In addition, the proposed rational decision making model in this research introduces a quantitative structure for analysis of the results of a conducted NPT. This model provides a structure and some parametric derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test.
Moreover, it enables analysts to assess different decision making biases involved in the process of interpreting a conducted negative pressure test as well as the root organizational factors of those biases. In general, although the proposed integrated research methodology in this dissertation is developed for the risk assessment of human and organizational factors contributions in negative pressure test misinterpretation, it can be generalized and be potentially useful for other well control situations, both offshore and onshore; e.g. fracking. In addition, this methodology can be applied for the analysis

Tabibzadeh, Maryam

231

NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug  

NASA Technical Reports Server (NTRS)

A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a cleavage-type failure. For the accident case, the predicted failure load for the right rear lug from the PFA is greater than 1.98 times the limit load of the lugs.

Raju, Ivatury S; Glaessgen, Edward H.; Mason, Brian H; Krishnamurthy, Thiagarajan; Davila, Carlos G

2005-01-01

232

Bayesian data analysis of severe fatal accident risk in the oil chain.  

PubMed

We analyze the risk of severe fatal accidents causing five or more fatalities for nine different activities covering the entire oil chain. Included are exploration and extraction, transport by different modes, refining and final end use in power plants, heating or gas stations. The risks are quantified separately for OECD and non-OECD countries and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model yielding analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto) as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data resulting in high uncertainties, in particular for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation. In addition, it also inherently delivers a measure of uncertainty. This approach provides a framework, which comprehensively covers risk throughout the oil chain, allowing the allocation of risk in sustainability assessments. It also permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis. PMID:22642363
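
A minimal sketch of the frequency/severity decomposition (Poisson counts, Generalized Pareto severities) on synthetic data; the paper fits these within a Bayesian hierarchical model, whereas this sketch uses simple maximum-likelihood estimates:

```python
# Minimal sketch: Poisson frequency of severe (5+ fatality) accidents
# and a Generalized Pareto tail for severity, fit to synthetic data.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
years = 30
counts = rng.poisson(2.1, size=years)             # accidents per year
fatalities = 5 + genpareto.rvs(0.3, scale=12, size=counts.sum(),
                               random_state=rng)  # severities (>= 5)

lam = counts.mean()                                # Poisson MLE
c, loc, scale = genpareto.fit(fatalities, floc=5)  # GPD above threshold 5

# Return level: severity exceeded on average once per 100 accidents.
x100 = genpareto.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(f"rate = {lam:.2f} accidents/yr; shape = {c:.2f}; "
      f"1-in-100-accident severity ~ {x100:.0f} fatalities")
```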

Eckle, Petrissa; Burgherr, Peter

2013-01-01

233

Development of integrated core disruptive accident analysis code for FBR - ASTERIA-FBR  

SciTech Connect

The evaluation of the consequences of a severe accident is the most important safety licensing issue for the core of a liquid metal cooled fast breeder reactor (LMFBR), since the LMFBR core is not in its most reactive configuration; a change in core geometry during a core disruptive accident (CDA) might therefore induce a super-prompt criticality. Previous CDA analysis codes have been modeled in separate phases according to the mechanism driving a super-prompt criticality, and the subsequent events are calculated by connecting different codes. This scheme, however, introduces uncertainty and/or arbitrariness into the calculation results. To resolve these issues and obtain consistent calculation results without arbitrariness, JNES is developing the ASTERIA-FBR code for the purpose of providing the cross-check analysis code, which is another required element to confirm the validity of the evaluation results prepared by applicants, in the safety licensing procedure for the planned high performance core of Monju. ASTERIA-FBR consists of three major calculation modules: CONCORD, dynamic-GMVP, and FEMAXI-FBR. CONCORD is a three-dimensional thermal-hydraulics calculation module with a multi-phase, multi-component, and multi-velocity field model. Dynamic-GMVP is a space-time neutronics calculation module. FEMAXI-FBR calculates the fuel pellet deformation behavior and fuel pin failure behavior. This paper describes the needs behind ASTERIA-FBR development, outlines the major modules, and reports the model validation status. (authors)

Ishizu, T.; Endo, H.; Tatewaki, I.; Yamamoto, T. [Japan Nuclear Energy Safety Organization JNES, Toranomon Towers Office, 4-1-28, Toranomon, Minato-ku, Tokyo (Japan); Shirakawa, N. [Inst. of Applied Energy IAE, Shimbashi SY Bldg., 14-2 Nishi-Shimbashi 1-Chome, Minato-ku, Tokyo (Japan)

2012-07-01

234

Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.  

PubMed

Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on a single injury variable at a time, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), in relation to various traffic accident factors; such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between the categorical injury phenomena, such as the injury scale, injury position, and injury type, and the various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with more severe than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. PMID:21094332
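
Quantification method II assigns numerical scores to categorical factors so as to best discriminate outcome groups, which is closely related to discriminant analysis on dummy-coded predictors. A minimal sketch in that spirit, on toy records rather than NASS/CDS data:

```python
# Toy illustration: dummy-code categorical accident factors and inspect
# discriminant weights for a binary severe/minor outcome.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

records = pd.DataFrame({
    "impact":  ["narrow", "wide", "narrow", "wide",
                "narrow", "wide", "narrow", "wide"],
    "airbag":  ["none", "deployed", "deployed", "none",
                "none", "deployed", "deployed", "none"],
    "ejected": ["yes", "no", "no", "no", "yes", "no", "no", "yes"],
    "injury":  ["severe", "minor", "minor", "minor",
                "severe", "minor", "severe", "severe"],
})

X = pd.get_dummies(records[["impact", "airbag", "ejected"]], drop_first=True)
lda = LinearDiscriminantAnalysis().fit(X, records["injury"])

# Positive weights push toward the second class in lda.classes_ (here
# 'severe'); they play the role of category scores.
for name, w in zip(X.columns, lda.coef_[0]):
    print(f"{name:14s} {w:+.2f}")
```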

Ju, Yong Han; Sohn, So Young

2011-01-01

235

Process hazards analysis (PrHA) program, bridging accident analyses and operational safety  

SciTech Connect

Recently the Final Safety Analysis Report (FSAR) for the Plutonium Facility at Los Alamos National Laboratory, Technical Area 55 (TA-55) was revised and submitted to the U.S. Department of Energy (DOE). As a part of this effort, over seventy Process Hazards Analyses (PrHAs) were written and/or revised over the six years prior to the FSAR revision. TA-55 is a research, development, and production nuclear facility that primarily supports U.S. defense and space programs. Nuclear fuels and material research; material recovery, refining and analyses; and the casting, machining and fabrication of plutonium components are some of the activities conducted at TA-55. These operations involve a wide variety of industrial, chemical and nuclear hazards. Operational personnel along with safety analysts work as a team to prepare the PrHA. PrHAs describe the process; identify the hazards; and analyze hazards including determining hazard scenarios, their likelihood, and consequences. In addition, the interaction of the process with facility systems, structures and operation-specific protective features is part of the PrHA. This information is rolled up to determine bounding accidents and mitigating systems and structures. Further detailed accident analysis is performed for the bounding accidents and included in the FSAR. The FSAR is part of the Documented Safety Analysis (DSA) that defines the safety envelope for all facility operations in order to protect the worker, the public, and the environment. The DSA is in compliance with the U.S. Code of Federal Regulations, 10 CFR 830, Nuclear Safety Management, and is approved by DOE. The DSA sets forth the bounding conditions necessary for the safe operation of the facility and is essentially a 'license to operate.' Safety of day-to-day operations is based on Hazard Control Plans (HCPs). Hazards are initially identified in the PrHA for the specific operation and act as input to the HCP. Specific protective features important to worker safety are incorporated so the worker can readily identify the safety parameters of their work. System safety tools such as Preliminary Hazard Analysis, What-If Analysis, and Hazard and Operability Analysis, as well as other techniques as necessary, provide the groundwork for determining bounding conditions for facility safety, operational safety, and day-to-day worker safety.

Richardson, J. A. (Jeanne A.); McKernan, S. A. (Stuart A.); Vigil, M. J. (Michael J.)

2003-01-01

236

Asset Analysis of Risk Assessment for IEC 61850-Based Power Control Systems—Part I: Methodology  

Microsoft Academic Search

Information security risk assessment of IEC 61850-based power control systems is currently an unsolved problem. One of the reasons is a lack of methodology for asset analysis, which is an important process of risk assessment. As the features of IEC 61850-based power control systems are different from general IT systems, a specific methodology of asset analysis is introduced. Based on

Nian Liu; Jianhua Zhang; Xu Wu

2011-01-01

237

Industrial accidents triggered by earthquakes, floods and lightning: lessons learned from a database analysis  

Microsoft Academic Search

Natural hazards and disasters can cause major accidents in chemical and process installations. These so-called Natech accidents can result in hazardous-materials releases due to damage to process and storage units, or pipes. In order to understand the dynamics of Natech events, accidents triggered by earthquakes, floods and lightning recorded in industrial accident databases were analysed. This allowed the identification of

Elisabeth Krausmann; Elisabetta Renni; Michela Campedel; Valerio Cozzani

238

Effects of improved modeling on best estimate BWR severe accident analysis  

SciTech Connect

Since 1981, ORNL has completed best estimate studies analyzing several dominant BWR accident scenarios. These scenarios were identified by early Probabilistic Risk Assessment (PRA) studies and detailed ORNL analysis complements such studies. In performing these studies, ORNL has used the MARCH code extensively. ORNL investigators have identified several deficiencies in early versions of MARCH with regard to BWR modeling. Some of these deficiencies appear to have been remedied by the most recent release of the code. It is the purpose of this paper to identify several of these deficiencies. All the information presented concerns the degraded core thermal/hydraulic analysis associated with each of the ORNL studies. This includes calculations of the containment response. The period of interest is from the time of permanent core uncovery to the end of the transient. Specific objectives include the determination of the extent of core damage and timing of major events (i.e., onset of Zr/H/sub 2/O reaction, initial clad/fuel melting, loss of control blade structure, etc.). As mentioned previously the major analysis tool used thus far was derived from an early version of MARCH. BWRs have unique features which must be modeled for best estimate severe accident analysis. ORNL has developed and incorporated into its version of MARCH several improved models. These include (1) channel boxes and control blades, (2) SRV actuations, (3) vessel water level, (4) multi-node analysis of in-vessel water inventory, (5) comprehensive hydrogen and water properties package, (6) first order correction to the ideal gas law, and (7) separation of fuel and cladding. Ongoing and future modeling efforts are required. These include (1) detailed modeling for the pressure suppression pool, (2) incorporation of B/sub 4/C/steam reaction models, (3) phenomenological model of corium mass transport, and (4) advanced corium/concrete interaction modeling. 10 references, 17 figures, 1 table.

Hyman, C.R.; Ott, L.J.

1984-01-01

239

Occupational vehicular accident claims: A workers’ compensation analysis of Oregon truck drivers 1990–1997  

Microsoft Academic Search

This study used workers’ compensation data from Oregon from 1990 to 1997 to examine workers’ compensation claims from vehicular accidents by truck drivers, and to calculate claim rate estimates using baseline data derived from the US Bureau of Census’ Current Population Surveys. During this period, 1168 valid injury claims due to vehicular accidents were filed representing an accident claim rate

Brian P. McCall; Irwin B. Horwitz

2005-01-01

240

Simulation and Analysis of Typical Winter Road Traffic Accidents in China  

Microsoft Academic Search

There are many winter road traffic accidents in China, and the death toll has long been among the highest in the world; traffic accidents under ice and snow road conditions have special characteristics. This article takes typical traffic accident cases on ice and snow roads as examples and carries out accident reconstruction.

Ding Tongqiang; Han Xiufeng; Zheng Lili

2010-01-01

241

A Content Analysis of News Media Coverage of the Accident at Three Mile Island.  

ERIC Educational Resources Information Center

A study was conducted for the President's Commission on the Accident at Three Mile Island to analyze coverage of the accident by ten news organizations: two wire services, three commercial television networks, and five daily newspapers. Copies of all stories and transcripts of news programs during the first week of the accident were examined from…

Stephens, Mitchell; Edison, Nadyne G.

242

Analysis of the SL-1 Accident Using RELAP5-3D  

SciTech Connect

On January 3, 1961, at the National Reactor Testing Station, in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people, and destroying the reactor core. The SL-1 reactor, a 3 MW{sub t} boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are only known with a high level of uncertainty).

Francisco, A.D. and Tomlinson, E. T.

2007-11-08

243

Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks  

SciTech Connect

Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, so many effective exploits and tools are readily accessible to anyone with an Internet connection, minimal technical skills, and a modest motivational threshold that the field of potential adversaries can no longer be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, both black hat and white hat. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.

Bri Rolston

2005-06-01

244

Methodologies for analysis of patterning in the mouse RPE sheet  

PubMed Central

Purpose: Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images.

Methods: The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis.

Results: A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50%) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one, adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. To validate the software, human- and computer-analyzed results were compared: whether tallied manually or automatically, the resulting cell measurements were in close agreement. We compared normal with diseased RPE cells during aging using quantitative cell size and shape metrics. Subtle differences between the RPE sheet characteristics of young and old mice were identified. The IRBP-/- mouse RPE sheet did not differ from C57BL/6J (wild type, WT), suggesting that IRBP does not play a direct role in maintaining the health of the RPE cell, while the slow loss of photoreceptor (PhR) cells previously established in this knockout does support a role in the maintenance of PhR cells. Rd8 mice exhibited several measurable changes in patterns of RPE cells compared to WT, suggesting a slow degeneration of the RPE sheet that had not previously been noticed in rd8.

Conclusions: An optimized dissection method and a series of programs were used to establish a rapid, hands-off analysis. The software-aided, high-sampling-size approach performed as well as trained human scorers but was considerably faster and easier. This method allows tens to hundreds of thousands of cells to be analyzed, each with 23 metrics. With this combination of dissection and image analysis of the RPE sheet, we can now analyze cell-to-cell interactions of immediate neighbors. In the future, we may be able to observe interactions of second, third, or higher ring neighbors and analyze tension in sheets, which might be expected to deviate from normal near large bumps in the RPE sheet caused by drusen, or when large frank holes in the RPE sheet are observed in geographic atrophy. This method and software can be readily applied to other aspects of vision science, neuroscience, and epithelial biology where patterns may exist in a sheet or surface of cells. PMID:25593512
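Per-cell shape metrics of the kind described above can be reproduced in outline with standard open-source tools. The sketch below is an illustration only, not the authors' CellProfiler/Matlab pipeline; the random `mask` stands in for a real binary segmentation in which ZO-1 boundaries are marked False.

```python
# Sketch: per-cell shape metrics from a labeled segmentation, in the
# spirit of the paper's 23-metric analysis (illustration only).
import numpy as np
from skimage.measure import label, regionprops

rng = np.random.default_rng(0)
mask = rng.random((128, 128)) > 0.5  # placeholder segmentation

for cell in regionprops(label(mask))[:5]:
    area = int(cell.area)
    perim = float(cell.perimeter)
    form_factor = 4 * np.pi * area / perim**2 if perim > 0 else 0.0
    print(f"area={area:4d} px  perimeter={perim:7.2f} px  "
          f"form factor={form_factor:.3f}  eccentricity={cell.eccentricity:.3f}")
```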

Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

2015-01-01

245

Electrical equipment performance under severe accident conditions (BWR/Mark 1 plant analysis): Summary report  

SciTech Connect

The purpose of the Performance Evaluation of Electrical Equipment during Severe Accident States Program is to determine the performance of electrical equipment, important to safety, under severe accident conditions. In FY85, a method was devised to identify important electrical equipment and the severe accident environments in which the equipment was likely to fail. This method was used to evaluate the equipment and severe accident environments for Browns Ferry Unit 1, a BWR/Mark I. Following this work, a test plan was written in FY86 to experimentally determine the performance of one selected component in two severe accident environments.

Bennett, P.R.; Kolaczkowski, A.M.; Medford, G.T.

1986-09-01

246

Of Disasters and Dragon Kings: A Statistical Analysis of Nuclear Power Incidents & Accidents  

E-print Network

We provide, and perform a risk theoretic statistical analysis of, a dataset that is 75 percent larger than the previous best dataset on nuclear incidents and accidents, comparing three measures of severity: INES (International Nuclear Event Scale), radiation released, and damage dollar losses. The annual rate of nuclear accidents, with size above 20 Million US$, per plant, decreased from the 1950s until dropping significantly after Chernobyl (April, 1986). The rate is now roughly stable at 0.002 to 0.003, i.e., around 1 event per year across the current fleet. The distribution of damage values changed after Three Mile Island (TMI; March, 1979), where moderate damages were suppressed but the tail became very heavy, being described by a Pareto distribution with tail index 0.55. Further, there is a runaway disaster regime, associated with the "dragon-king" phenomenon, amplifying the risk of extreme damage. In fact, the damage of the largest event (Fukushima; March, 2011) is equal to 60 percent of the total damag...
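The heavy-tail claim can be made concrete with a maximum-likelihood (Hill-type) fit of a Pareto tail index. The sketch below uses synthetic data and an assumed threshold; the paper's reported index (~0.55) came from its real damage dataset, which is not reproduced here.

```python
# Sketch: maximum-likelihood (Hill) estimate of a Pareto tail index
# alpha from exceedances over a threshold x_min. Synthetic data only.
import numpy as np

rng = np.random.default_rng(42)
alpha_true, x_min = 0.55, 20.0  # damages in million US$
damages = x_min * (1.0 - rng.random(500)) ** (-1.0 / alpha_true)  # Pareto draws

tail = damages[damages >= x_min]
alpha_hat = tail.size / np.log(tail / x_min).sum()
print(f"estimated tail index: {alpha_hat:.2f} (true {alpha_true})")
```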

Wheatley, Spencer; Sornette, Didier

2015-01-01

247

Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
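One common way to aggregate elicited distributions is an equal-weight mixture of the individual expert distributions. The sketch below shows that convention with invented lognormal parameters; it is an illustration of mixture aggregation in general, not the specific protocol used in the NRC/CEC study.

```python
# Sketch: equal-weight aggregation of expert-elicited uncertainty
# distributions by mixture sampling. The three "expert" lognormal
# (mu, sigma) pairs are invented placeholders.
import numpy as np

rng = np.random.default_rng(1)
experts = [(0.0, 0.5), (0.3, 0.8), (-0.2, 0.4)]

def sample_aggregate(n):
    """Draw n samples from the equal-weight mixture of expert lognormals."""
    idx = rng.integers(len(experts), size=n)
    mu = np.array([experts[i][0] for i in idx])
    sigma = np.array([experts[i][1] for i in idx])
    return rng.lognormal(mu, sigma)

agg = sample_aggregate(100_000)
print("aggregated 5th/50th/95th percentiles:",
      np.percentile(agg, [5, 50, 95]).round(3))
```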

Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

248

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1998-04-01

249

Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

1997-12-01

250

DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS  

SciTech Connect

Large fuel casks present challenges when evaluating their performance under the Hypothetical Accident Conditions (HAC) specified in the Code of Federal Regulations Title 10 Part 71 (10CFR71). Testing is often limited by cost, the difficulty of preparing test units, and the limited availability of facilities that can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the Hypothetical Accident Conditions of a 30-foot lateral drop followed by a 40-inch lateral puncture, as specified in 10CFR71. The structural integrity of the containment vessel is justified based on the analytical results in comparison with the stress criteria specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damage caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture is compared with the package test data. The analytical results are in good agreement with the test results.

Wu, T

2008-04-30

251

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

252

PWR loss-of-coolant accident analysis capability of the WRAP-EM system  

SciTech Connect

The modular computational system known as the Water Reactor Analysis Package (WRAP) has been extended to provide the computational tools required to perform a complete analysis of loss-of-coolant accidents (LOCAs) in pressurized water reactors (PWR). The new system is known as the WRAP-EM (Evaluation Model) system and will be used by NRC to interpret and evaluate reactor vendor EM methods and computed results. The system for PWR-EM analysis is comprised of several computer codes which have been developed to analyze a particular phase of a LOCA. These codes include GAPCON for calculation of initial fuel conditions, WRAP (the previously developed SRL analog of RELAP4/MOD5) for analysis of the system blowdown and refill, the FLOOD option in WRAP for analysis of the reflood phase, and FRAP for the calculation of the behavior of the hot fuel pin. In addition, a PWR steady-state initialization procedure has been developed to provide the initial operating state of the reactor system. The PWR-EM system is operational and is being evaluated to determine the adequacy and consistency of the physical models employed for EM analysis.

Gregory, M.V.; Beranek, F.

1980-08-01

253

Accident management information needs  

SciTech Connect

In support of the US Nuclear Regulatory Commission (NRC) Accident Management Research Program, a methodology has been developed for identifying the plant information needs necessary for personnel involved in the management of an accident to diagnose that an accident is in progress, select and implement strategies to prevent or mitigate the accident, and monitor the effectiveness of these strategies. This report describes the methodology and presents an application of this methodology to a Pressurized Water Reactor (PWR) with a large dry containment. A risk-important severe accident sequence for a PWR is used to examine the capability of the existing measurements to supply the necessary information. The method includes an assessment of the effects of the sequence on the measurement availability including the effects of environmental conditions. The information needs and capabilities identified using this approach are also intended to form the basis for more comprehensive information needs assessment performed during the analyses and development of specific strategies for use in accident management prevention and mitigation. 3 refs., 16 figs., 7 tabs.

Hanson, D.J.; Ward, L.W.; Nelson, W.R.; Meyer, O.R. (EG and G Idaho, Inc., Idaho Falls, ID (USA))

1990-04-01

254

The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents  

NASA Technical Reports Server (NTRS)

In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities are actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model is comprised of causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human related causal factors. The preliminary results from the baseline LOCAF model are also presented.
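A Bayesian belief network of the kind described can be illustrated with a minimal two-cause example. The structure and every probability below are invented for demonstration; this is not the LOCAF model or its data.

```python
# Sketch: a minimal Bayesian belief network for a loss-of-control (LOC)
# event with two parents, crew error (E) and system failure (F).
# All CPT numbers are hypothetical placeholders.
P_E = 0.02   # P(crew error)
P_F = 0.005  # P(system component failure)
P_LOC = {    # P(LOC | E, F)
    (True, True): 0.30, (True, False): 0.02,
    (False, True): 0.05, (False, False): 1e-5,
}

p_loc = sum(
    P_LOC[(e, f)]
    * (P_E if e else 1 - P_E)
    * (P_F if f else 1 - P_F)
    for e in (True, False)
    for f in (True, False)
)
print(f"marginal P(LOC) = {p_loc:.2e}")
```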

Ancel, Ersin; Shih, Ann T.

2012-01-01

255

A Mixed Methodology Analysis of Co-Teacher Assessments  

ERIC Educational Resources Information Center

A mixed methodology approach was used to study the relationship between general and special educators who were co-teaching. On two co-teacher assessment instruments, sums of ratings from special educators and general elementary and secondary educators in an urban multicultural school district in the southeastern USA were similar to those obtained…

Cramer, Elizabeth; Nevin, Ann

2006-01-01

256

A Methodology for Architecture-Level Reliability Risk Analysis  

Microsoft Academic Search

Risk assessment is an essential process of every software risk management plan. Several risk assessment techniques are based on the subjective judgement of domain experts. Subjective risk assessment techniques are human-intensive and error-prone. Risk assessment should be based on product attributes that we can quantitatively measure using product metrics. This paper presents a methodology for reliability risk assessment at…

Sherif M. Yacoub; Hany H. Ammar

2002-01-01

257

Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis  

ERIC Educational Resources Information Center

The article suggests that though narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to some recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

Abdallah, Mahmoud Mohammad Sayed

2009-01-01

258

Analysis and testing of the SELECS methodology, and users guide to SELECS software. Final report  

SciTech Connect

The purpose of this report is twofold: first, to describe the analysis and testing of the SELECS methodology for anomalies, and second, to present a users guide for the software implementation of the SELECS methodology. The report consists of two major sections. The first section contains: a mathematical analysis of the SELECS methodology, in which expressions are derived for the sensitivities of site and process scores; a description of the software for testing the SELECS methodology; and a description and analysis of computer test runs of the SELECS methodology, using randomized data for 167 test runs. The second section, the users guide, contains: a description of the software programs SELECS and CURVIA; and instructions for executing SELECS, CURVIA, and SELRND.

Kramek, R G; Purcupile, J C

1982-01-01

259

Accident safety analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility  

SciTech Connect

The purpose of the accident safety analysis is to identify and analyze a range of credible events, their causes and consequences, and to provide technical justification for the conclusion that, without undue risk to the public, employees, or the environment: uranium billets, fuel assemblies, uranium scrap, and drums of chips and fines can be safely stored in the 300 Area N Reactor Fuel Fabrication and Storage Facility; the contaminated equipment, High-Efficiency Particulate Air filters, ductwork, stacks, sewers, and sumps can be cleaned (decontaminated) and/or removed; the new concretion process in the 304 Building will be able to operate; and limited fuel handling and packaging associated with removal of stored uranium is acceptable.

Johnson, D.J.; Brehm, J.R.

1994-01-01

260

Modeling & analysis of criticality-induced severe accidents during refueling for the Advanced Neutron Source Reactor  

SciTech Connect

This paper describes work done at the Oak Ridge National Laboratory (ORNL) to evaluate the potential for, and resulting consequences of, a hypothetical criticality accident during refueling of the 330-MW Advanced Neutron Source (ANS) research reactor. The development of an analytical capability is described. Modeling and problem formulation were conducted using concepts of reactor neutronic theory for determining power level escalation, coupled with ORIGEN and MELCOR code simulations for radionuclide buildup and containment transport. Gaussian plume transport modeling was done to determine off-site radiological consequences. Nuances associated with modeling this blast-type scenario are described. Analysis results for ANS containment response under a variety of postulated scenarios and containment failure modes are presented. It is demonstrated that individuals at the reactor site boundary will not receive doses beyond regulatory limits for any of the containment configurations studied.

Georgevich, V.; Kim, S.H.; Taleyarkhan, R.P.; Jackson, S.

1992-10-01

261

Analysis of Radionuclide Releases from the Fukushima Dai-Ichi Nuclear Power Plant Accident Part I  

NASA Astrophysics Data System (ADS)

Part I of this publication deals with the analysis of fission product releases consecutive to the Fukushima Dai-ichi accident. Reactor core damage is assessed relying on radionuclide detections performed by the CTBTO radionuclide network, especially at the particulate station located at Takasaki, 210 km away from the nuclear power plant. On the basis of a comparison between the reactor core inventory at the time of reactor shutdowns and the fission product activities measured in air at Takasaki, especially 95Nb and 103Ru, it was possible to show that the reactor cores were exposed to high temperature for a prolonged time. This diagnosis was confirmed by the presence of 113Sn in air at Takasaki. The assessed 133Xe release at the time of reactor shutdown (8 × 10^18 Bq) turned out to be on the order of 80% of the amount deduced from the reactor core inventories. This strongly suggests a broad meltdown of the reactor cores.
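The 80% figure is a simple ratio of the assessed release to the core inventory; the sketch below makes that arithmetic explicit, with the inventory value inferred from the stated ratio rather than taken from the paper.

```python
# Sketch: back-of-envelope release fraction for 133Xe. The release
# estimate is from the abstract above; the inventory is an assumed
# value implied by the stated "about 80%" ratio.
release_bq   = 8.0e18  # assessed 133Xe release, Bq
inventory_bq = 1.0e19  # implied core inventory, Bq (assumed)
print(f"released fraction: {release_bq / inventory_bq:.0%}")
```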

Le Petit, G.; Douysset, G.; Ducros, G.; Gross, P.; Achim, P.; Monfort, M.; Raymond, P.; Pontillon, Y.; Jutier, C.; Blanchard, X.; Taffary, T.; Moulin, C.

2014-03-01

262

A rational design change methodology based on experimental and analytical modal analysis  

SciTech Connect

A design methodology that integrates analytical modeling and experimental characterization is presented. This methodology represents a powerful tool for making rational design decisions and changes. An example of its implementation in the design, analysis, and testing of a precision machine tool support structure is given.

Weinacht, D.J.; Bennett, J.G.

1993-08-01

263

Learning from Teachable Moments: Methodological Lessons from the Secondary Analysis of the TIMSS Video Study.  

ERIC Educational Resources Information Center

The Secondary Analysis of TIMSS (Third International Mathematics and Science Study) Video Data study used TIMSS Videotape Classroom Study data and vPrism software to achieve methodological and substantive findings on teacher quality, instructional practices, and classroom interactions. It discussed methodological and technical issues arising…

Arafeh, Sousan; Smerdon, Becky; Snow, Stephanie

264

Cost-Effectiveness Analysis of Prenatal Diagnosis: Methodological Issues and Concerns  

Microsoft Academic Search

With increasing concerns regarding rapidly expanding health care costs, cost-effectiveness analysis (CEA) provides a methodology to assess whether marginal gains from new technology are worth the increased costs. In the arena of prenatal diagnosis, particular methodological and ethical concerns include whether the effects of such testing on individuals other than the patient are included, how termination of pregnancy is included

Aaron B. Caughey

2005-01-01

265

Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document  

SciTech Connect

The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed the General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population as a result of postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release, and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.
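A source-term table of the kind described pairs each release with its probability, which allows simple aggregate measures to be computed. The sketch below uses invented placeholder numbers, not Galileo FSAR values.

```python
# Sketch: aggregating source terms into summary risk measures. Each
# tuple is (probability of occurrence, fuel released in grams); all
# numbers are invented placeholders.
source_terms = [
    (1e-3, 5.0),    # e.g., on-pad accident
    (5e-4, 50.0),   # e.g., ascent failure
    (1e-5, 500.0),  # e.g., reentry release
]
expected_release = sum(p * q for p, q in source_terms)
total_prob = sum(p for p, _ in source_terms)
print(f"P(any release)   = {total_prob:.2e}")
print(f"expected release = {expected_release:.3f} g")
```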

Not Available

1988-12-15

266

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
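The Gaussian plume model referred to above has a standard closed form for ground-level concentration from an elevated release with ground reflection. The sketch below shows that generic formula; the power-law dispersion coefficients are rough textbook-style assumptions, not the MACCS/COSYMA parameterization.

```python
# Sketch: ground-level concentration from the bivariate Gaussian plume
# model with reflection at the ground. Dispersion coefficient forms
# and constants are generic assumptions for illustration.
import math

def concentration(Q, u, x, y, H, a=0.08, b=0.06):
    """chi(x, y, z=0) in Bq/m^3 for release rate Q (Bq/s), wind u (m/s)."""
    sigma_y = a * x ** 0.9   # crosswind spread, m (assumed form)
    sigma_z = b * x ** 0.85  # vertical spread, m (assumed form)
    return (Q / (2 * math.pi * sigma_y * sigma_z * u)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * 2 * math.exp(-H**2 / (2 * sigma_z**2)))  # ground reflection

print(f"{concentration(Q=1e9, u=5.0, x=1000.0, y=0.0, H=50.0):.3e} Bq/m^3")
```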

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

1995-01-01

267

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

1995-01-01

268

Full-Envelope Launch Abort System Performance Analysis Methodology  

NASA Technical Reports Server (NTRS)

The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
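The core idea, sampling the abort initiation condition as just another dispersed parameter, can be shown with a toy Monte Carlo loop. Everything below (the distributions, the stand-in performance metric) is invented for illustration and bears no relation to the actual LAS simulation.

```python
# Sketch: "full-envelope" Monte Carlo dispersion in which the abort
# initiation time is sampled along with other dispersed parameters,
# rather than held at discrete conditions. Toy model only.
import random

random.seed(0)

def abort_miss_distance(t_abort, thrust_factor, wind):
    """Toy stand-in for an abort simulation's figure of merit."""
    return abs(50.0 - t_abort) * 0.1 + (1.0 - thrust_factor) * 40.0 + wind * 0.5

results = []
for _ in range(10_000):
    t_abort = random.uniform(0.0, 120.0)  # dispersed, not discrete
    thrust = random.gauss(1.0, 0.02)      # motor performance
    wind = abs(random.gauss(0.0, 10.0))   # m/s
    results.append((abort_miss_distance(t_abort, thrust, wind), t_abort))

worst = max(results)
print(f"worst-case metric {worst[0]:.1f} at abort time {worst[1]:.1f} s")
```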

Aubuchon, Vanessa V.

2014-01-01

269

Assessment of scaling uncertainties for PWR plant large-break LOCA (loss-of-coolant accident) analysis  

Microsoft Academic Search

This report assesses the effect of scale on key processes which affect the large loss-of-coolant accident (LOCA), in order to evaluate the uncertainty in computer codes used to model LOCA phenomena. First, the important processes are identified, based on experience gained from analysis of the PWR LOCA using realistic computer models. Experiments which simulate these processes at different scales are

S. K. Chow; L. E. Hochreiter; H. C. Yeh; M. Y. Young

1989-01-01

270

Injuries to New Zealanders participating in adventure tourism and adventure sports: an analysis of Accident Compensation Corporation (ACC) claims  

Microsoft Academic Search

Aims The aim of this study was to examine the involvement of adventure tourism and adventure sports activity in injury claims made to the Accident Compensation Corporation (ACC). Methods Epidemiological analysis of ACC claims for the period, July 2004 to June 2005, where adventure activities were involved in the injury. Results 18,697 adventure tourism and adventure sports injury claims were

Tim Bentley; Keith Mack; Jo Edwards

271

A comparative analysis of accident risks in fossil, hydro, and nuclear energy chains  

SciTech Connect

This study presents a comparative assessment of severe accident risks in the energy sector, based on the historical experience of fossil (coal, oil, natural gas, and LPG (Liquefied Petroleum Gas)) and hydro chains contained in the comprehensive Energy-related Severe Accident Database (ENSAD), as well as Probabilistic Safety Assessment (PSA) for the nuclear chain. Full energy chains were considered because accidents can take place at every stage of the chain. Comparative analyses for the years 1969-2000 included a total of 1870 severe (≥ 5 fatalities) accidents, amounting to 81,258 fatalities. Although 79.1% of all accidents and 88.9% of associated fatalities occurred in less developed, non-OECD countries, industrialized OECD countries dominated insured losses (78.0%), reflecting their substantially higher insurance density and stricter safety regulations. Aggregated indicators and frequency-consequence (F-N) curves showed that energy-related accident risks in non-OECD countries are distinctly higher than in OECD countries. Hydropower in non-OECD countries and upstream stages within fossil energy chains are most accident-prone. Expected fatality rates are lowest for Western hydropower and nuclear power plants; however, the maximum credible consequences can be very large. Total economic damages due to severe accidents are substantial, but small when compared with natural disasters. Similarly, external costs associated with severe accidents are generally much smaller than monetized damages caused by air pollution.
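An empirical F-N curve of the kind used here is just the annual frequency of accidents with at least N fatalities, plotted against N. The sketch below builds one from an invented fatality list; the ENSAD data are not reproduced.

```python
# Sketch: an empirical frequency-consequence (F-N) curve from a list
# of severe-accident fatality counts. Fatality data are invented.
import numpy as np

fatalities = np.array([5, 7, 12, 31, 5, 9, 250, 64, 18, 5, 43, 11])  # assumed
years_observed = 32  # e.g., 1969-2000

thresholds = np.unique(fatalities)
for n in thresholds:
    f = (fatalities >= n).sum() / years_observed
    print(f"N >= {n:4d}: {f:.3f} events/year")
```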

Burgherr, P.; Hirschberg, S. [Paul Scherrer Institute, Villigen (Switzerland)

2008-07-01

272

Assessment of ISLOCA risk-methodology and application to a combustion engineering plant  

SciTech Connect

Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Combustion Engineering plant.

Kelly, D.L.; Auflick, J.L.; Haney, L.N. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

1992-04-01

273

Building Energy Performance Analysis of an Academic Building Using IFC BIM-Based Methodology  

E-print Network

This paper discusses the potential to use an Industry Foundation Classes (IFC)/Building Information Modelling (BIM) based method to undertake Building Energy Performance analysis of an academic building. BIM/IFC based methodology provides a...

Aziz, Z.; Arayici, Y.; Shivachev, D.

2012-01-01

274

Development and validation of a generalised engineering methodology for thermal analysis of structural members in fire   

E-print Network

A novel methodology for generalising CFD-based approaches for thermal analysis of protected steelwork in fire has been developed, known as GeniSTELA. This is a quasi-3D approach with computation of a "steel temperature ...

Liang, Hong; Welch, Stephen; Stratford, Tim J; Kinsella, Emmett V

275

78 FR 29353 - Federal Need Analysis Methodology for the 2014-15 Award Year-Federal Pell Grant, Federal Perkins...  

Federal Register 2010, 2011, 2012, 2013, 2014

...DEPARTMENT OF EDUCATION Federal Need Analysis Methodology for the 2014-15 Award Year-- Federal...the statutory Federal Need Analysis Methodology that determines a student's expected...Department uses in the Federal Need Analysis Methodology to determine the EFC. Section 478...

2013-05-20

276

TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events  

SciTech Connect

The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of the main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented herein.

Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

2013-11-10

277

Insights from review and analysis of the Fukushima Dai-ichi accident  

Microsoft Academic Search

An unprecedented earthquake and tsunami struck the Fukushima Dai-ichi Nuclear Power Plants on 11 March 2011. Although extensive efforts have been continuing on investigations into the causes and consequences of the accident, and the Japanese Government has presented a comprehensive report on the accident in the IAEA Ministerial Conference held in June 2011, there is still much to be clarified

Masashi Hirano; Taisuke Yonomoto; Masahiro Ishigaki; Norio Watanabe; Yu Maruyama; Yasuteru Sibamoto; Tadashi Watanabe; Kiyofumi Moriyama

2012-01-01

278

PRESENTATION OF WORK PERFORMED BY ESReDA "ACCIDENT ANALYSIS" WORKING GROUP  

E-print Network

on accidents to define a safety policy in relationship with regulations, insurance, economic and social costs will be devoted to issuing a review document of pertinent databases on accidents developed in Europe in order and to the regulations for power plants and transportation. It could also be assumed that any database System

Paris-Sud XI, Université de

279

Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design  

NASA Technical Reports Server (NTRS)

A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.

Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

2004-01-01

280

Operator error and system deficiencies: analysis of 508 mining incidents and accidents from Queensland, Australia using HFACS.  

PubMed

Historically, mining has been viewed as an inherently high-risk industry. Nevertheless, the introduction of new technology and a heightened concern for safety have yielded marked reductions in accident and injury rates over the last several decades. In an effort to further reduce these rates, the human factors associated with incidents/accidents need to be addressed. A modified version of the Human Factors Analysis and Classification System was used to analyze incident and accident cases from across the state of Queensland to identify human factor trends and system deficiencies within mining. An analysis of the data revealed that skill-based errors were the most common unsafe act and showed no significant differences across mine types. However, decision errors did vary across mine types. Findings for unsafe acts were consistent across the time period examined. By illuminating human causal factors in a systematic fashion, this study has provided mine safety professionals the information necessary to further reduce mine incidents/accidents. PMID:20441855
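Whether an HFACS category varies across mine types is the sort of question a chi-square test of independence answers. The contingency table below is hypothetical; the Queensland counts are not reproduced here.

```python
# Sketch: chi-square test of whether HFACS unsafe-act categories vary
# by mine type. All counts are hypothetical placeholders.
from scipy.stats import chi2_contingency

#            skill-based  decision  perceptual  violations
counts = [
    [120, 45, 10, 25],  # underground coal (assumed)
    [130, 70, 12, 22],  # open-cut metalliferous (assumed)
    [115, 30,  8, 21],  # quarries (assumed)
]
chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
```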

Patterson, Jessica M; Shappell, Scott A

2010-07-01

281

Application of 3D documentation and geometric reconstruction methods in traffic accident analysis: With high resolution surface scanning, radiological MSCT/MRI scanning and real data based animation  

Microsoft Academic Search

The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the situation of the impact. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides the…

Ursula Buck; Silvio Naether; Marcel Braun; Stephan Bolliger; Hans Friederich; Christian Jackowski; Emin Aghayev; Andreas Christe; Peter Vock; Richard Dirnhofer; Michael J. Thali

2007-01-01

282

Overview of the Aerothermodynamics Analysis Conducted in Support of the STS-107 Accident Investigation  

NASA Technical Reports Server (NTRS)

A graphic presentation of the aerothermodynamics analysis conducted in support of the STS-107 accident investigation. Investigation efforts were conducted as part of an integrated AATS team (Aero, Aerothermal, Thermal, Stress) directed by OVEWG. Graphics presented are: STS-107 Entry trajectory and timeline (1st off-nominal event to Post-LOS); Indications from OI telemetry data; Aero/aerothermo/thermal analysis process; Selected STS-107 side fuselage/OMS pod off-nominal temperatures; Leading edge structural subsystem; Relevant forensics evidence; External aerothermal environments; STS-107 Pre-entry EOM3 heating profile; Surface heating and temperatures; Orbiter wing leading edge damage survey; Internal aerothermal environments; Orbiter wing CAD model; Aerodynamic flight reconstruction; Chronology of aerodynamic/aerothermoydynamic contributions; Acreage TPS tile damage; Larger OML perturbations; Missing RCC panel(s); Localized damage to RCC panel/missing T-seal; RCC breach with flow ingestion; and Aero-aerothermal closure. NAIT served as the interface between the CAIB and NASA investigation teams; and CAIB requests for study were addressed.

Campbell, Charles H.

2004-01-01

283

Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter  

NASA Technical Reports Server (NTRS)

Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

2010-01-01

284

Interpretation methodology and analysis of in-flight lightning data  

NASA Technical Reports Server (NTRS)

A methodology is presented whereby electromagnetic measurements of inflight lightning stroke data can be understood and extended to other aircraft. Recent measurements made on the NASA F106B aircraft indicate that sophisticated numerical techniques and new developments in corona modeling are required to fully understand the data. Thus the problem is nontrivial and successful interpretation can lead to a significant understanding of the lightning/aircraft interaction event. This is of particular importance because of the problem of lightning induced transient upset of new technology low level microcircuitry which is being used in increasing quantities in modern and future avionics. Inflight lightning data is analyzed and lightning environments incident upon the F106B are determined.

Rudolph, T.; Perala, R. A.

1982-01-01

285

Task analysis of nuclear-power-plant control-room crews: project approach methodology  

Microsoft Academic Search

A task analysis of nuclear-power-plant control-room crews was performed by General Physics Corporation and BioTechnology, Inc., for the Office of Nuclear Regulatory Research. The task-analysis methodology used in the project is discussed and compared to traditional task-analysis and job-analysis methods. The objective of the project was to conduct a crew task analysis that would provide data for evaluating six areas:

D. Burgy; C. Lempges; A. Miller; L. Schroeder; H. Van Cott; B. Paramore

1983-01-01

286

Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology  

NASA Technical Reports Server (NTRS)

The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods with a degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.
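For context, the sketch below shows the generic local Lax-Friedrichs numerical flux for Burgers' equation, a standard stable flux usable in discontinuous Galerkin schemes. It is shown only to illustrate what a numerical flux is; it is not one of the new flux formulas derived in the paper.

```python
# Sketch: local Lax-Friedrichs numerical flux for Burgers' equation
# u_t + (u^2/2)_x = 0, evaluated from the two interface trace values.
def f(u):
    return 0.5 * u * u

def llf_flux(u_left, u_right):
    """Numerical flux at an element interface."""
    s = max(abs(u_left), abs(u_right))  # local wave-speed bound
    return 0.5 * (f(u_left) + f(u_right)) - 0.5 * s * (u_right - u_left)

# Stationary shock (u_left=1, u_right=-1): prints 1.5; LLF is more
# dissipative than the exact (Godunov) interface flux of 0.5.
print(llf_flux(1.0, -1.0))
```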

Atkins, H. L.

1997-01-01

287

The Role of Materials Degradation and Analysis in the Space Shuttle Columbia Accident Investigation  

NASA Technical Reports Server (NTRS)

The efforts following the loss of the Space Shuttle Columbia included debris recovery, reconstruction, and analysis. The debris was subjected to myriad quantitative and semiquantitative chemical analysis techniques, ranging from examination via the scanning electron microscope (SEM) with energy dispersive spectrometer (EDS) to X-Ray diffraction (XRD) and electron probe micro-analysis (EPMA). The results from the work with the debris helped the investigators determine the location where a breach likely occurred in the leading edge of the left wing during lift off of the Orbiter from the Kennedy Space Center. Likewise, the information evidenced by the debris was also crucial in ascertaining the path of impinging plasma flow once it had breached the wing. After the Columbia Accident Investigation Board (CAIB) issued its findings, the major portion of the investigation was concluded. However, additional work remained to be done on many pieces of debris from portions of the Orbiter which were not directly related to the initial impact during ascent. This subsequent work was not only performed in the laboratory, but was also performed with portable equipment, including examination via portable X-Ray fluorescence (XRF) and Fourier transform infrared spectroscopy (FTIR). Likewise, acetate and silicon-rubber replicas of various fracture surfaces were obtained for later macroscopic and fractographic examination. This paper will detail the efforts and findings from the initial investigation, as well as present results obtained by the later examination and analysis of debris from the Orbiter including its windows, bulkhead structures, and other components which had not been examined during the primary investigation.

McDanels, Steven J.

2006-01-01

288

Adapting Job Analysis Methodology to Improve Evaluation Practice  

ERIC Educational Resources Information Center

This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

Jenkins, Susan M.; Curtin, Patrick

2006-01-01

289

Adapting Job Analysis Methodology to Improve Evaluation Practice  

Microsoft Academic Search

This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs administer programs and provide program services, interpretation of outcome data, make comparisons

Susan M. Jenkins; Patrick Curtin

2006-01-01

290

Behavior of a heterogeneous annular FBR core during an unprotected loss of flow accident: Analysis of the primary phase with SAS-SFR  

SciTech Connect

In the framework of a substantial improvement in FBR core safety connected to the development of a new Gen IV reactor type, heterogeneous cores with innovative features have been carefully analyzed in France since 2009. At EDF R and D, the main goal is to understand whether a strong reduction of the Na-void worth - possibly attempting a negative value - allows a significant improvement of the core behavior during an unprotected loss of flow accident. The physical behavior of such a core, before and beyond the (possible) onset of Na boiling, is also of interest. Hence, a cutting-edge heterogeneous design, featuring an annular shape, Na plena with a B4C plate, and a stepwise modulation of fissile core heights, was developed at EDF by means of the SDDS methodology, with a total Na-void worth of -1 $. The behavior of such a core during the primary phase of a severe accident, initiated by an unprotected loss of flow, is analyzed by means of the SAS-SFR code. This study was carried out at KIT and EDF, in the framework of a scientific collaboration on innovative FBR severe accident analyses. The results show that the reduction of the Na-void worth is very effective, but is not sufficient alone to avoid Na boiling and, hence, to prevent the core from entering the primary phase of a severe accident. Nevertheless, the grace time up to boiling onset is greatly enhanced in comparison to a more traditional homogeneous core design, and only an extremely low fraction of the fuel (<0.1%) melts by the end of this phase. A sensitivity analysis shows that, due to the inherent neutronic characteristics of such a core, the gagging scheme plays a major role in the core behavior: indeed, an improved 4-zone gagging scheme, associated with an enhanced control rod drive line expansion feed-back effect, finally prevents the core from entering sodium boiling. This major conclusion highlights both the progress already accomplished and the need for more detailed future analyses, particularly concerning the neutronic burn-up scheme, the modeling of the diagrid effect and the control rod drive line expansion feed-backs, and the primary/secondary systems thermal-hydraulics behavior. (authors)

Massara, S.; Schmitt, D.; Bretault, A.; Lemasson, D.; Darmet, G.; Verwaerde, D. [EDF R and D, 1, Avenue du General de Gaulle, 92141 Clamart (France); Struwe, D.; Pfrang, W.; Ponomarev, A. [Karlsruher Institut fuer Technologie KIT, Institut fuer Neutronenphysik und Reaktortechnik INR, Hermann-von-Helmholtz-Platz 1, Gebaude 521, 76344 Eggenstein-Leopoldshafen (Germany)

2012-07-01

291

Multi-Attribute Decision Theory methodology for pollution control measure analysis  

SciTech Connect

A methodology based on Multi-Attribute Decision Theory was developed to prioritize air pollution control measures and strategies (sets of measures) for the Mexico City Metropolitan Area (MCMA). We have developed a framework that takes into account economic, technical-feasibility, environmental, social, political, and institutional factors to evaluate pollution mitigation measures and strategies utilizing a decision analysis process. In a series of meetings with a panel of experts in air pollution from different offices of the Mexican government, we developed General and Specific criteria for a decision analysis tree. With these tools the measures or strategies can be graded and a figure of merit assigned to each of them, so they can be ranked. Two pollution mitigation measures were analyzed to test the methodology, and the results are presented. This methodology was developed specifically for Mexico City, though the experience gained in this work can be used to develop similar methodologies for other metropolitan areas throughout the world.
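
As a concrete illustration of the figure-of-merit ranking described above, the Python sketch below scores two hypothetical measures against weighted criteria and sorts them. The criteria names, weights, and scores are invented for the example; the study's actual decision-analysis tree may aggregate differently.

    # Weighted-sum figure of merit (assumed form, for illustration only).
    def figure_of_merit(scores, weights):
        assert abs(sum(weights.values()) - 1.0) < 1e-9
        return sum(weights[c] * scores[c] for c in weights)

    weights = {"economic": 0.25, "technical": 0.20, "environmental": 0.30,
               "social": 0.15, "political_institutional": 0.10}
    measure_a = {"economic": 6, "technical": 8, "environmental": 9,
                 "social": 5, "political_institutional": 7}
    measure_b = {"economic": 8, "technical": 7, "environmental": 5,
                 "social": 6, "political_institutional": 8}
    ranking = sorted({"A": figure_of_merit(measure_a, weights),
                      "B": figure_of_merit(measure_b, weights)}.items(),
                     key=lambda kv: kv[1], reverse=True)
    print(ranking)  # highest figure of merit first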

Barrera Roldan, A.S.; Corona Juarez, A. (Instituto Mexicano de Petroleo, Mexico City (Mexico)); Hardie, R.W.; Thayer, G.R. (Los Alamos National Lab., NM (United States))

1992-01-01

292

Multi-Attribute Decision Theory methodology for pollution control measure analysis  

SciTech Connect

A methodology based on Multi-Attribute Decision Theory was developed to prioritize air pollution control measures and strategies (sets of measures) for the Mexico City Metropolitan Area (MCMA). We have developed a framework that takes into account economic, technical-feasibility, environmental, social, political, and institutional factors to evaluate pollution mitigation measures and strategies utilizing a decision analysis process. In a series of meetings with a panel of experts in air pollution from different offices of the Mexican government, we developed General and Specific criteria for a decision analysis tree. With these tools the measures or strategies can be graded and a figure of merit assigned to each of them, so they can be ranked. Two pollution mitigation measures were analyzed to test the methodology, and the results are presented. This methodology was developed specifically for Mexico City, though the experience gained in this work can be used to develop similar methodologies for other metropolitan areas throughout the world.

Barrera Roldan, A.S.; Corona Juarez, A. [Instituto Mexicano de Petroleo, Mexico City (Mexico); Hardie, R.W.; Thayer, G.R. [Los Alamos National Lab., NM (United States)

1992-12-31

293

Progressive Failure Analysis Methodology for Laminated Composite Structures  

NASA Technical Reports Server (NTRS)

A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important, as they may cause failure mechanisms such as debonding or delamination.
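
A minimal sketch of a progressive-failure loop of the general kind described above, using a maximum-strain check and a stiffness knockdown on a one-dimensional layup; the allowables, knockdown factor, and load schedule are invented and do not come from the paper.

    # Progressive failure on a toy 1-D laminate: step the running load,
    # compute the membrane strain, fail plies that exceed their allowable,
    # and degrade their stiffness (a simple ply-discount rule).
    plies = [{"E": 140e9, "t": 1.25e-4, "eps_allow": 0.008 + 0.001 * i,
              "failed": False} for i in range(8)]

    def axial_stiffness(plies):
        return sum(p["E"] * p["t"] for p in plies)  # N/m

    N = 0.0                                   # applied running load (N/m)
    while N < 2.5e6:
        N += 2.5e4
        eps = N / axial_stiffness(plies)      # uniform membrane strain
        for p in plies:
            if not p["failed"] and eps > p["eps_allow"]:
                p["failed"] = True
                p["E"] *= 0.1                 # knock down the failed ply
        if all(p["failed"] for p in plies):
            print(f"final failure at N = {N:.3e} N/m")
            break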

Sleight, David W.

1999-01-01

294

Methodologies and techniques for analysis of network flow data  

SciTech Connect

Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow-based tools is an ongoing effort. This paper describes the most recent developments in flow analysis at Fermilab.

Bobyshev, A.; Grigoriev, M.; /Fermilab

2004-12-01

295

Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications  

PubMed Central

Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. The concentrations of volatile organic compounds (VOCs) in breath are assessed with acceptable accuracy by means of analytical techniques with high sensitivity, accuracy, precision, low response time, and low detection limit, which are desirable characteristics for the detection of VOCs in human breath. “Breath fingerprinting”, indicative of a specific clinical status, relies on the use of multivariate statistical methods with powerful in-built algorithms. The need for standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037

Lourenço, Célia; Turner, Claire

2014-01-01

296

Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis  

NASA Technical Reports Server (NTRS)

This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

Babcock, P.; Schor, A.; Rosch, G.

1998-01-01

297

Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology  

NASA Technical Reports Server (NTRS)

The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. Two primary objectives are included in this research activity: development of structural analysis methodology for predicting residual strength of fuselage shell-type structures; and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

Knight, Norman F.

1998-01-01

298

Accident analysis of large-scale technological disasters applied to an anaesthetic complication  

Microsoft Academic Search

The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of

Chris J. Eagle; Jan M. Davies; J. Reason

1992-01-01

299

Accident Analysis of Complex Systems Based on System Control for Safety  

Microsoft Academic Search

In modern complex systems such as chemical and nuclear plants, as hardware system reliability increases due to the advancement of technology, systemic failures such as software design errors become a significant contributor to system accidents. State-of-the-art computers have made many technology-based systems so complex that new types of accidents now result from dysfunctional interactions between system components, further adding

Takehisa Kohda

300

Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies  

Microsoft Academic Search

The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and

R. J. Budnitz; H. E. Lambert; E. E. Hill

1987-01-01

301

A methodology for the structural and functional analysis of signaling and regulatory networks  

Microsoft Academic Search

Background: Structural analysis of cellular interaction networks contributes to a deeper understanding of network-wide interdependencies, causal relationships, and basic functional capabilities. While the structural analysis of metabolic networks is a well-established field, similar methodologies have been scarcely developed and applied to signaling and regulatory networks. Results: We propose formalisms and methods, relying on adapted and partially newly introduced approaches, which

Steffen Klamt; Julio Saez-rodriguez; Jonathan A. Lindquist; Luca Simeoni; Ernst Dieter Gilles

2006-01-01

302

A Fast TCAD-based Methodology for Variation Analysis of Emerging Nano-Devices  

E-print Network

Variability analysis of nanoscale transistors and circuits is emerging as a necessity at advanced technology nodes […] TCAD solvers. In this paper, we propose an automated output prediction method suited for fast PV

Candea, George

303

A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children  

ERIC Educational Resources Information Center

Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

2012-01-01

304

Biomechanical analysis of occupant kinematics in rollover motor vehicle accidents: dynamic spit test.  

PubMed

A better understanding of occupant kinematics in rollover accidents helps to advance biomechanical knowledge and to enhance the safety features of motor vehicles. While many rollover accident simulation studies have adopted the static approach to delineate occupant kinematics in rollover accidents, very few studies have attempted the dynamic approach. The present work was designed to study the biomechanics of restrained occupants during rollover accidents using the steady-state dynamic spit test and to address the importance of keeping the lap belt fastened. Experimental tests were conducted using an anthropometric 50th-percentile Hybrid III dummy in a vehicle. The vehicle was rotated at 180 degrees/second and the dummy was restrained using a standard three-point restraint system. The lap belt of the dummy was fastened either by using the cinching latch plate or by locking the retractor. Three configurations of shoulder belt harness were simulated: shoulder belt loose on the chest with cinch plate, shoulder belt under the left arm, and shoulder belt behind the chest. In all tests, the dummy stayed within the confines of the vehicle, indicating that the securely fastened lap belt holds the dummy, with dynamic movement of 3 1/2" to 4". The results show that occupant movement in rollover accidents is least affected by the various shoulder harness positions when the lap belt is securely fastened. The present study forms a first step in delineating the biomechanics of occupants in rollover accidents. PMID:15850090

Sances, Anthony; Kumaresan, Srirangam; Clarke, Richard; Herbst, Brian; Meyer, Steve

2005-01-01

305

Analysis of the effectiveness of emergency countermeasures in the 30-km zone during the early phase of the Chernobyl accident  

SciTech Connect

Some radiation-emergency countermeasures, including evacuation, were implemented in the settlements of the 30-km zone during the early phase of the accident at the Chernobyl Nuclear Power Plant. These countermeasures are described and compared with the international recommendations. An analysis of the effectiveness of the emergency countermeasures was conducted based upon the results of a wide-scale public survey. Quantitative assessments of the effectiveness (dose reduction) of the countermeasures were derived. 9 refs., 2 figs.

Likhtarev, I.A.; Chumack, V.V.; Repin, V.S. [Ukrainian Scientific Center of Radiation Medicine, Kiev (Ukraine)

1994-11-01

306

Recent methodology in the phytochemical analysis of ginseng.  

PubMed

This review summarises the most recent developments in ginseng analysis, in particular the novel approaches in sample pre-treatment and the use of high-performance liquid-chromatography-mass spectrometry. The review also presents novel data on analysing ginseng extracts by nuclear magnetic resonance spectroscopy and high-resolution mass spectrometry (Fourier transform mass spectrometry) in the context of metabolomics. PMID:18058794

Angelova, Nadezhda; Kong, Hong-Wei; van der Heijden, Rob; Yang, Shih-Ying; Choi, Young Hae; Kim, Hye Kyong; Wang, Mei; Hankemeier, Thomas; van der Greef, Jan; Xu, Guowang; Verpoorte, Rob

2008-01-01

307

A fault injection analysis of Virtex FPGA TMR design methodology  

Microsoft Academic Search

This paper presents the meaningful results of a single bit upset fault injection analysis performed in Virtex FPGA triple modular redundancy (TMR) design. Each programmable bit upset able to cause an error in the TMR design has been investigated. Final conclusion using the TMR
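
For readers unfamiliar with TMR, the sketch below shows the mechanism such fault-injection campaigns exercise: a bitwise majority voter that masks any single-replica upset. It is a generic illustration, not the paper's design.

    # Bitwise majority voter, the core of triple modular redundancy (TMR).
    def tmr_vote(a: int, b: int, c: int) -> int:
        return (a & b) | (a & c) | (b & c)

    golden = 0b1011
    upset = golden ^ 0b0100          # single-bit upset in one replica
    assert tmr_vote(golden, upset, golden) == golden  # the upset is masked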

F. Lima; C. Carmichael; J. Fabula; R. Padovani; R. Reis

2001-01-01

308

Fault Tree Analysis: An emerging methodology for instructional science  

Microsoft Academic Search

Fault Tree Analysis is a systematic approach to improving the probability of success in any system. FTA was first developed as part of the U.S. space industry and was applied to such programs as the Minuteman missile evaluations. Kent G. Stephens has successfully applied the technique to instructional and administrative programs, the latest program being the development of an
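
As a concrete illustration of the gate arithmetic behind fault tree analysis, the sketch below evaluates a tiny AND/OR tree under the usual assumption of independent basic events; the probabilities are invented.

    # AND gate: all inputs must fail; OR gate: at least one input fails.
    def p_and(*ps):
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):
        out = 1.0
        for p in ps:
            out *= 1.0 - p
        return 1.0 - out

    # Top event = (A AND B) OR C, with hypothetical basic-event probabilities.
    top = p_or(p_and(1e-3, 2e-3), 5e-5)
    print(f"top event probability ~ {top:.2e}")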

R. Kent Wood; Kent G. Stephens; Bruce O. Barker

1979-01-01

309

Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications  

NASA Technical Reports Server (NTRS)

An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when the equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
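
The sketch below illustrates the correction ('delta') form on a generic linear system Ax = b, iterating x <- x + M⁻¹(b - Ax); a Jacobi preconditioner stands in for the spatially split approximate factorization used in the paper, and the matrix and tolerance are invented.

    import numpy as np

    rng = np.random.default_rng(0)
    A = np.diag(np.full(50, 4.0)) + 0.5 * rng.standard_normal((50, 50)) / 50
    b = rng.standard_normal(50)
    M_inv = np.diag(1.0 / np.diag(A))   # stand-in approximate factorization
    x = np.zeros_like(b)
    for _ in range(200):
        dx = M_inv @ (b - A @ x)        # the "delta" or correction
        x += dx
        if np.linalg.norm(dx) < 1e-12:
            break
    print(np.linalg.norm(A @ x - b))    # small residual at convergence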

Taylor, Arthur C., III; Hou, Gene W.

1996-01-01

310

Improved finite element methodology for integrated thermal structural analysis  

NASA Technical Reports Server (NTRS)

An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.
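
The sketch below illustrates the property the abstract builds on: for one-dimensional linear steady-state conduction with fixed end temperatures, linear finite elements reproduce the exact nodal temperatures. The mesh, conductivity, and boundary values are arbitrary.

    import numpy as np

    n, L, k = 11, 1.0, 1.0
    x = np.linspace(0.0, L, n)
    K = np.zeros((n, n))
    for e in range(n - 1):                      # assemble 2-node elements
        h = x[e + 1] - x[e]
        K[e:e + 2, e:e + 2] += (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    T0, TL = 100.0, 0.0                         # Dirichlet end temperatures
    f = -K[:, 0] * T0 - K[:, -1] * TL
    T_in = np.linalg.solve(K[1:-1, 1:-1], f[1:-1])
    T = np.concatenate(([T0], T_in, [TL]))
    print(np.allclose(T, T0 + (TL - T0) * x / L))  # True: exact at the nodes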

Dechaumphai, P.; Thornton, E. A.

1982-01-01

311

Methodology of a Cladistic Analysis: How to Construct Cladograms  

NSDL National Science Digital Library

This outline details the six steps necessary for completing a cladistic analysis. The final step is to build the cladogram, following the rules that: all taxa go on the endpoints of the cladogram (never at the nodes), all cladogram nodes must have a list of synapomorphies which are common to all taxa above the nodes, and all synapomorphies appear on the cladogram only once unless the characteristic was derived separately by evolutionary parallelism. The site then explains how to test your cladogram.
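
A toy check of the construction rules listed above can make them concrete: taxa appear only at the endpoints, every node carries at least one synapomorphy, and no synapomorphy is reused. The tree encoding and the characters below are invented for illustration.

    # Cladogram as nested dicts; leaves (strings) are taxa at the endpoints.
    cladogram = {
        "syn": ["vertebrae"],
        "children": [
            "lamprey",
            {"syn": ["jaws"],
             "children": [
                 "shark",
                 {"syn": ["bony skeleton"],
                  "children": ["salmon",
                               {"syn": ["limbs"],
                                "children": ["frog", "cat"]}]},
             ]},
        ],
    }

    def check(node, seen=None):
        seen = set() if seen is None else seen
        if isinstance(node, str):       # a taxon: endpoints only, always fine
            return True
        if not node["syn"]:             # every node needs synapomorphies
            return False
        for s in node["syn"]:
            if s in seen:               # each synapomorphy appears only once
                return False
            seen.add(s)
        return all(check(c, seen) for c in node["children"])

    print(check(cladogram))  # -> True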

312

Methodological analysis of finite helical axis behavior in cervical kinematics.  

PubMed

Although a far more stable approach than six-degrees-of-freedom analysis, the finite helical axis (FHA) poses interpretational difficulties for health professionals. The analysis of the 3D motion axis has been used in clinical studies, but mostly limited to qualitative analysis. The aim of this study is to introduce a novel approach for the quantification of FHA behavior and to investigate the effect of noise and angle intervals on the estimation of FHA parameters. A simulation of body movement was performed by introducing Gaussian noise on the position and orientation of a virtual sensor, showing a linear relation between the simulated noise and the error in the corresponding parameter. FHA behavior was characterized by calculating the intersection points of the FHA with a number of planes perpendicular to the FHA using the convex hull (CH) technique. The angle between the FHA and each of the instantaneous helical axes (IHA) was also computed and its distribution analyzed. Input noise has an inversely proportional relationship with the angle steps of FHA estimation. The proposed FHA quantification approach can provide new tools to researchers and improve insight for the clinician in order to better understand joint kinematics. PMID:24916306
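
As an illustration of the convex-hull quantification described above, the sketch below intersects a set of noisy helical axes with a plane and reports the hull area of the intersection points. The geometry and noise levels are invented, and scipy's ConvexHull stands in for whatever implementation the authors used.

    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(1)
    z0 = 0.05                                    # plane z = z0 (m)
    pts = []
    for _ in range(30):
        p = rng.normal(0.0, 0.01, 3)             # a point on the axis
        d = np.array([0.0, 0.0, 1.0]) + rng.normal(0.0, 0.05, 3)
        d /= np.linalg.norm(d)                   # axis direction
        t = (z0 - p[2]) / d[2]
        pts.append((p + t * d)[:2])              # in-plane coordinates
    hull = ConvexHull(np.array(pts))
    print(f"hull area: {hull.volume:.6f} m^2")   # for 2-D hulls, .volume is area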

Cescon, Corrado; Cattrysse, Erik; Barbero, Marco

2014-10-01

313

APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip  

SciTech Connect

This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

Hamm, L.L.

1998-10-07

314

APT Blanket System Loss-of-Flow Accident (LOFA) Analysis Based on Initial Conceptual Design - Case 1: with Beam Shutdown and Active RHR  

SciTech Connect

This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report.

Hamm, L.L.

1998-10-07

315

An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.  

SciTech Connect

This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
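
As a toy illustration of probabilistic branching being integral to the object model, the sketch below attaches branch probabilities to an object's state transition and estimates a scenario likelihood by simulation; the states and probabilities are invented and far simpler than a real OBEST model.

    import random

    class Aircraft:
        def __init__(self):
            self.state = "taxiing"

        def step(self):
            if self.state == "taxiing":
                # branch probabilities live with the object's definition
                self.state = random.choices(
                    ["holding_short", "incursion"], weights=[0.999, 0.001])[0]

    runs = 100_000
    incursions = 0
    for _ in range(runs):
        a = Aircraft()
        a.step()
        incursions += a.state == "incursion"
    print(f"estimated incursion likelihood: {incursions / runs:.4f}")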

Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

2003-09-01

316

Immunoassay Methods and their Applications in Pharmaceutical Analysis: Basic Methodology and Recent Advances  

PubMed Central

Immunoassays are bioanalytical methods in which the quantitation of the analyte depends on the reaction of an antigen (analyte) and an antibody. Immunoassays have been widely used in many important areas of pharmaceutical analysis such as diagnosis of diseases, therapeutic drug monitoring, clinical pharmacokinetic and bioequivalence studies in drug discovery and pharmaceutical industries. The importance and widespread use of immunoassay methods in pharmaceutical analysis are attributed to their inherent specificity, high throughput, and high sensitivity for the analysis of a wide range of analytes in biological samples. Recently, marked improvements were achieved in the field of immunoassay development for the purposes of pharmaceutical analysis. These improvements involved the preparation of unique immunoanalytical reagents, analysis of new categories of compounds, methodology, and instrumentation. The basic methodologies and recent advances in immunoassay methods applied in different fields of pharmaceutical analysis are reviewed here. PMID:23674985

Darwish, Ibrahim A.

2006-01-01

317

Methodology for Establishment of Integrated Flood Analysis System  

NASA Astrophysics Data System (ADS)

Flood risk management efforts face considerable uncertainty in flood hazard delineation as a consequence of changing climatic conditions, including shifts in precipitation, soil moisture, and land use. These changes can confound efforts to characterize flood impacts over decadal time scales and thus raise questions about the true benefits and drawbacks of alternative flood management projects, including those of a structural and non-structural nature. Here we report an integrated flood analysis system that is designed to bring climate change information into a flood risk context and to characterize flood hazards in both rural and urban areas. A distributed rainfall-runoff model, the one-dimensional (1D) NWS-FLDWAV model, the 1D Storm Water Management Model (SWMM), and the two-dimensional (2D) BreZo model are coupled. The distributed model, using multi-directional flow allocation and real-time updating, is used for rainfall-runoff analysis in ungauged watersheds, and its outputs are taken as boundary conditions for the FLDWAV model, which is employed for 1D river hydraulic routing and for predicting the overflow discharge at overtopped levees. In addition, SWMM is chosen to analyze storm sewer flow in urban areas, and BreZo is used to estimate the inundation zones, depths, and velocities due to surcharged flow in the sewer system or overflow at levees on the land surface. Overflows from FLDWAV and surcharged flows from SWMM become point sources in BreZo. Applications in Korea and California are presented.
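
To make the coupling chain concrete, the sketch below chains toy stand-in functions in the same one-way order the abstract describes (runoff, then 1D routing, then 2D inundation); every function body and number is invented and bears no relation to the real models.

    def rainfall_runoff(rain_mm):            # distributed model -> hydrograph
        return [0.6 * r for r in rain_mm]    # toy runoff coefficient

    def fldwav(hydrograph):                  # 1D routing -> levee overflow
        return [max(0.0, q - 40.0) for q in hydrograph]

    def brezo(overflow):                     # 2D model -> peak depth (toy)
        return 0.02 * sum(overflow)

    rain = [10, 50, 120, 80, 20]             # hypothetical storm per time step
    depth = brezo(fldwav(rainfall_runoff(rain)))
    print(f"peak inundation depth (toy units): {depth:.2f}")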

Kim, B.; Sanders, B. F.; Kim, K.; Han, K.; Famiglietti, J. S.

2012-12-01

318

Retrospection of Chernobyl nuclear accident for decision analysis concerning remedial actions in Ukraine  

SciTech Connect

This report considers the efficacy of decisions concerning remedial actions when off-site radiological monitoring in the early and (or) intermediate phases was absent or uninformative. There are examples of such situations in the former Soviet Union where many people were exposed: releases of radioactive materials from 'Krasnoyarsk-26' into the Enisey River, releases of radioactive materials from 'Chelyabinsk-65' (the Kyshtym accident), nuclear tests at the Semipalatinsk Test Site, the Chernobyl nuclear accident, etc. If monitoring in the early and (or) intermediate phases is absent, decisions concerning remedial actions are usually developed on the basis of permanent monitoring. However, decisions of this kind may be essentially erroneous. For these cases it is proposed to reconstruct the radiological data of the early and intermediate phases of the nuclear accident and to base decisions concerning remedial actions on both the retrospective data and the permanent monitoring data. In this report the problem is considered using the example of the Chernobyl accident in Ukraine, where off-site radiological monitoring in the early and intermediate phases was unsatisfactory. In particular, pasture-cow-milk monitoring had not been performed. All official dose estimations had been made on the basis of measurements of ¹³⁷Cs in the body (40 measurements within 135 days and 55 measurements within 229 days after the Chernobyl accident). For the retrospection of the radiological data of the Chernobyl accident, a dynamic model has been developed, with a structure similar to those of the Pathway and Farmland models. Parameters of the model have been identified for the agricultural conditions of Russia and Ukraine. By means of this model, the dynamics of 20 radionuclides in pathways and the dynamics of doses have been estimated for the early, intermediate, and late phases of the Chernobyl accident. The main results are the following: During the first year after the Chernobyl accident, 75-93% of the committed effective dose had been formed. During the first year, 85-90% of the damage from radiation exposure had been formed; during the next 50 years (the late phase of the accident), only 10-15% of the damage will form. Remedial actions in Ukraine (agricultural remedial actions being the most effective) are intended to reduce the damage from consumption of produce contaminated in the late phase of the accident; i.e., agricultural remedial actions address only 10% of the total damage from radiation exposure. Medical countermeasures can reduce radiation exposure damage by an order of magnitude more than agricultural countermeasures. Thus, retrospection of the accident has essentially changed the indicated type of remedial actions and offers a chance to increase the effectiveness of spending by an order of magnitude. This example illustrates that, in order to optimize remedial actions, retrospective reconstruction of nuclear accident data is required in all cases where monitoring in the early and (or) intermediate phases is unsatisfactory. (author)

Georgievskiy, Vladimir [Russian Research Center 'Kurchatov Insitute', Kurchatov Sq., 1, 123182 Moscow (Russian Federation)

2007-07-01

319

Landslide risk analysis: a multi-disciplinary methodological approach  

NASA Astrophysics Data System (ADS)

This study describes an analysis carried out within the European Community project “ALARM” (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potentially vulnerable elements, by the estimation of the expected physical effects due to the occurrence of a damaging phenomenon, and by the analysis of the social and economic features of the area. Finally, a potential risk scenario was defined, in which the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A “cause-effect” correlation was applied, derived from the “dose-response” equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view, and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport, and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at 8 913 000 €; in contrast, indirect damage ranged considerably, from 2 840 000 to 9 350 000 €, depending on the selected temporal scenario and the expected closing time of the potentially affected structures. The multi-disciplinary approach discussed in this study may assist local decision makers in determining the nature and magnitude of the expected losses due to a dangerous event, which can be anticipated in a given study area during a specified time period. Moreover, advance knowledge of the prospective physical effects and economic consequences may help local decision makers to choose the best prevention and mitigation options and to decide how to allocate resources properly, so that potential benefits are maximised at an acceptable cost.

Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

2007-11-01

320

WASTE-ACC: A computer model for analysis of waste management accidents  

SciTech Connect

In support of the U.S. Department of Energy's (DOE's) Waste Management Programmatic Environmental Impact Statement, Argonne National Laboratory has developed WASTE-ACC, a computational framework and integrated PC-based database system, to assess atmospheric releases from facility accidents. WASTE-ACC facilitates the many calculations for the accident analyses necessitated by the numerous combinations of waste types, waste management process technologies, facility locations, and site consolidation strategies in the waste management alternatives across the DOE complex. WASTE-ACC is a comprehensive tool that can effectively test future DOE waste management alternatives and assumptions. The computational framework can access several relational databases to calculate atmospheric releases. The databases contain throughput volumes, waste profiles, treatment process parameters, and accident data such as frequencies of initiators, conditional probabilities of subsequent events, and source term release parameters of the various waste forms under accident stresses. This report describes the computational framework and supporting databases used to conduct accident analyses and to develop source terms to assess potential health impacts that may affect on-site workers and off-site members of the public under various DOE waste management alternatives.
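
A minimal sketch of the frequency-weighted release calculation such a framework performs; the record fields echo the quantities named above (initiator frequencies, conditional probabilities, source term release parameters), but every value is invented.

    accidents = [
        {"initiator_freq": 1e-4, "cond_prob": 0.3, "release_fraction": 1e-2},
        {"initiator_freq": 1e-6, "cond_prob": 0.8, "release_fraction": 5e-1},
    ]
    inventory = 2.0e3   # at-risk inventory, arbitrary activity units

    expected_release = sum(a["initiator_freq"] * a["cond_prob"] *
                           a["release_fraction"] * inventory
                           for a in accidents)
    print(f"frequency-weighted release: {expected_release:.3e} units/yr")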

Nabelssi, B.K.; Folga, S.; Kohout, E.J.; Mueller, C.J.; Roglans-Ribas, J.

1996-12-01

321

Analysis of 121 fatal passenger car-adult pedestrian accidents in China.  

PubMed

To study the characteristics of fatal vehicle-pedestrian accidents in China, a team was established and passenger car-pedestrian crash cases occurring between 2006 and 2011 in Beijing and Chongqing, China, were collected. A total of 121 fatal passenger car-adult pedestrian collisions were sampled and analyzed. The pedestrian injuries were scored according to the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). The demographic distributions of fatal pedestrian accidents differed from those of other pedestrian accidents. Among the victims, no significant discrepancy in the distribution of ISS and AIS in head, thorax, abdomen, and extremities by pedestrian age was found, while pedestrian behaviors prior to the crashes may affect the ISS. The distributions of AIS in head, thorax, and abdomen among the fatalities did not show any association with impact speeds or vehicle types, whereas there was a strong relationship between the ISS and impact speeds. Whether pedestrians died at the accident scene or not was not associated with the ISS or AIS. The present results may be useful not only for forensic experts but also for vehicle safety researchers. More investigations regarding fatal pedestrian accidents need to be conducted in greater detail. PMID:25287805

Zhao, Hui; Yin, Zhiyong; Yang, Guangyu; Che, Xingping; Xie, Jingru; Huang, Wei; Wang, Zhengguo

2014-10-01

322

UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY  

SciTech Connect

It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution, and several researchers are making headway on this problem. However, the inability to easily determine the magnitude of a building’s effective thermal mass, and how the heating, ventilation, and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems that utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
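
A minimal sketch of the idea, assuming a two-node resistor-capacitor building model: write the thermal dynamics as dT/dt = A T and read the energy-transfer modes and their time constants from the eigen-decomposition of A. The R and C values are invented.

    import numpy as np

    C1, C2 = 5e6, 5e7           # J/K: air node, massive-structure node
    R12, R1out = 2e-3, 1e-2     # K/W: air-to-mass, air-to-outdoor resistances
    A = np.array([[-(1/R12 + 1/R1out) / C1, (1/R12) / C1],
                  [(1/R12) / C2,            -(1/R12) / C2]])
    eigvals, eigvecs = np.linalg.eig(A)
    for lam, v in zip(eigvals, eigvecs.T):
        # one fast air-node mode and one slow structural-mass mode
        print(f"mode time constant {-1.0/lam/3600:.1f} h, shape {v}")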

John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

2013-07-01

323

Methodology for analysis and simulation of large multidisciplinary problems  

NASA Technical Reports Server (NTRS)

The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

1989-01-01

324

A faster reactor transient analysis methodology for PCs  

SciTech Connect

The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the 'quadratic dynamics equation.' This model forms the basis for the GW-BASIC program LTC (LMR Transient Calculation), which can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report.
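
For readers unfamiliar with the prompt jump approximation, the following reconstruction (ours, not the report's) shows how it reduces point kinetics to an algebraic relation, and how a feedback reactivity that is locally linear in power then yields a quadratic in the power, which is one plausible reading of the 'quadratic dynamics equation':

    % Prompt jump approximation: drop the dP/dt term of point kinetics, so
    % the power follows the delayed-neutron source algebraically:
    0 = \frac{\rho(t)-\beta}{\Lambda}\,P(t) + \sum_i \lambda_i C_i(t)
    \quad\Longrightarrow\quad
    P(t) = \frac{\Lambda \sum_i \lambda_i C_i(t)}{\beta - \rho(t)}
    % With a feedback reactivity locally linear in power,
    % \rho(t) = \rho_{\mathrm{ext}}(t) - \alpha P(t), substitution gives
    \alpha P^2 + \left(\beta - \rho_{\mathrm{ext}}\right) P
        - \Lambda \sum_i \lambda_i C_i = 0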

Ott, K.O. (Purdue Univ., Lafayette, IN (United States). School of Nuclear Engineering)

1991-10-01

325

Improved finite element methodology for integrated thermal structural analysis  

NASA Technical Reports Server (NTRS)

An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analyses is presented. New thermal finite elements which yield exact nodal and element temperature for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal-structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

Dechaumphai, P.; Thornton, E. A.

1982-01-01

326

Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report  

SciTech Connect

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models.

Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

1997-12-01

327

Environmental risk management for radiological accidents: integrating risk assessment and decision analysis for remediation at different spatial scales.  

PubMed

The consequences of the Tohoku earthquake and subsequent tsunami in March 2011 caused a loss of power at the Fukushima Daiichi nuclear power plant in Japan and led to the release of radioactive materials into the environment. Although the full extent of the contamination is not currently known, the highly complex nature of the environmental contamination (radionuclides in water, soil, and agricultural produce) typical of nuclear accidents requires a detailed geospatial analysis of information, with the ability to extrapolate across different scales, with applications to risk assessment models and decision-making support. This article briefly summarizes the approach used to inform risk-based land management and remediation decision making after the 1986 Chernobyl accident in Soviet Ukraine. PMID:21608109

Yatsalo, Boris; Sullivan, Terrence; Didenko, Vladimir; Linkov, Igor

2011-07-01

328

Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models  

SciTech Connect

The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways NUREG-1829 information can be used to estimate LOCA frequencies were investigated and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.
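
As one concrete example of turning operating experience into an initiating-event frequency, the sketch below applies the Jeffreys-prior Bayesian update commonly used in this context (Poisson counts give a Gamma posterior); the event count and exposure are invented, and this is not necessarily the exact procedure adopted for the SPAR inputs.

    from scipy.stats import gamma

    n_events, T_years = 1, 2500.0             # invented operating experience
    a, scale = n_events + 0.5, 1.0 / T_years  # Gamma posterior (Jeffreys prior)
    print(f"posterior mean: {a * scale:.2e} /yr")
    lo, hi = gamma.ppf([0.05, 0.95], a=a, scale=scale)
    print(f"90% interval:   {lo:.2e} to {hi:.2e} /yr")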

S. A. Eide; D. M. Rasmuson; C. L. Atwood

2008-09-01

329

Analysis of main steam isolation valve leakage in design basis accidents using MELCOR 1.8.6 and RADTRAD.  

SciTech Connect

Analyses were performed using MELCOR and RADTRAD to investigate main steam isolation valve (MSIV) leakage behavior under design basis loss-of-coolant accident (DBA LOCA) conditions that are presumed to have led to a significant core melt accident. Doses to the control room, site boundary, and low population zone (LPZ) are examined using both the approaches described in current regulatory guidelines and analyses based on a best-estimate source term and system response. At issue is the current practice of using containment airborne aerosol concentrations as a surrogate for the in-vessel aerosol concentration that exists in the near vicinity of the MSIVs. This study finds that the current practice of using alternative source term (AST)-based containment aerosol concentrations for assessing MSIV leakage is non-conservative and conceptually in error. A methodology is proposed that scales the containment aerosol concentration to the expected vessel concentration in order to preserve the simplified use of the AST in assessing containment performance under assumed DBA conditions. This correction is required during the first two hours of the accident, while the gap and early in-vessel source terms are present. It is general practice to assume that at approximately two hours, recovery actions to reflood the core will have been successful and that further core damage can be avoided. The analyses performed in this study determine that, after two hours, assuming vessel reflooding has taken place, the containment aerosol concentration can then conservatively be used as the effective source to the leaking MSIVs. Recommendations are provided concerning typical aerosol removal coefficients that can be used in the RADTRAD code to predict source attenuation in the steam lines, and on robust methods of predicting MSIV leakage flows based on measured MSIV leakage performance.
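
A minimal sketch of the scaling logic just described, assuming a hypothetical vessel-to-containment concentration ratio; the function, the ratio, and the reflood time are illustrative stand-ins, not values from the study.

    def msiv_source(c_containment, t_hours,
                    vessel_scale=10.0, reflood_time=2.0):
        """Effective aerosol source seen by a leaking MSIV (toy model).

        Before reflood, scale the containment concentration up toward the
        expected in-vessel concentration; afterwards the containment value
        is used directly.
        """
        return c_containment * (vessel_scale if t_hours < reflood_time else 1.0)

    for t in (0.5, 1.5, 3.0):
        print(f"t = {t} h -> source = {msiv_source(1.0, t):.1f}")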

Salay, Michael (United States Nuclear Regulatory Commission, Washington, D.C.); Kalinich, Donald A.; Gauntt, Randall O.; Radel, Tracy E.

2008-10-01

330

What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?  

PubMed

Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source, it should be possible to combine the available information sets, allowing data from one source to compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey and the second was insurance claims documents, consisting predominantly of insurance claims completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provides more reliable and detailed pre-crash information than survey variables alone. However, driving-related distraction appears to be more difficult to capture. In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such as in-depth accident investigations and pre-crash data recordings. PMID:23314359

Tivesten, Emma; Wiberg, Henrik

2013-03-01

331

The effects of aircraft certification rules on general aviation accidents  

NASA Astrophysics Data System (ADS)

The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules, but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed-methods methodology involving both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. The Chi-Square test indicated that there was no significant difference in the number of accidents among the different certification categories when either Controlled Flight into Terrain or Structural Failure was listed as the cause. However, there was a significant difference in the frequency of accidents with regard to Loss of Control and Engine Failure accidents. The results of the ANCOVA test indicated that there was no significant difference in the accident rate with regard to Loss of Control, Controlled Flight into Terrain, or Structural Failure accidents. There was, however, a significant difference in Engine Failure accidents between Experimental-Amateur Built and the other categories. The text mining analysis of the narrative causes of Loss of Control accidents indicated that only the Civil Air Regulations 3 category airplanes had clusters of words associated with visual flight into instrument meteorological conditions. Civil Air Regulations 3 airplanes were designed and manufactured prior to the 1960s and in most cases have not been retrofitted to take advantage of newer technologies that could help prevent Loss of Control accidents. The study indicated that general aviation aircraft certification rules do not have a statistically significant effect on aircraft accidents except for Loss of Control and Engine Failure. According to the literature, government oversight could have become an obstacle to the implementation of safety-enhancing equipment that could reduce Loss of Control accidents. Oversight should focus on ensuring that Experimental-Amateur Built aircraft owners perform a functional test that could prevent some of the Engine Failure accidents.
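
A minimal sketch of the kind of Chi-Square frequency comparison the study describes, using scipy; the accident counts in the contingency table are invented for illustration.

    from scipy.stats import chi2_contingency

    # Rows: certification categories; columns: accident causes.
    counts = [[120, 45],   # Part 23:  [Loss of Control, Engine Failure]
              [150, 60],   # Civil Air Regulations 3
              [30,  25],   # Light Sport Aircraft
              [90, 110]]   # Experimental-Amateur Built
    chi2, p, dof, _ = chi2_contingency(counts)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")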

Anderson, Carolina Lenz

332

Anthropological analysis of taekwondo--new methodological approach.  

PubMed

The aim of this research is to determine the order and importance of the impacts of particular anthropological characteristics and technical and tactical competence on success in taekwondo, according to the opinions of top taekwondo instructors (experts). Partial objectives include analysis of the metric characteristics of the measuring instrument and determining differences between two disciplines (sparring and the technical discipline of patterns) and two competition systems (WTF and ITF). In accordance with the aims, the research was conducted on a sample of respondents consisting of 730 taekwondo instructors from 6 continents and 69 countries (from which we selected 242 instructors), who are at different success levels in both taekwondo competition systems (styles) and the two taekwondo disciplines. The respondents were divided into 3 qualitative subsamples (OST-USP-VRH) using the dependent variable of the accomplished results of the instructor. In 6 languages, they electronically evaluated the percentage impact (%) of motor and functional skills (MOTFS), morphological characteristics (MORF), the psychological profile of an athlete (PSIH), athletic intelligence (INTE), and technical and tactical competence (TE-TA) on success in taekwondo. The analysis of the metric characteristics of the constructed instrument showed a satisfactory degree of agreement (IHr), which is proportional to the level of respondent quality, i.e. it grows along with the increase in instructor quality in all analysed disciplines of both systems. Top instructors assigned the highest portion of impact on success to the motor and functional skills (MOTFS) variable: WTF-SPB=29.1, ITF-SPB=29.2, WTF-THN=35.0, ITF-THN=32.0. Statistically significant differences in the opinions of instructors of different styles and disciplines were not recorded in any of the analysed variables. The only exception is the psychological-profile variable, which WTF instructors of sparring (AM=23.7%) evaluate, at a significance level of p<0.01, as having a statistically significantly higher impact on success in taekwondo than WTF instructors of the technical discipline of patterns (15.4%). PMID:23914483

Cular, Drazen; Munivrana, Goran; Katić, Ratko

2013-05-01

333

Methodology for object-oriented real-time systems analysis and design: Software engineering  

NASA Technical Reports Server (NTRS)

Successful application of software engineering methodologies requires an integrated analysis and design life cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification and perhaps high-level design are non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models that progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
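
To make the second approach concrete, here is a minimal sketch of such an analysis object: a concurrent entity whose time behavior is captured by a set of states plus transition rules. The valve example, its states, and its events are invented for illustration.

    class ValveController:
        # (state, event) -> next state
        TRANSITIONS = {("closed", "open_cmd"): "opening",
                       ("opening", "limit_hit"): "open",
                       ("open", "close_cmd"): "closing",
                       ("closing", "limit_hit"): "closed"}

        def __init__(self):
            self.state = "closed"

        def on_event(self, event):
            # unknown (state, event) pairs leave the state unchanged
            self.state = self.TRANSITIONS.get((self.state, event), self.state)

    v = ValveController()
    for e in ("open_cmd", "limit_hit"):
        v.on_event(e)
    print(v.state)  # -> "open"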

Schoeffler, James D.

1991-01-01

334

A Comprehensive Analysis of the X-15 Flight 3-65 Accident  

NASA Technical Reports Server (NTRS)

The November 15, 1967, loss of X-15 Flight 3-65-97 (hereafter referred to as Flight 3-65) was a unique incident in that it was the first and only aerospace flight accident involving loss of crew on a vehicle with an adaptive flight control system (AFCS). In addition, Flight 3-65 remains the only instance of a single-pilot departure from controlled flight of a manned entry vehicle in a hypersonic flight regime. To mitigate risk to emerging aerospace systems, the NASA Engineering and Safety Center (NESC) proposed a comprehensive review of this accident. The goal of the assessment was to resolve lingering questions regarding the failure modes of the aircraft systems (including the AFCS) and thoroughly analyze the interactions among the human agents and autonomous systems that contributed to the loss of the pilot and aircraft. This document contains the outcome of the accident review.

Dennehy, Cornelius J.; Orr, Jeb S.; Barshi, Immanuel; Statler, Irving C.

2014-01-01

335

The failure analysis of composite material flight helmets as an aid in aircraft accident investigation.  

PubMed

Understanding why a flying helmet fails to maintain its integrity during an accident can contribute to an understanding of the mechanism of injury and even of the accident itself. We performed a post-accident evaluation of failure modes in glass and aramid fibre-reinforced composite helmets. Optical and microscopic (SEM) techniques were employed to identify specific fracture mechanisms. They were correlated with the failure mode. Stress and energy levels were estimated from the damage extent. Damage could be resolved into distinct impact, flexure and compression components. Delamination was identified as a specific mode, dependent upon the matrix material and bonding between the layers. From the energy dissipated in specific fracture mechanisms we calculated the minimum total energy imparted to the helmet-head combination and the major injury vector (MIV) direction and magnitude. The level of protection provided by the helmet can also be estimated. PMID:1859350

Caine, Y G; Bain-Ungerson, O; Schochat, I; Marom, G

1991-06-01

336

From analysis/synthesis to conjecture/analysis: a review of Karl Popper's influence on design methodology in architecture

Microsoft Academic Search

The two principal models of design in methodological circles in architecture—analysis/synthesis and conjecture/analysis—have their roots in philosophy of science, in different conceptions of scientific method. This paper explores the philosophical origins of these models and the reasons for rejecting analysis/synthesis in favour of conjecture/analysis, the latter being derived from Karl Popper's view of scientific method. I discuss a fundamental problem

Greg Bamford

2002-01-01

337

Landscape equivalency analysis: methodology for estimating spatially explicit biodiversity credits.  

PubMed

We propose a biodiversity credit system for trading endangered species habitat designed to minimize and reverse the negative effects of habitat loss and fragmentation, the leading cause of species endangerment in the United States. Given the increasing demand for land, approaches that explicitly balance economic goals against conservation goals are required. The Endangered Species Act balances these conflicts based on the cost to replace habitat. Conservation banking is a means to manage this balance, and we argue for its use to mitigate the effects of habitat fragmentation. Mitigating the effects of land development on biodiversity requires decisions that recognize regional ecological effects resulting from local economic decisions. We propose Landscape Equivalency Analysis (LEA), a landscape-scale approach similar to habitat equivalency analysis (HEA), as an accounting system to calculate conservation banking credits so that habitat trades do not exacerbate regional ecological effects of local decisions. Credits purchased by public agencies or NGOs for purposes other than mitigating a take create a net investment in natural capital leading to habitat defragmentation. Credits calculated by LEA use metapopulation genetic theory to estimate sustainability criteria against which all trades are judged. The approach is rooted in well-accepted ecological, evolutionary, and economic theory, which helps compensate for the degree of uncertainty regarding the effects of habitat loss and fragmentation on endangered species. LEA requires application of greater scientific rigor than typically applied to endangered species management on private lands but provides an objective, conceptually sound basis for achieving the often conflicting goals of economic efficiency and long-term ecological sustainability. PMID:16132443

Bruggeman, Douglas J; Jones, Michael L; Lupi, Frank; Scribner, Kim T

2005-10-01

338

Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.  

PubMed

Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future researchers. PMID:19217180
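As a concrete illustration of the primary, whole-system continuous models contrasted above with IbMs, the sketch below evaluates a classic logistic growth law for a population N(t). The model choice and parameter values are illustrative assumptions, not taken from the paper.

    import numpy as np

    # Primary growth model sketch: logistic growth with specific rate mu
    # and carrying capacity nmax, starting from an inoculum n0.
    def logistic_growth(n0, mu, nmax, t):
        """Return log10 cell counts over time t (hours)."""
        n = nmax / (1.0 + (nmax / n0 - 1.0) * np.exp(-mu * t))
        return np.log10(n)

    t = np.linspace(0, 24, 49)                       # 24 h, half-hour steps
    curve = logistic_growth(n0=1e2, mu=0.8, nmax=1e9, t=t)
    # A secondary model would then express mu as a function of external
    # factors (temperature, pH, water activity), per the classification above.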

Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

2009-08-31

339

Launch Vehicle Fire Accident Preliminary Analysis of a Liquid-Metal Cooled Thermionic Nuclear Reactor: TOPAZ-II  

NASA Astrophysics Data System (ADS)

In this paper, a launch vehicle propellant fire accident analysis of the TOPAZ-II reactor has been performed with TATRHG(A), a thermionic reactor core analysis code developed by the author. When a rocket explodes on a launch pad, its payload (TOPAZ-II) can be subjected to a severe thermal environment from the resulting fireball. The extreme temperatures associated with propellant fires can create a destructive environment in or near the fireball. Different kinds of propellant, liquid and solid, which lead to different fire temperatures, are considered. Preliminary analysis shows that solid propellant fires can melt the whole toxic beryllium radial reflector.

Hu, G.; Zhao, S.; Ruan, K.

2012-01-01

340

A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.  

ERIC Educational Resources Information Center

A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…

Cantor, Jeffrey A.

1991-01-01

341

Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis  

PubMed Central

Background Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design A methodological research design was used, and an EFA was performed. Methods Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. PMID:24786942
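A hedged sketch of the analysis pipeline described above (principal axis factoring followed by varimax rotation), using the third-party factor_analyzer package; the file name, item layout, and loading cutoff are illustrative assumptions, not the study's choices.

    import pandas as pd
    from factor_analyzer import FactorAnalyzer  # third-party package

    # Hypothetical input: rows = trials, columns = the 45 quality items
    # scored by the reviewers.
    items = pd.read_csv("quality_items.csv")

    # Principal axis factoring with varimax rotation, 9 retained factors.
    fa = FactorAnalyzer(n_factors=9, method="principal", rotation="varimax")
    fa.fit(items)

    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    # Keep items loading meaningfully on at least one factor
    # (|loading| >= 0.40 is a common, though arbitrary, cutoff).
    retained = loadings[(loadings.abs() >= 0.40).any(axis=1)]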

Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

2014-01-01

342

Extending the Human-Controller Methodology in Systems-Theoretic Process Analysis (STPA)  

E-print Network

Thesis first-page fragment (truncated): Extending the Human-Controller Methodology in Systems-Theoretic Process Analysis (STPA), by Cameron [surname truncated]. The methodology is extended in order to derive causal factors related to human controllers within the context of the system. Thesis supervisor: Nancy G. Leveson, Professor of Aeronautics and Astronautics and Engineering Systems.

Leveson, Nancy

343

A Methodology for Empirical Analysis of Brain Connectivity through Graph Mining  

E-print Network

First-page fragment (truncated): A Methodology for Empirical Analysis of Brain Connectivity through Graph Mining, by Jiang Bian, Josh M. [surname truncated], et al., Brain Imaging Research Center, Psychiatric Research Institute, University of Arkansas for Medical Sciences. The truncated text concerns structural and functional brain connectivity networks, which have helped researchers conceive the effects of neurological

Xie, Mengjun

344

The State of Public Management Research: An Analysis of Scope and Methodology  

Microsoft Academic Search

In this article we examine the state of public management research, specifically focusing on the scope of research and variety of methodologies pursued in the field. We use a sample of manuscripts from three successive meetings of the Public Management Research Association to explore these issues. Our analysis is organized along four themes that have been central to public management's

David W. Pitts; Sergio Fernandez

2009-01-01

345

METHODOLOGY FOR LANGUAGE ANALYSIS AND GENERATION (Advanced Research in Artificial Intelligence, 132)

E-print Network

First-page fragment (truncated). ACM Classification Keywords: I.2.7 Natural Language Processing. The paper is selected from Advanced Research in Artificial Intelligence. Abstract: The best results in the application of computer science systems to automatic translation are obtained

Cardeñosa, Jesús

346

Statistical Test of the Rule Assessment Methodology by Latent Class Analysis  

Microsoft Academic Search

A problem of Siegler's (1981) rule assessment methodology is that the assignment of subjects to rules takes place by an arbitrary criterion. This problem can be solved by latent class analysis by which we can test statistically how many rules are needed to fit the data and which these rules are. Two data sets of the balance scale test are

Brenda R. J. Jansen

1997-01-01

347

Success story in software engineering using NIAM (Natural language Information Analysis Methodology)  

SciTech Connect

To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer`s own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

Eaton, S.M.; Eaton, D.S.

1995-10-01

348

RELIABILITY ANALYSIS OF THE LHC MACHINE PROTECTION SYSTEM: TERMINOLOGY AND METHODOLOGY  

Microsoft Academic Search

The LHC Machine Protection System (MPS) ensures machine safety by performing a beam dump (or inhibiting beam injection) in case of non-nominal machine conditions, thus preventing machine damage. The trade-off between machine safety and beam availability is one of the main issues related to the LHC MPS. For a global analysis of the entire MPS, a generic methodology is

S. Wagner; ETH Zurich; J. Wenninger

349

Computational Methodologies for Transcript Analysis in the Age of Next-Generation DNA Sequencing  

E-print Network

Dissertation first-page fragment (truncated): Computational Methodologies for Transcript Analysis in the Age of Next-Generation DNA Sequencing, a dissertation presented to the faculty of the Graduate School. The truncated abstract notes that the emergence of next-generation DNA sequencing has significantly reduced costs.

Gerstein, Mark

350

Formal Analysis and Automatic Generation of User Interfaces: Approach, Methodology, and an Algorithm  

Microsoft Academic Search

In this paper we propose a formal approach and methodology for analysis and generation of human-machine interfaces, with special emphasis on human-automation interaction. Our approach focuses on the information content of the interface—that is, on

Michael Heymann; Asaf Degani

2007-01-01

351

Applications of Tutoring Systems in Specialized Subject Areas: An Analysis of Skills, Methodologies, and Results.  

ERIC Educational Resources Information Center

This article reviews how tutoring systems have been applied across specialized subject areas (e.g., music, horticulture, health and safety, social interactions). It summarizes findings, provides an analysis of skills learned within each tutoring system, identifies the respective methodologies, and reports relevant findings, implications, and…

Heron, Timothy E.; Welsch, Richard G.; Goddard, Yvonne L.

2003-01-01

352

A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)  

NASA Technical Reports Server (NTRS)

Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings presented in over ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure and other high-level classifications in longitudinal studies of accident reports. Our results suggest that the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.

Johnson, C. W.; Holloway, C. M.

2007-01-01

353

Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis  

SciTech Connect

A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss.
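A minimal sketch of the life table (actuarial) method named above, computing the cumulative incidence of fetal loss over gestational-week intervals; all numbers are illustrative, not the study's data.

    # Actuarial life-table estimate of cumulative incidence.
    # For each interval i: n = pregnancies entering, d = losses observed,
    # w = withdrawals (e.g., still pregnant at end of follow-up).
    def life_table_incidence(n_enter, deaths, withdrawals):
        surv = 1.0
        for n, d, w in zip(n_enter, deaths, withdrawals):
            at_risk = n - w / 2.0      # withdrawals count as half an interval
            q = d / at_risk            # conditional probability of loss
            surv *= (1.0 - q)          # survival through this interval
        return 1.0 - surv              # cumulative incidence of loss

    # Illustrative four-interval example (weeks 5-8, 9-12, 13-16, 17-20):
    inc = life_table_incidence(
        n_enter=[120, 112, 104, 98],
        deaths=[6, 5, 3, 2],
        withdrawals=[2, 3, 3, 4])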

Goldhaber, M.K.; Staub, S.L.; Tokuhata, G.K.

1983-07-01

354

Severity detection of traffic accidents at intersections based on vehicle motion analysis and multiphase linear regression  

Microsoft Academic Search

This paper proposes a new approach to describe traffic scene including vehicle collisions and vehicle anomalies at intersections by video processing and motion statistic techniques. The research mainly targets on extracting abnormal event characteristics at intersections and learning normal traffic flow by trajectory clustering techniques. Detecting and analyzing accident events are done by observing partial vehicle trajectories and motion characteristics.

Ö. Aköz; M. E. Karsligil

2010-01-01

355

A systemic approach to accident analysis: A case study of the Stockwell shooting  

Microsoft Academic Search

This paper uses a systemic approach to accident investigation, based upon AcciMaps, to model the events leading up to the shooting of Jean Charles de Menezes at Stockwell Underground station in July 2005. The model captures many of the findings of the Independent Police Complaints Commission's report in a single representation, modelling their interdependencies and the causal flow. Furthermore, by

Daniel P. Jenkins; Paul M. Salmon; Neville A. Stanton; Guy H. Walker

2010-01-01

356

Analysis and recurrence of lightning stroke accidents for three gorges 500kV transmission lines  

Microsoft Academic Search

Lightning stroke has become the leading environmental factor affecting the safety of the 500-kV power grid in the area near the Three Gorges. Grasping the operating performance of transmission lines, analyzing and reconstructing lightning stroke accidents, and summarizing their characteristics are the bases for identifying and guiding effective lightning protection measures. In this paper, a method of recurrence for

Xiaolan Li; Jiahong Chen; Shanqiang Gu; Yaoheng Xie; Chun Zhao; Enze Lu

2010-01-01

357

Risk Analysis for Public Consumption: Media Coverage of the Ginna Nuclear Reactor Accident.  

ERIC Educational Resources Information Center

Researchers have determined that the lay public makes risk judgments in ways that are very different from those advocated by scientists. Noting that these differences have caused considerable concern among those who promote and regulate health and safety, a study examined media coverage of the accident at the Robert E. Ginna nuclear power plant…

Dunwoody, Sharon; And Others

358

Statistical analysis and diagnosis methodology for RF circuits in LCP substrates  

Microsoft Academic Search

This paper presents the application of a fast and accurate layout-level statistical analysis methodology for the diagnosis of RF circuit layouts with embedded passives in liquid crystalline polymer substrates. The approach is based on layout-segmentation, lumped-element modeling, sensitivity analysis, and extraction of probability density function using convolution methods. The statistical analyses were utilized as a diagnosis tool to estimate distributed
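A small sketch of the convolution step mentioned above: the probability density function of a response that is the sum of two independent parameter variations is the convolution of their individual PDFs. The Gaussian inputs and grid are illustrative assumptions.

    import numpy as np

    # PDFs of two independent variations on a common, symmetric grid.
    dx = 0.01
    x = np.arange(-5, 5, dx)
    pdf_a = np.exp(-x**2 / (2 * 0.3**2)) / (0.3 * np.sqrt(2 * np.pi))
    pdf_b = np.exp(-x**2 / (2 * 0.4**2)) / (0.4 * np.sqrt(2 * np.pi))

    # PDF of the sum A + B; the dx factor keeps the result normalized.
    pdf_sum = np.convolve(pdf_a, pdf_b, mode="same") * dx
    # Sanity check: for independent inputs the variances add,
    # so the result should be close to a Gaussian with sigma = 0.5.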

Souvik Mukherjee; Madhavan Swaminathan; Erdem Matoglu

2005-01-01

359

Site co-link analysis applied to small networks: a new methodological approach  

Microsoft Academic Search

The method of co-link was proposed in 1996 and since then it has been applied in many Webometric studies. Its definition refers to "page co-link analysis", as links are provided by URLs or pages. This paper presents a new methodological approach, a "site co-link analysis", to investigate relations in small networks. The Oswaldo Cruz Foundation institutes were used as a

Pamela Lang; Fábio C. Gouveia; Jacqueline Leta

2010-01-01

360

Sustainable energy futures: Methodological challenges in combining scenarios and participatory multi-criteria analysis  

Microsoft Academic Search

This paper analyses the combined use of scenario building and participatory multi-criteria analysis (PMCA) in the context of renewable energy from a methodological point of view. Scenarios have been applied increasingly in decision-making about long-term consequences by projecting different possible pathways into the future. Scenario analysis accounts for a higher degree of complexity inherent in systems than the study of

Katharina Kowalski; Sigrid Stagl; Reinhard Madlener; Ines Omann

2009-01-01

361

Overview of the facility accident analysis for the U.S. Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement  

Microsoft Academic Search

An integrated risk-based approach has been developed to address the human health risks of radiological and chemical releases from potential facility accidents in support of the U.S. Department of Energy (DOE) Environmental Restoration and Waste Management (EM) Programmatic Environmental Impact Statement (PEIS). Accordingly, the facility accident analysis has been developed to allow risk-based comparisons of EM PEIS strategies for consolidating

C. Mueller; L. Habegger; D. Huizenga

1994-01-01

362

Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129

NASA Astrophysics Data System (ADS)

Among the various radioactive nuclides emitted from the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident, Iodine-131 displayed high radioactivity just after the accident. Moreover, if taken into the human body, Iodine-131 concentrates in the thyroid and may cause thyroid cancer. Recognition of the risk posed by Iodine-131 doses originated from the experience of the Chernobyl accident, based on epidemiological study [1]. It is thus important to investigate the detailed deposition distribution of I-131 to evaluate the radiation dose due to I-131 and monitor the influence on human health. However, I-131 decays so rapidly (half-life = 8.02 d) that it cannot be detected several months after the accident. By the time the risk of I-131 was recognized in the Chernobyl case, several years had passed since the accident. Reconstruction of the I-131 distribution from the Cs-137 distribution was not successful because iodine and cesium behave differently, having different chemical properties. The long-lived radioactive isotope I-129 (half-life = 1.57E+7 yr), which like I-131 is a fission product, is an ideal proxy for I-131 because the two are chemically identical. Several studies tried to quantify I-129 in the 1990s, but the analytical techniques, especially AMS (Accelerator Mass Spectrometry), were not yet well developed and available AMS facilities were limited. Moreover, because of the lack of sufficient data on I-131 just after the accident, the isotopic ratio I-129/I-131 of the Chernobyl-derived iodine could not be estimated precisely [2]; calculated estimates of the isotopic ratio showed scattered results. For the FDNPP accident, by contrast, the detailed I-131 distribution is being successfully reconstructed through systematic I-129 measurements by our group. We measured soil samples selected from a collection taken on a 2-km mesh (or 5-km, in the distant area) around FDNPP, conducted by the Japanese Ministry of Science and Education in June 2011. So far more than 500 samples have been measured and their I-129 deposition amounts determined by AMS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. The measurement error from AMS is less than 5%, typically 3%. The overall uncertainty is estimated at less than 30%, including the uncertainty of the nominal value of the standard reference material used, that of the I-129/I-131 ratio estimation, that of the "representativeness" of the region by the analyzed sample, etc. The isotopic ratio I-129/I-131 from the reactor was estimated to be 22.3 ± 6.3 (as of March 11, 2011) [3] from a series of samples collected by a group of The University of Tokyo on April 20, 2011, for which the I-131 was determined by gamma-ray spectrometry with good precision. Complementarily, we investigated the depth profile in soil of the accident-derived I-129 and its migration speed after deposition, and found that more than 90% of the I-129 was concentrated within the top 5 cm layer and that the downward migration speed was less than 1 cm/yr [4]. From the set of I-129 data, the corresponding I-131 depositions were calculated and a distribution map is being constructed. Various fine structures of the distribution have come into view. [1] Y. Nikiforov and D. R. Gnepp, 1994, Cancer, Vol. 47, pp748-766. [2] T. Straume, et al., 1996, Health Physics, Vol. 71, pp733-740. [3] Y. Miyake, H. Matsuzaki et al., 2012, Geochem. J., Vol. 46, pp327-333. [4] M. Honda, H. Matsuzaki et al., under submission.
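A worked sketch of the proxy arithmetic described above: converting a measured I-129 deposition into the corresponding I-131 activity at shutdown, using the quoted atom ratio of 22.3 and the 8.02-day half-life. The input deposition value is illustrative, not a measured result.

    import numpy as np

    LN2 = np.log(2.0)
    lam_131 = LN2 / 8.02 / 86400.0    # I-131 decay constant [1/s]
    # I-129 decay (half-life 1.57e7 yr) is negligible on these timescales.

    atom_ratio = 22.3                 # N(I-129)/N(I-131) as of March 11, 2011

    n129 = 2.0e12                     # example I-129 deposition [atoms/m^2]
    n131_at_shutdown = n129 / atom_ratio        # I-131 atoms/m^2 on March 11
    a131 = n131_at_shutdown * lam_131           # I-131 activity [Bq/m^2]
    # a131 is the quantity mapped to reconstruct the I-131 deposition field.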

Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

2014-05-01

363

Nuclear accidents  

NSDL National Science Digital Library

Accidents at nuclear power plants can be especially devastating to people and the environment. This article, part of a series about the future of energy, introduces students to nuclear accidents at Chernobyl, Three Mile Island, and Tokaimura. Students explore the incidents by examining possible causes, environmental impacts, and effects on life.

Iowa Public Television. Explore More Project

2004-01-01

364

Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications  

NASA Technical Reports Server (NTRS)

In this study involving advanced fluid flow codes, an incremental iterative formulation (also known as the delta or correction form) together with the well-known spatially-split approximate factorization algorithm, is presented for solving the very large sparse systems of linear equations which are associated with aerodynamic sensitivity analysis. For smaller 2D problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. Iterative methods are needed for larger 2D and future 3D applications, however, because direct methods require much more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioning of the coefficient matrix; this problem can be overcome when these equations are cast in the incremental form. These and other benefits are discussed. The methodology is successfully implemented and tested in 2D using an upwind, cell-centered, finite volume formulation applied to the thin-layer Navier-Stokes equations. Results are presented for two sample airfoil problems: (1) subsonic low Reynolds number laminar flow; and (2) transonic high Reynolds number turbulent flow.
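A toy sketch of the incremental ("delta" or correction) formulation named above for a linear system A x = b: iterate on a correction dx driven by the residual, with a simple diagonal stand-in for the approximate operator. This illustrates the formulation only, not the paper's spatially split approximate-factorization solver.

    import numpy as np

    # Incremental form: solve M * dx = b - A x_k, then x_{k+1} = x_k + dx.
    # Here M is just diag(A) (a Jacobi-like stand-in), chosen for clarity.
    def incremental_solve(A, b, iters=200):
        x = np.zeros_like(b)
        M = np.diag(A)                 # approximate operator (illustrative)
        for _ in range(iters):
            r = b - A @ x              # residual of the standard form
            x = x + r / M              # correction step in delta form
        return x

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x = incremental_solve(A, b)        # converges toward A^{-1} b
    # The key point of the delta form: driving the update with the exact
    # residual lets an inexact M still converge to the exact solution.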

Taylor, Arthur C., III; Hou, Gene W.

1993-01-01

365

A new analysis methodology for the motion of self-propelled particles and its application  

NASA Astrophysics Data System (ADS)

The self-propelled particle (SPP) on the microscale in solution is a growing field of study, with potential applications in nanomedicine and nanorobots. However, little detailed quantitative analysis of the motion of SPPs has been performed so far, because their self-propelled motion is strongly coupled to Brownian motion, which makes the extraction of intrinsic propulsion mechanisms problematic and has led to inconsistent conclusions. Here, we present a novel way to decompose the motion of the SPP into self-propelled and Brownian components; accurate values for the self-propulsion speed and diffusion coefficients of the SPP are obtained for the first time. We then apply our analysis methodology to the ostensible chemotaxis of SPPs and reveal the actual (non-chemotactic) mechanism of the phenomenon, demonstrating that our analysis methodology is a powerful and reliable tool.
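One common way to perform such a decomposition (stated here as an assumption, not necessarily the authors' exact method) is to fit the two-dimensional mean-squared displacement to MSD(t) = 4Dt + v^2 t^2 at lag times short compared with the rotational diffusion time, which yields both the diffusion coefficient D and the self-propulsion speed v:

    import numpy as np

    # xy: (N, 2) trajectory sampled at interval dt.
    def msd(xy, dt, max_lag):
        lags = np.arange(1, max_lag + 1)
        vals = [np.mean(np.sum((xy[k:] - xy[:-k])**2, axis=1)) for k in lags]
        return lags * dt, np.array(vals)

    def fit_v_and_d(t, m):
        # Least squares on MSD = (4D) t + (v^2) t^2.
        design = np.column_stack([t, t**2])
        coeffs, *_ = np.linalg.lstsq(design, m, rcond=None)
        four_d, v2 = coeffs
        return np.sqrt(max(v2, 0.0)), four_d / 4.0  # (speed v, diffusion D)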

Byun, Young-Moo; Lammert, Paul; Crespi, Vincent

2011-03-01

366

An analysis of thermionic space nuclear reactor power system: I. Effect of disassembling radial reflector, following a reactivity initiated accident  

SciTech Connect

An analysis is performed to determine the effect of disassembling the radial reflector of the TOPAZ-II reactor, following a hypothetical severe Reactivity Initiated Accident (RIA). Such an RIA is assumed to occur during the system start-up in orbit due to a malfunction of the drive mechanism of the control drums, causing the drums to rotate the full 180° outward at their maximum speed of 1.4°/s. Results indicate that disassembling only three of twelve radial reflector panels would successfully shut down the reactor, with little overheating of the fuel and the moderator.

El-Genk, M.S.; Paramonov, D. (Institute for Space Nuclear Power Studies, Department of Chemical and Nuclear Engineering, The University of New Mexico, Albuquerque, New Mexico 87131 (United States))

1993-01-10

367

The Analysis of a Friendly Fire Accident using a Systems Model of Accidents* N.G. Leveson, Ph.D.; Massachusetts Institute of Technology; Cambridge, Massachusetts  

E-print Network

First-page fragment (truncated): ...assisting Kurdish refugees and providing a safe haven for their resettlement, as part of a multinational humanitarian effort to relieve the suffering of hundreds of thousands of Kurdish refugees who fled. ...After two years and hundreds of hours of extensive investigation by accident boards, autonomous

Leveson, Nancy

368

SiC MODIFICATIONS TO MELCOR FOR SEVERE ACCIDENT ANALYSIS APPLICATIONS  

SciTech Connect

The Department of Energy (DOE) Office of Nuclear Energy (NE) Light Water Reactor (LWR) Sustainability Program encompasses strategic research focused on improving reactor core economics and safety margins through the development of an advanced fuel cladding system. The Fuels Pathway within this program focuses on fuel system components outside of the fuel pellet, allowing for alteration of the existing zirconium-based clad system through coatings, addition of ceramic sleeves, or complete replacement (e.g. fully ceramic cladding). The DOE-NE Fuel Cycle Research & Development (FCRD) Advanced Fuels Campaign (AFC) is also conducting research on materials for advanced, accident tolerant fuels and cladding for application in operating LWRs. To aid in this assessment, a silicon carbide (SiC) version of the MELCOR code was developed by substituting SiC in place of Zircaloy in MELCOR's reactor core oxidation and material property routines. The purpose of this development effort is to provide a numerical capability for estimating the safety advantages of replacing Zr-alloy components in LWRs with SiC components. This modified version of the MELCOR code was applied to the Three Mile Island (TMI-2) plant accident. While the results are considered preliminary, SiC cladding showed a dramatic safety advantage over Zircaloy cladding during this accident.

Brad J. Merrill; Shannon M Bragg-Sitton

2013-09-01

369

Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II  

NASA Astrophysics Data System (ADS)

The present part of the publication (Part II) deals with the long-range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear Test Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131 and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq emitted during the explosions of units 1, 2 and 3. The estimated total source term corresponds to a core inventory of about 8 × 10^18 Bq at the time of reactor shutdown. This result suggests that at least 80% of the core inventory was released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. By neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) can be estimated at 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10% of the Chernobyl accident releases for I-131 and Cs-137.

Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

2014-03-01

370

Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report  

SciTech Connect

This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

1997-06-01

371

Evaluation of a Progressive Failure Analysis Methodology for Laminated Composite Structures  

NASA Technical Reports Server (NTRS)

A progressive failure analysis methodology has been developed for predicting the nonlinear response and failure of laminated composite structures. The progressive failure analysis uses C plate and shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms. The progressive failure analysis model is implemented into a general purpose finite element code and can predict the damage and response of laminated composite structures from initial loading to final failure.
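A minimal sketch of one of the ply-level criteria named above, the maximum strain criterion; the allowable strains and the evaluated strain state are illustrative values. In a progressive failure analysis, a ply flagged as failed would then have its stiffnesses degraded before the next load increment.

    # Maximum strain criterion for a lamina in material axes:
    # fail if any strain component exceeds its allowable.
    def max_strain_failed(eps1, eps2, gamma12, allow):
        """allow: tensile/compressive normal and shear strain allowables."""
        return (
            eps1 > allow["e1t"] or eps1 < -allow["e1c"] or
            eps2 > allow["e2t"] or eps2 < -allow["e2c"] or
            abs(gamma12) > allow["g12"]
        )

    # Illustrative allowables for a carbon/epoxy ply (not from the paper).
    allowables = {"e1t": 0.0105, "e1c": 0.0085, "e2t": 0.0050,
                  "e2c": 0.0150, "g12": 0.0140}
    failed = max_strain_failed(0.0112, 0.0011, 0.0032, allowables)  # True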

Sleight, David W.; Knight, Norman F., Jr.; Wang, John T.

1997-01-01

372

Methodological considerations for the harmonization of non-cholesterol sterol bio-analysis.  

PubMed

Non-cholesterol sterols (NCS) are used as surrogate markers of cholesterol metabolism which can be measured from a single blood sample. Cholesterol precursors are used as markers of endogenous cholesterol synthesis and plant sterols are used as markers of cholesterol absorption. However, most aspects of NCS analysis show wide variability among researchers within the area of biomedical research. This variability in methodology is a significant contributor to variation between reported NCS values and hampers the confidence in comparing NCS values across different research groups, as well as the ability to conduct meta-analyses. This paper summarizes the considerations and conclusions of a workshop where academic and industrial experts met to discuss NCS measurement. Highlighted is why each step in the analysis of NCS merits critical consideration, with the hopes of moving toward more standardized and comparable NCS analysis methodologies. Alkaline hydrolysis and liquid-liquid extraction of NCS followed by parallel detection on GC-FID and GC-MS is proposed as an ideal methodology for the bio-analysis of NCS. Furthermore the importance of cross-comparison or round robin testing between various groups who measure NCS is critical to the standardization of NCS measurement. PMID:24674990

Mackay, Dylan S; Jones, Peter J H; Myrie, Semone B; Plat, Jogchum; Lütjohann, Dieter

2014-04-15

373

Verification of the three-dimensional thermal-hydraulic models of the TRAC accident-analysis code. [PWR  

SciTech Connect

The Transient Reactor Analysis Code (TRAC) being developed at Los Alamos National Laboratory provides a best-estimate prediction of the response of light water reactors or test facilities to postulated accident sequences. One of the features of the code is the ability to analyze the vessel and its heated core in three dimensions. The code is being used to analyze the results of tests in a large-scale reflood test facility built in Japan, known as the Cylindrical Core Test Facility (CCTF). Two test runs have been analyzed that are useful for verification of the three-dimensional analysis capability of the TRAC code. One test began with an initial temperature skew across the heated core. The second test had a large radial power skew between the central and peripheral assemblies. The good agreement between the calculation and the experiment for both of these experiments demonstrates the three-dimensional analysis capability of the TRAC code.

Motley, F.

1982-01-01

374

Methodology for the characterization of water quality: Analysis of time-dependent variability  

NASA Astrophysics Data System (ADS)

The general methodology for characterization of water quality here presented was applied, after elimination of spatial effects, to the analysis of time-dependent variability of physico-chemical parameters measured, on eighteen dates, during the summer months of 1976, at 112 sampling stations on the Saint Lawrence River between Cornwall and Quebec City. Two aspects of water utilization are considered: domestic water-supply and capacity to sustain balanced aquatic life. The methodology, based on use and adaptation of classical multivariate statistical methods (correspondence analysis, hierarchical classification), leads, for a given type of water utilization, to the determination of the most important parameters, of their essential interrelations and shows the relative importance of their variations. Rationalization of network operations is thus obtained through identification of homogeneous behaviour periods as well as of critical dates for the measurement of parameters characterizing a given use.

Lachance, Marius; Bobée, Bernard

1982-11-01

375

Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis  

PubMed Central

Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
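A short sketch of one error-propagation approach of the kind compared above: first-order (Gaussian) propagation of independent uncertainties through a derived fuel property. The property chosen (heating value per wet kilogram) and all numbers are illustrative assumptions, not the paper's data.

    import numpy as np

    # First-order propagation: var(f) = sum_i (df/dx_i)^2 * var(x_i),
    # assuming the sampled quantities are independent.
    # Example property: h_wet = h_dry * (1 - m).
    h_dry, sd_h = 18.5, 0.4      # dry heating value [MJ/kg] and its std. dev.
    m, sd_m = 0.12, 0.02         # moisture fraction and its std. dev.

    dh = (1.0 - m)               # partial derivative wrt h_dry
    dm = -h_dry                  # partial derivative wrt m
    sd_hwet = np.sqrt((dh * sd_h)**2 + (dm * sd_m)**2)
    h_wet = h_dry * (1.0 - m)    # about 16.3 MJ/kg, +/- sd_hwet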

Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

2010-01-01

376

Accurate statistical process variation analysis for 0.25-?m CMOS with advanced TCAD methodology  

Microsoft Academic Search

Effects of statistical process variation on the 0.25-?m CMOS performance have been accurately characterized by using a new calibrated TCAD methodology. To conduct the variation analysis, a series of TCAD simulations was conducted on the basis of DoE (design of experiments) with optimum variable transformations, which resulted in RSF's (response surface functions) for threshold voltage (Vth) and saturation drain current
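A hedged sketch of the DoE-to-RSF step described above: fitting a quadratic response surface function for a response such as threshold voltage over two normalized process variables. The design points and measured values are placeholders, not the paper's data.

    import numpy as np

    # Quadratic RSF: y = b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2
    def quadratic_design_matrix(x1, x2):
        return np.column_stack([np.ones_like(x1), x1, x2,
                                x1 * x2, x1**2, x2**2])

    # Coded DoE levels (-1, 0, +1) for two process variables.
    x1 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0], float)
    x2 = np.array([-1, 1, -1, 1, -1, 1, 0, 0, 0], float)
    y = np.array([0.48, 0.45, 0.52, 0.50, 0.50, 0.47,
                  0.47, 0.51, 0.49])           # simulated Vth [V], placeholder

    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(x1, x2), y, rcond=None)
    # beta parameterizes the RSF; sampling (x1, x2) from their process
    # distributions then yields the statistical spread of the response.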

Hisako Sato; Hisaaki Kunitomo; Katsumi Tsuneno; Kazutaka Mori; Hiroo Masuda

1998-01-01

377

The application of a cognitive mapping and user analysis methodology to neighborhood park service area  

E-print Network

of the requirements for the degree of MASTER OF SCIENCE August 1985 Maj or Subject: Recreation Resources Development THE APPLICATION OF A COGNITIVE MAPPING ANO USER ANALYSIS METHODOLOGY TO NEIGHBORHOOD PARK SERVICE AREA DIFFERENTIATION A Thesis by LAWRENCE... were buildings with sixty-three (63) total responses, open space and natural areas with twenty-six (26) responses, and roads with fifteen (15) responses. The highly developed nature of the study area, along with the presence of churches, schools...

Mutter, Lawrence Reed

1985-01-01

378

Novel data-mining methodologies for adverse drug event discovery and analysis.  

PubMed

An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Datamining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis. PMID:22549283

Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

2012-06-01

379

A multi-scale segmentation\\/object relationship modelling methodology for landscape analysis  

Microsoft Academic Search

Natural complexity can best be explored using spatial analysis tools based on concepts of landscape as process continuums that can be partially decomposed into objects or patches. We introduce a five-step methodology based on multi-scale segmentation and object relationship modelling. Hierarchical patch dynamics (HPD) is adopted as the theoretical framework to address issues of heterogeneity, scale, connectivity and quasi-equilibriums in

C. Burnett; Thomas Blaschke

2003-01-01

380

Source term and radiological consequences of the Chernobyl accident  

SciTech Connect

The objective of this work is to assess the source term and to evaluate the maximum hypothetical individual doses in European countries (including the Soviet Union) from the Chernobyl accident through the analyses of measurements of meteorological data, radiation fields, and airborne and deposited activity in these countries. Applying this information to deduce the source term involves a reversal of the techniques of nuclear accident analysis, which estimate the off-site consequences of postulated accidents. In this study the authors predict the quantities of radionuclides that, if released at Chernobyl and following the calculated trajectories, would explain and unify the observed radiation levels and radionuclide concentrations as measured by European countries and the Soviet Union. The simulation uses the PEAR microcomputer program following the methodology described in Canadian Standards Association standard N288.2. The study was performed before the Soviets published their estimate of the source term and the two results are compared.

Mourad, R.; Snell, V.

1987-01-01

381

Methodology for the analysis of pollutant emissions from a city bus  

NASA Astrophysics Data System (ADS)

In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways of processing and representing the data were studied and, derived from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The proposed methodology allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool for isolating the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.
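A minimal sketch of the category-based analysis described above: classifying a 1 Hz velocity trace into idle, acceleration, cruise, and deceleration sequences before averaging emissions within each category. The thresholds are illustrative assumptions, not the paper's values.

    import numpy as np

    def categorize(v, a_thr=0.1, v_idle=0.5):
        """Label each 1 Hz sample of velocity v [m/s] with a category."""
        a = np.gradient(v)                       # acceleration [m/s^2]
        cats = np.full(v.shape, "cruise", dtype=object)
        cats[v < v_idle] = "idle"
        cats[(a > a_thr) & (v >= v_idle)] = "acceleration"
        cats[(a < -a_thr) & (v >= v_idle)] = "deceleration"
        return cats

    # With an emission trace (e.g., nox in g/s) time-aligned with v:
    #   for c in ("idle", "acceleration", "cruise", "deceleration"):
    #       print(c, nox[cats == c].mean())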

Armas, Octavio; Lapuerta, Magín; Mata, Carmen

2012-04-01

382

Analysis of Japanese Radionuclide Monitoring Data of Food Before and After the Fukushima Nuclear Accident  

PubMed Central

In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contaminations in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, accumulating mushrooms and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, which was primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima 137Cs and 90Sr levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, whereby meat was typically higher in 137Cs and vegetarian produce was usually higher in 90Sr. The correlation of background radiostrontium and radiocesium indicated that the regulatory assumption after the Fukushima accident of a maximum activity of 90Sr being 10% of the respective 137Cs concentrations may soon be at risk, as the 90Sr/137Cs ratio increases with time. This should be taken into account for the current Japanese food policy as the current regulation will soon underestimate the 90Sr content of Japanese foods. PMID:25621976

2015-01-01

383

Analysis of Japanese radionuclide monitoring data of food before and after the fukushima nuclear accident.  

PubMed

In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contaminations in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, accumulating mushrooms and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, which was primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima (137)Cs and (90)Sr levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, whereby meat was typically higher in (137)Cs and vegetarian produce was usually higher in (90)Sr. The correlation of background radiostrontium and radiocesium indicated that the regulatory assumption after the Fukushima accident of a maximum activity of (90)Sr being 10% of the respective (137)Cs concentrations may soon be at risk, as the (90)Sr/(137)Cs ratio increases with time. This should be taken into account for the current Japanese food policy as the current regulation will soon underestimate the (90)Sr content of Japanese foods. PMID:25621976

Merz, Stefan; Shozugawa, Katsumi; Steinhauser, Georg

2015-03-01

384

Spatial correlation analysis of isotropic microvessels: methodology and application to thyroid capillaries.  

PubMed

The study of relations between structural organization and functions of microcirculatory networks is a major aim of modern microangiology. Such a structural aspect of microvessels (MVs) as their spatial arrangement has substantial influence on their transport and other functional properties. This paper describes a methodology of spatial correlation analysis for isotropic blood and lymphatic MVs which is based on a stereological estimator of the pair correlation function [g3D(r)] created recently by the authors for systems of elongated objects. The following main features of the methodology are presented: (i) interpretation of the shape of g3D(r) curves, (ii) their quantitative description by numerical parameters, and (iii) limitations of the method arising from statistical requirements to MVs under investigation. The methodology is considered in the light of multilevel sampling designs, which are typical for biomedical morphology. The estimator with its methodological framework is applied to perifollicular blood capillaries in the adult rat thyroid. Related methods for studying the spatial arrangement of MVs are thoroughly discussed in the paper. PMID:16598655
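As background for the estimator discussed above, the sketch below computes the basic pair correlation function g(r) for an isotropic three-dimensional point pattern, ignoring edge effects. The paper's stereological estimator for elongated objects is more involved; this shows only the underlying idea.

    import numpy as np
    from scipy.spatial.distance import pdist

    def pair_correlation(points, r_edges, volume):
        """g(r) for an (N, 3) point pattern in a window of given volume."""
        n = len(points)
        counts, _ = np.histogram(pdist(points), bins=r_edges)
        r_lo, r_hi = r_edges[:-1], r_edges[1:]
        shell = 4.0 / 3.0 * np.pi * (r_hi**3 - r_lo**3)
        # Expected pair counts under complete spatial randomness
        # (pdist counts each unordered pair once, hence the 0.5 factor).
        expected = 0.5 * n * (n - 1) / volume * shell
        return counts / expected    # g(r) ~ 1 indicates randomness;
                                    # >1 clustering, <1 repulsion at range r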

Krasnoperov, Renat A; Stoyan, Dietrich

2006-05-01

385

Nuclear accidents.  

PubMed

A nuclear accident with radioactive contamination can happen anywhere in the world. Because expert nuclear emergency teams may take several hours to arrive at the scene, local authorities must have a plan of action for the hours immediately following an accident. The site should be left untouched except to remove casualties. Treatment of victims includes decontamination and meticulous wound debridement. Acute radiation syndrome may be an overwhelming sequela. PMID:7072574

Mobley, J A

1982-05-01

386

Distinguishing neglect from abuse and accident: analysis of the case files of a hospital child protection team in Israel.  

PubMed

The study compares the characteristics of children assessed as neglected, physically abused, or accident victims by a hospital child protection team (CPT) and identifies the information on which the CPT based its assessments. The comparison is based on content analysis of records of 414 children examined by the CPT in a major hospital in Israel between 1991 and 2006, of whom 130 (31.4%) were neglected, 54 (13.0%) were physically abused, and 230 (55.6%) were accident victims. Findings of three hierarchical logistic regressions show that the children classified as neglected had the most early development problems, but were the least likely to have received psychological treatment, and that their families were the most likely to be receiving state financial support and to have had prior contact with the social services. They also show that the CPT had received the least information indicative of maltreatment about these children from the community and that their medical and physical examinations aroused the least suspicion. Finally, they show that the impressions the hospital staff and CPT had of the parents during the hospital visit had greater power to distinguish between the groups than the children's characteristics or the parents' socio-demographic background. The findings attest to the ability of the CPT to differentiate between neglect victims and physical abuse or accident victims. At the same time, they point to ambiguities in the classification process that should be addressed by further research and training, and to the need for detailed and thorough documentation of the information and observations on which the CPT's assessments are based. PMID:20561078

Davidson-Arad, Bilha; Benbenishty, Rami; Chen, Wendy; Glasser, Saralee; Zur, Shmuel; Lerner-Geva, Liat

2010-11-01

387

Socio-economic Value Analysis in Geospatial and Earth Observation: A methodology review (Invited)  

NASA Astrophysics Data System (ADS)

Many industries have long since realised that applying macro-economic analysis methodologies to assess the socio-economic value of a programme is a critical step in convincing decision makers to authorise investment. The geospatial and earth observation industry, however, has been slow to embrace economic analysis. A growing number of studies published in the last few years have nevertheless applied economic principles to this domain. They have adopted a variety of approaches, including:
- Computable General Equilibrium Modelling (CGE)
- Revealed preference, stated preference (willingness-to-pay surveys)
- Partial analysis
- Simulations
- Cost-benefit analysis (with and without risk analysis)
This paper will critically review these approaches and assess their applicability to different situations and to meeting multiple objectives.

Coote, A. M.; Bernknopf, R.; Smart, A.

2013-12-01

388

Inferring Functional Neural Connectivity with Phase Synchronization Analysis: A Review of Methodology  

PubMed Central

Functional neural connectivity is drawing increasing attention in neuroscience research. To infer functional connectivity from observed neural signals, various methods have been proposed. Among them, phase synchronization analysis is an important and effective one which examines the relationship of instantaneous phase between neural signals but neglecting the influence of their amplitudes. In this paper, we review the advances in methodologies of phase synchronization analysis. In particular, we discuss the definitions of instantaneous phase, the indexes of phase synchronization and their significance test, the issues that may affect the detection of phase synchronization and the extensions of phase synchronization analysis. In practice, phase synchronization analysis may be affected by observational noise, insufficient samples of the signals, volume conduction, and reference in recording neural signals. We make comments and suggestions on these issues so as to better apply phase synchronization analysis to inferring functional connectivity from neural signals. PMID:22577470
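A small sketch of one widely used phase synchronization index, the phase-locking value (PLV), with instantaneous phase obtained from the analytic signal via the Hilbert transform. The test signals are synthetic, and the index choice is one illustration among the several the review covers.

    import numpy as np
    from scipy.signal import hilbert

    # PLV between two (ideally band-passed) signals: magnitude of the mean
    # unit phasor of the instantaneous phase difference. Amplitudes are
    # ignored, matching the definition of phase synchronization above.
    def plv(x, y):
        phase_x = np.angle(hilbert(x))
        phase_y = np.angle(hilbert(y))
        return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 2000)
    x = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)
    y = np.sin(2 * np.pi * 6 * t + 0.8) + 0.5 * rng.standard_normal(t.size)
    print(plv(x, y))   # near 1 for strongly phase-locked signals
    # Significance is commonly assessed with surrogates, e.g., circularly
    # shifting one signal and recomputing PLV to build a null distribution.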

Sun, Junfeng; Li, Zhijun; Tong, Shanbao

2012-01-01

389

SAS4A analysis of unprotected loss of flow accidents in a metal fuel reactor  

SciTech Connect

This paper discusses the SAS4A code system, which is used to analyze core behavior under beyond-design-basis transient conditions for various Advanced Liquid Metal Reactor (ALMR) designs. The results of these analyses help in assessing the outcomes of various accident sequences, and provide guidance for future experimental needs and mathematical model development. This paper describes the thermal-hydraulic and neutronic events that occur in a low void worth metal fuel core [2] during a very rapid unprotected Loss of Flow (LOF) accident, with a flow decay half time t1/2 = 0.3 s. This LOF was selected because it leads to fuel pin failure and subsequent fuel relocation. The only mechanistic initiator that can lead to such a rapid LOF is, possibly, a severe earthquake. For slower LOFs, pin failure and fuel relocation do not occur, as negative reactivity from other core feedback effects has enough time to counteract the positive reactivity introduced by the early sodium boiling.

Tentner, A.M.

1992-12-01

391

LeRoy Meisinger, Part II: Analysis of the Scientific Ballooning Accident of 2 June 1924.  

NASA Astrophysics Data System (ADS)

During the spring of 1924, U.S. Weather Bureau meteorologist LeRoy Meisinger conducted a series of experiments with a free balloon to determine the trajectories of air around extratropical cyclones. The 10th flight in the series ended with a crash of the balloon over central Illinois. Both Meisinger and the pilot, Army Air Services Lt. James Neely, were killed. An effort has been made to reconstruct this accident using information from a review article by early twentieth-century meteorologist Vincent Jakl and newspaper accounts of the accident. The principal results of the study follow.
1) Meisinger's balloon was caught in the downdraft of a newly developed thunderstorm over the Bement, Illinois, area on the evening of 2 June;
2) a hard landing took place in a cornfield just north of Bement, and loss of ballast at the hard-landing site was sufficient to cause the balloon to rise again; and
3) after rebounding from the ground, the balloon with the two aeronauts aboard was struck by lightning. A fire resulted that burned through the netting and led to a crash four miles northeast of the hard-landing site.

Lewis, John M.; Moore, Charles B.

1995-02-01

392

Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application  

PubMed Central

Background: Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context.
Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application.
Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level.
Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level.
Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
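
The time-stratified design singled out here selects, as referents for each event day, the other days in the same month and year that fall on the same weekday. A minimal sketch of that referent selection (pure Python, not code from any reviewed study):

    from datetime import date, timedelta

    def time_stratified_referents(event_day):
        """All same-weekday days in the event day's year-month, excluding it."""
        d = date(event_day.year, event_day.month, 1)
        referents = []
        while d.month == event_day.month:
            if d.weekday() == event_day.weekday() and d != event_day:
                referents.append(d)
            d += timedelta(days=1)
        return referents

    print(time_stratified_referents(date(2010, 3, 17)))
    # -> March 3, 10, 24, and 31, 2010 (all Wednesdays)

Exposure on the event day is then contrasted with exposure on the referent days, typically in a conditional logistic regression.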

Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

2010-01-01

393

Soil moisture retrieval from multi-instrument observations: Information content analysis and retrieval methodology  

NASA Astrophysics Data System (ADS)

An algorithm has been developed that employs neural network technology to retrieve soil moisture from multi-wavelength satellite observations (active/passive microwave, infrared, and visible). This represents the first step in the development of a methodology aiming to combine beneficial aspects of existing retrieval schemes. Several quality metrics have been developed to assess the performance of a retrieval product on different spatial and temporal scales. Additionally, an innovative approach to estimate the retrieval uncertainty has been proposed. An information content analysis of different satellite observations showed that active microwave observations are best suited to capture the soil moisture temporal variability, while the amplitude of the surface temperature diurnal cycle is best suited to capture the spatial variability. In a synergy analysis, it was found that combining all observations could reduce the retrieval uncertainty by 13%. Furthermore, it was found that synergy benefits are significantly larger using a data fusion approach compared to an a posteriori combination of retrieval products, supporting the combination of different retrieval methodology aspects in a single algorithm. In a comparison with model data, it was found that the proposed methodology also shows potential to be used for the evaluation of modeled soil moisture. A comparison with in situ observations showed that the algorithm captures soil moisture spatial variability well. It was concluded that the temporal performance can be improved through incorporation of other existing retrieval approaches.
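
The retrieval step described — a neural network mapping multi-wavelength observations to soil moisture — can be sketched as below. The features, training data, and network size are placeholders, not the authors' configuration:

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Placeholder features: active backscatter, brightness temperature,
    # surface-temperature diurnal amplitude, visible albedo (all synthetic).
    X = rng.normal(size=(1000, 4))
    sm = 0.25 + 0.05 * X[:, 0] - 0.03 * X[:, 2] + 0.01 * rng.normal(size=1000)

    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                       random_state=0))
    model.fit(X, sm)
    print("Predicted soil moisture:", model.predict(X[:3]))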

Kolassa, J.; Aires, F.; Polcher, J.; Prigent, C.; Jimenez, C.; Pereira, J. M.

2013-05-01

394

Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2  

SciTech Connect

This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the Health Effects Models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, x-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
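
The recommended Weibull dose-response for early effects takes the form risk = 1 - exp(-ln 2 * (D/D50)^V), where D50 is the dose at which half the exposed population experiences the effect and V sets the steepness. A minimal sketch with illustrative parameters (not the values tabulated in NUREG/CR-4214):

    import math

    def weibull_risk(dose_gy, d50, shape):
        """Weibull dose-response: risk 0.5 at dose == d50; 'shape' sets steepness."""
        if dose_gy <= 0:
            return 0.0
        return 1.0 - math.exp(-math.log(2) * (dose_gy / d50) ** shape)

    # Illustrative placeholder parameters for a hematopoietic endpoint:
    for dose in (1.0, 3.0, 4.5, 6.0):
        print(dose, "Gy ->", round(weibull_risk(dose, d50=4.5, shape=6.0), 3))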

Evans, J.S. [Harvard School of Public Health, Boston, MA (United States); Abrahmson, S. [Wisconsin Univ., Madison, WI (United States); Bender, M.A. [Brookhaven National Lab., Upton, NY (United States); Boecker, B.B.; Scott, B.R. [Inhalation Toxicology Research Inst., Albuquerque, NM (United States); Gilbert, E.S. [Battelle Pacific Northwest Lab., Richland, WA (United States)

1993-10-01

395

The Nuclear Organization and Management Analysis Concept methodology: Four years later  

SciTech Connect

The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPPs) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

1992-08-01

397

Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.  

PubMed

The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) in a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied to the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition). PMID:24760596
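
The source quantification with uncertainty analysis proposed here can be sketched as Monte Carlo propagation of uncertain emission factors and source extents. The sources, units, and ranges below are invented for illustration and are not the catchment values of the study:

    import numpy as np

    rng = np.random.default_rng(42)
    N = 10_000  # Monte Carlo draws

    # Hypothetical annual Zn loads per source: emission factor x source extent,
    # with uniform ranges standing in for the study's uncertainty distributions.
    roofs   = rng.uniform(0.5, 2.0, N) * rng.uniform(30, 50, N)
    traffic = rng.uniform(0.1, 0.6, N) * rng.uniform(40, 80, N)
    deposit = rng.uniform(0.05, 0.2, N) * rng.uniform(150, 250, N)

    share = roofs / (roofs + traffic + deposit)
    print("Roof share of Zn load: median %.0f%%, 90%% interval [%.0f%%, %.0f%%]"
          % (100 * np.median(share),
             100 * np.percentile(share, 5),
             100 * np.percentile(share, 95)))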

Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

2014-09-01

398

Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology  

NASA Technical Reports Server (NTRS)

A review is given of future power processing systems planned for the next 20 years and of the state of the art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

1974-01-01

399

Methodology for CFD Design Analysis of National Launch System Nozzle Manifold  

NASA Technical Reports Server (NTRS)

The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

Haire, Scot L.

1993-01-01

400

Analysis of risk reduction methods for interfacing system LOCAs (loss-of-coolant accidents) at PWRs  

SciTech Connect

The Reactor Safety Study (WASH-1400) predicted that Interfacing System Loss-of-Coolant Accident (ISL) events were significant contributors to risk even though they were calculated to be relatively low-frequency events. However, there are substantial uncertainties involved in determining the probability and consequences of ISL sequences. For example, the assumed valve failure modes, common cause contributions, and the location of the break/leak are all uncertain and can significantly influence the predicted risk from ISL events. In order to provide more realistic estimates of the core damage frequencies (CDFs) and a reduction in the magnitude of the uncertainties, a reexamination of ISL scenarios at PWRs has been performed by Brookhaven National Laboratory. The objective of this study was to investigate the vulnerability of pressurized water reactor designs to ISLs and identify any improvements that could significantly reduce the frequency/risk of these events.

Bozoki, G.; Kohut, P.; Fitzpatrick, R.

1988-01-01

401

Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices  

SciTech Connect

This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of the European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

1997-06-01

402

Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G  

SciTech Connect

Development of the two new probabilistic accident consequence codes MACCS and COSYMA was completed in 1990; the codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
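
The Gaussian plume model (GPM) mentioned here has a standard closed form through which sampled dispersion and deposition parameters can be propagated: C = Q / (2*pi*u*sigma_y*sigma_z) * exp(-y^2 / 2sigma_y^2) * [exp(-(z-H)^2 / 2sigma_z^2) + exp(-(z+H)^2 / 2sigma_z^2)]. A minimal sketch; the power-law dispersion lengths are generic placeholders, not the elicited distributions:

    import math

    def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
        """Steady-state plume concentration with ground reflection.
        q: source (g/s), u: wind speed (m/s), h: effective release height (m)."""
        lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
        vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                    + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
        return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    x = 1000.0                                  # m downwind
    sy, sz = 0.08 * x ** 0.9, 0.06 * x ** 0.85  # placeholder dispersion lengths
    print(gaussian_plume(q=1.0, u=5.0, y=0.0, z=0.0, h=50.0,
                         sigma_y=sy, sigma_z=sz), "g/m3")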

Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others

1995-01-01

403

An Analysis Methodology for the Gamma-ray Large Area Space Telescope  

NASA Technical Reports Server (NTRS)

The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, alternative to the standard event analysis inherited from high energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector, and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.

Morris, Robin D.; Cohen-Tanugi, Johann

2004-01-01

404

Combustor design and analysis using the Rocket Combustor Interactive Design (ROCCID) methodology  

NASA Technical Reports Server (NTRS)

The ROCket Combustor Interactive Design (ROCCID) Methodology is a newly developed, interactive computer code for the design and analysis of a liquid propellant rocket combustion chamber. The application of ROCCID to design a liquid rocket combustion chamber is illustrated. Designs for a 50,000 lbf thrust and 1250 psi chamber pressure combustor using liquid oxygen (LOX)RP-1 propellants are developed and evaluated. Tradeoffs between key design parameters affecting combustor performance and stability are examined. Predicted performance and combustion stability margin for these designs are provided as a function of the combustor operating mixture ratio and chamber pressure.

Klem, Mark D.; Pieper, Jerry L.; Walker, Richard E.

1990-01-01

405

Uncertainty analysis for a PWR loss-of-coolant accident. I. Blowdown phase employing the RELAP4\\/MOD6 computer code  

Microsoft Academic Search

The feasibility of performing an uncertainty analysis of a reactor accident by using a large computer code and a comparatively small number of calculations is demonstrated. With fewer than 200 blowdown runs, 21 variables are investigated for their impact on peak clad temperature (PCT). Seven of the 21 input variables dominate in predicting PCT and, of these, the 3
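
The pattern described — a few hundred code runs over many uncertain inputs, screened for their influence on PCT — is the classic Monte Carlo sensitivity setup. A sketch in which a hypothetical algebraic surrogate stands in for RELAP4/MOD6:

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n_runs, n_vars = 200, 21              # run budget comparable to the study
    X = rng.uniform(size=(n_runs, n_vars))

    # Hypothetical PCT surrogate: a few inputs dominate, the rest are noise.
    pct = (900 + 300 * X[:, 0] + 150 * X[:, 3] ** 2 + 80 * X[:, 7]
           + 20 * rng.normal(size=n_runs))

    # Rank correlation of each input with the response flags dominant variables.
    rho = [abs(spearmanr(X[:, j], pct)[0]) for j in range(n_vars)]
    print("Most influential inputs:",
          sorted(range(n_vars), key=lambda j: -rho[j])[:5])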

G. P. Steck; M. Berman; R. K. Byers

1980-01-01

406

The use of current risk analysis tools evaluated towards preventing external domino accidents  

Microsoft Academic Search

Risk analysis is an essential tool for company safety policy. Risk analysis consists of identifying and evaluating all possible risks. The efficiency of risk analysis tools depends on the rigour with which all possible risks are identified and evaluated. The diversity in risk analysis procedures is such that there are many appropriate techniques for any circumstance and the choice has become more

G. L. L. Reniers; W. Dullaert; B. J. M. Ale

2005-01-01

407

Methodology issues concerning the accuracy of kinematic data collection and analysis using the ariel performance analysis system  

NASA Technical Reports Server (NTRS)

Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body kinematics as well as mechanical system kinematics using the Ariel Performance Analysis System (APAS). The APAS supports both analysis of analog signals (e.g. force plate data collection) as well as digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to evaluate the accuracy impact due to a single axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and yielded errors within 8 percent through two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and exhibited errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations and their impacts on the methodology issues of kinematic data collection and analyses are presented in detail. The accuracy levels observed in these evaluations are also presented.
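
The constant-angular-velocity checks described here can be reproduced in outline: differentiate digitized angle data and compare the peak angular velocity with the known rig velocity. The sampled data below are synthetic stand-ins for APAS output, and the sampling rate is an assumption:

    import numpy as np

    fs = 60.0                     # assumed video sampling rate (Hz)
    true_omega = 90.0             # deg/s, known constant velocity of the rig
    t = np.arange(0.0, 2.0, 1.0 / fs)
    theta = true_omega * t + 0.05 * np.random.randn(t.size)  # digitization noise

    omega = np.gradient(theta, 1.0 / fs)  # finite-difference angular velocity
    err = 100.0 * abs(omega.max() - true_omega) / true_omega
    print("Peak angular velocity error: %.1f%%" % err)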

Wilmington, R. P.; Klute, Glenn K. (editor); Carroll, Amy E. (editor); Stuart, Mark A. (editor); Poliner, Jeff (editor); Rajulu, Sudhakar (editor); Stanush, Julie (editor)

1992-01-01

408

Episode analysis of deposition of radiocesium from the Fukushima Daiichi nuclear power plant accident.  

PubMed

Chemical transport models played key roles in understanding the atmospheric behaviors and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. However, model results could not be sufficiently evaluated because of limited observational data. We assess the model performance to simulate the deposition patterns of radiocesium ((137)Cs) by making use of airborne monitoring survey data for the first time. We conducted ten sensitivity simulations to evaluate the atmospheric model uncertainties associated with key model settings including emission data and wet deposition modules. We found that simulation using emissions estimated with a regional-scale (~500 km) model better reproduced the observed (137)Cs deposition pattern in eastern Japan than simulation using emissions estimated with local-scale (~50 km) or global-scale models. In addition, simulation using a process-based wet deposition module reproduced the observations well, whereas simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed (137)Cs deposition rates in high-deposition areas (≥10 kBq m(-2)) within 1 order of magnitude and showed that deposition of radiocesium over land occurred predominantly during 15-16, 20-23, and 30-31 March 2011. PMID:23391028
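
The scavenging-coefficient treatment whose uncertainty is flagged here is conventionally written lambda = a * R**b (R the rain intensity in mm/h), with below-cloud depletion dC/dt = -lambda * C. A sketch with generic placeholder coefficients, not the study's settings:

    import math

    def scavenging_coefficient(rain_mm_h, a=5e-5, b=0.8):
        """Empirical wet scavenging coefficient (1/s); a and b vary widely in
        the literature and are placeholders here."""
        return a * rain_mm_h ** b

    def fraction_remaining(rain_mm_h, hours):
        """Airborne fraction left after wet scavenging during steady rain."""
        lam = scavenging_coefficient(rain_mm_h)
        return math.exp(-lam * hours * 3600.0)

    print(fraction_remaining(rain_mm_h=2.0, hours=3.0))  # airborne 137Cs left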

Morino, Yu; Ohara, Toshimasa; Watanabe, Mirai; Hayashi, Seiji; Nishizawa, Masato

2013-03-01

409

Health effects models for nuclear power plant accident consequence analysis: Low LET radiation  

SciTech Connect

This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". The category "other" cancers is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
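
The linear-quadratic form recommended for cancers is, in outline, an excess risk ERR(D) = alpha*D + beta*D^2, with the linear term often reduced by a dose-rate factor for protracted exposures. The coefficients below are arbitrary illustration values, not the report's parameters:

    def excess_risk_lq(dose_gy, alpha=2e-2, beta=1e-2, ddref=1.0):
        """Linear-quadratic excess risk; ddref > 1 scales down the linear term
        for low dose rates. alpha and beta are placeholders, not report values."""
        return (alpha / ddref) * dose_gy + beta * dose_gy ** 2

    for d in (0.1, 0.5, 1.0, 2.0):
        print(d, "Gy ->", round(excess_risk_lq(d), 4))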

Evans, J.S. (Harvard Univ., Boston, MA (USA). School of Public Health)

1990-01-01

410

Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor  

NASA Astrophysics Data System (ADS)

The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be a HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slow, long transients (on time scales of hours and days) and fast, short transients (on time scales of minutes and seconds). Limited operational and experimental data are available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. The feedback due to the influence of leakage was taken into account through the development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed, creating a system that is efficient and stable over the duration of transient calculations that last several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Different aspects of the coupled methodology were investigated and an efficient kinetics treatment for the PBMR was developed, which accounts for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, which resulted in the implementation of an improved mapping methodology based on user-defined criteria. The second aspect studied and optimized was the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time-step selection algorithms. Coupled-code convergence was achieved, supplemented by the application of methods to accelerate it. Finally, the modeling of all feedback phenomena in PBMRs was investigated and a novel treatment of cross-section dependencies was introduced to improve the representation of cross-section variations. An added benefit was that, in the process of studying and improving the coupled multi-physics methodology, more insight was gained into the physics and dynamics of the PBMR, which will also help to optimize the PBMR design and improve its safety.
One unique contribution of the PhD research is the investigation of the importance of the correct representation of three-dimensional (3-D) effects in PBMR analysis. The performed studies demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.
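
Schematically, the neutronics/thermal-hydraulics coupling at the heart of this work is a fixed-point (Picard) iteration: solve for power with frozen temperatures, update temperatures from that power, and repeat until both fields stop changing. A toy sketch in which two algebraic stand-ins replace NEM and THERMIX:

    def solve_neutronics(fuel_temp_k):
        """Stand-in: power falls with fuel temperature (Doppler-like feedback)."""
        return 400.0 * (1.0 - 2.0e-4 * (fuel_temp_k - 900.0))

    def solve_thermal_hydraulics(power_mw):
        """Stand-in: fuel temperature rises with power."""
        return 500.0 + 1.0 * power_mw

    power, temp = 300.0, 1100.0            # initial guesses
    for sweep in range(100):               # Picard (fixed-point) iteration
        new_power = solve_neutronics(temp)
        new_temp = solve_thermal_hydraulics(new_power)
        if abs(new_power - power) < 1e-6 and abs(new_temp - temp) < 1e-6:
            break
        power, temp = new_power, new_temp  # under-relaxation could be added here
    print("Converged in %d sweeps: %.2f MW, fuel T %.1f K" % (sweep + 1, power, temp))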

Mkhabela, Peter Tshepo

411

Cardiovascular risk analysis by means of pulse morphology and clustering methodologies.  

PubMed

The purpose of this study was the development of a clustering methodology to deal with arterial pressure waveform (APW) parameters to be used in cardiovascular risk assessment. One hundred sixteen subjects were monitored and divided into two groups. The first one (23 hypertensive subjects) was analyzed using APW and biochemical parameters, while the remaining 93 healthy subjects were evaluated through APW parameters only. The expectation maximization (EM) and k-means algorithms were used in the cluster analysis, and the risk scores commonly used in clinical practice (the Framingham Risk Score (FRS), the Systematic COronary Risk Evaluation (SCORE) project, the Assessing cardiovascular risk using Scottish Intercollegiate Guidelines Network (ASSIGN), and the PROspective Cardiovascular Münster (PROCAM)) were selected for the cluster risk validation. The results from the clustering risk analysis showed a very significant correlation with ASSIGN (r=0.582, p<0.01) and a significant correlation with FRS (r=0.458, p<0.05). The results from the comparison of both groups also allowed identification of the cluster with higher cardiovascular risk in the healthy group. These results give new insights for exploring this methodology in future scoring trials. PMID:25023535
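
A minimal sketch of the clustering-and-validation pattern described, with synthetic stand-ins for the APW parameters and one risk score (scikit-learn's k-means in place of the paper's EM/k-means pair):

    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(7)
    # Synthetic APW-like features for 116 subjects (placeholders, not real data).
    apw = np.vstack([rng.normal(0.0, 1.0, (60, 4)),
                     rng.normal(1.5, 1.0, (56, 4))])
    risk = 2.0 * apw[:, 0] + rng.normal(0.0, 1.0, 116)  # stand-in risk score

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(apw)
    r, p = pearsonr(labels, risk)
    print("cluster/risk-score correlation: r=%.3f, p=%.4f" % (r, p))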

Almeida, Vânia G; Borba, J; Pereira, H Catarina; Pereira, Tânia; Correia, Carlos; Pêgo, Mariano; Cardoso, João

2014-11-01

412

Causality analysis in business performance measurement system using system dynamics methodology  

NASA Astrophysics Data System (ADS)

One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality assumption have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, namely the Granger causality test, on 45 data points. However, the insufficiency of well-established causality models was evident, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts over three rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Evidence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality, as well as its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using SD methodology, where very limited work has been done.
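
The econometric step reported — pairwise Granger causality testing between BSC measures — can be run directly with statsmodels. The two series below are synthetic stand-ins (e.g., a leading employee measure and a lagging customer measure), not the study's data:

    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(3)
    n = 45                                       # sample size of the study's order
    x = rng.normal(size=n + 1)                   # hypothetical leading measure
    y = 0.8 * x[:-1] + 0.3 * rng.normal(size=n)  # depends on lagged x

    # Column 1 is the candidate effect, column 2 the candidate cause.
    data = np.column_stack([y, x[1:]])
    res = grangercausalitytests(data, maxlag=2)
    print("lag-1 ssr F-test p-value:", res[1][0]["ssr_ftest"][1])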

Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

2014-07-01

413

Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008  

SciTech Connect

The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. Additionally, PNNL supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model, built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized, including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS and as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.
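
The flavor of an accounting-model benefit estimate can be suggested in a few lines: units affected times baseline use times fractional savings, with a multiplicative adjustment for the lighting-HVAC interaction mentioned above. Every number below is invented and is not a BEAMS input:

    def annual_energy_savings(units, baseline_kwh, savings_fraction,
                              hvac_interaction=1.0):
        """Accounting-style estimate; hvac_interaction > 1 credits reduced
        cooling load, < 1 debits increased heating load (illustrative only)."""
        return units * baseline_kwh * savings_fraction * hvac_interaction

    # Hypothetical lighting project: 100,000 homes, 1,800 kWh/yr baseline,
    # 40% savings, 5% net interactive-effects credit.
    print(annual_energy_savings(100_000, 1800, 0.40, 1.05), "kWh/yr")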

Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.

2008-09-30

414

Nonlinear Structural Analysis Methodology and Dynamics Scaling of Inflatable Parabolic Reflector Antenna Concepts  

NASA Technical Reports Server (NTRS)

Ultra-lightweight and ultra-thin membrane inflatable antenna concepts are fast evolving to become the state-of-the-art antenna concepts for deep-space applications. NASA Langley Research Center has been involved in structural dynamics research on antenna structures. One of the goals of the research is to develop structural analysis methodology for prediction of the static and dynamic response characteristics of inflatable antenna concepts. This research is focused on computational studies using nonlinear large-deformation finite element analysis to characterize the ultra-thin membrane responses of the antennas. Recently, structural analyses have been performed on a few parabolic reflector antennas of varying size and shape, referred to in the paper as the 0.3-meter subscale, 2-meter half-scale, and 4-meter full-scale antennas. The aspects studied included nonlinear analysis methodology and solution techniques, ways to speed convergence in iterative methods, the sensitivities of responses with respect to structural loads, such as inflation pressure, gravity, and pretension loads in ground and in-space conditions, and the ultra-thin membrane wrinkling characteristics. Several such intrinsic aspects studied have provided valuable insight into the evaluation of structural characteristics of such antennas. While analyzing these structural characteristics, a quick study was also made to assess the applicability of dynamic scaling of the half-scale antenna. This paper presents the details of the nonlinear structural analysis results and discusses the insight gained from the studies on the various intrinsic aspects of the analysis methodology. The predicted reflector surface characteristics of the three inflatable ultra-thin membrane parabolic reflector antenna concepts are presented as easily observable displacement fringe patterns with associated maximum values, and normal mode shapes and associated frequencies. Wrinkling patterns are presented to show how surface wrinkles progress with increasing tension loads. Antenna reflector surface accuracies were found to be very much dependent on the type and size of the antenna, the reflector surface curvature, the reflector membrane supports in terms of the spacing of catenaries, and the amount of applied load.

Sreekantamurthy, Tham; Gaspar, James L.; Mann, Troy; Behun, Vaughn; Pearson, James C., Jr.; Scarborough, Stephen

2007-01-01

415

Systems Approaches to Animal Disease Surveillance and Resource Allocation: Methodological Frameworks for Behavioral Analysis  

PubMed Central

While demands for animal disease surveillance systems are growing, there has been little applied research that has examined the interactions between resource allocation, cost-effectiveness, and behavioral considerations of actors throughout the livestock supply chain in a surveillance system context. These interactions are important as feedbacks between surveillance decisions and disease evolution may be modulated by their contextual drivers, influencing the cost-effectiveness of a given surveillance system. This paper identifies a number of key behavioral aspects involved in animal health surveillance systems and reviews some novel methodologies for their analysis. A generic framework for analysis is discussed, with exemplar results provided to demonstrate the utility of such an approach in guiding better disease control and surveillance decisions. PMID:24348922

Rich, Karl M.; Denwood, Matthew J.; Stott, Alistair W.; Mellor, Dominic J.; Reid, Stuart W. J.; Gunn, George J.

2013-01-01

416

Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans  

PubMed Central

Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

2011-01-01

417

Assessment of the nursing skill mix in Mozambique using a task analysis methodology  

PubMed Central

Background The density of the nursing and maternal child health nursing workforce in Mozambique (0.32/1000) is well below the WHO minimum standard of 1 nurse per 1000. Two levels of education were being offered for both nurses and maternal child health nurses, in programmes ranging from 18 to 30 months in length. The health care workforce in Mozambique also includes Medical Technicians and Medical Agents, who are also educated at either basic or mid-level. The Ministry of Health determined the need to document the tasks that each of the six cadres was performing within various health facilities to identify gaps, and duplications, in order to identify strategies for streamlining workforce production, while retaining highest educational and competency standards. The methodology of task analysis (TA) was used to achieve this objective. This article provides information about the TA methodology, and selected outcomes of the very broad study. Methods A cross-sectional descriptive task analysis survey was conducted over a 15 month period (2008–2009). A stratified sample of 1295 individuals was recruited from every type of health facility in all of Mozambique’s 10 provinces and in Maputo City. Respondents indicated how frequently they performed any of 233 patient care tasks. Data analysis focused on identifying areas where identical tasks were performed by the various cadres. Analyses addressed frequency of performance, grouped by level of educational preparation, within various types of health facilities. Results Task sharing ranged from 74% to 88% between basic and general nurse cadres and from 54% to 88% between maternal and child health nurse cadres, within various health facility types. Conversely, there was distinction between scope of practice for nursing and maternal/child health nursing cadres. Conclusion The educational pathways to general nursing and maternal/child health nursing careers were consolidated into one 24 month programme for each career. The scopes of practice were affirmed based on task analysis survey data. PMID:24460789

2014-01-01

418

Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks  

NASA Technical Reports Server (NTRS)

Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.

Brown, Richard Lee

2008-01-01

419

Aerosol Monitoring and Early Detection of Pre-Accident and Accident Situations  

Microsoft Academic Search

The possibility of using the instrumentation methods and methodological approaches developed for clean-room technology in systems for early detection of accidents and for diagnostics of pre-accident situations, by monitoring the aerosol composition of room air, is analyzed. The results of experimental investigations of the dynamics of the variation of the number density and dispersion composition of aerosol particles, produced during

P. A. Aleksandrov; V. I. Kalechits; O. Yu. Maslakov

2000-01-01

420

Analysis of a 4-inch small-break loss-of-coolant accident in a Westinghouse Pressurized Water Reactor using TRAC-PF1/MOD1  

E-print Network

[The indexed excerpt for this record consists of nomenclature-list fragments from the thesis front matter (e.g., Reactor Coolant Pump, Reactor Coolant System, Reference Safety Analysis Report, Residual Heat Removal, Safety Injection, Small-Break Loss-of-Coolant Accident, Safety Relief Valve, South Texas Project, Transient Reactor Analysis Code, Turbine Stop Valve) rather than an abstract.]

Knippel, Kimberley I.R.

1988-01-01

421

An analysis of accident experience at entrance ramps within construction work zones at long-term freeway reconstruction projects in Texas  

E-print Network

[The indexed excerpt for this record consists of thesis title-page and committee-approval front matter (Texas A&M University; committee chaired by Conrad L. Dudek) rather than an abstract.]

Casteel, David Bryan

1991-01-01

422

A Methodology for the Analysis and Selection of Alternatives for the Disposition of Surplus Plutonium  

SciTech Connect

The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports [DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively]. At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium disposition that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling the results of detailed technical, economic, schedule, environmental, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences [1]. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity analysis. These steps are described below.
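
In their simplest additive form, steps (3) and (4) reduce to scoring each alternative as a weighted sum of single-attribute values and then sweeping the weights for sensitivity. The alternatives, measures, and weights below are invented for illustration, not those of the ANRCP evaluation:

    # Hypothetical 0-1 value-function outputs for two disposition alternatives.
    values = {
        "immobilization": {"cost": 0.7, "schedule": 0.6, "nonproliferation": 0.9},
        "MOX fuel":       {"cost": 0.5, "schedule": 0.8, "nonproliferation": 0.8},
    }
    weights = {"cost": 0.3, "schedule": 0.2, "nonproliferation": 0.5}  # sum to 1

    def mau_score(alternative):
        """Weighted additive multiattribute utility of one alternative."""
        return sum(weights[m] * v for m, v in values[alternative].items())

    for alt in values:
        print(alt, "->", round(mau_score(alt), 3))
    # Sensitivity analysis: sweep the weights and check whether the ranking flips.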

NONE

1999-08-31

423

Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models  

SciTech Connect

Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.

Evans, J.S.; Moeller, D.W.; Cooper, D.W.

1985-07-01

424

Input manual for the Army Unit Resiliency Analysis (AURA) methodology: 1988 update. Final report, 1984-March 1988  

Microsoft Academic Search

Since its inception in 1978, the Army Unit Resiliency Analysis (AURA) methodology has been applied to a broad spectrum of unit-level survivability/sustainability problems by an increasing number of analysts in the U.S. and abroad. The methodology has continued to grow in response to special needs that have arisen in the many applications, which resulted in the publication of an AURA user's

Klopcic

1988-01-01

425

The accidental risk assessment methodology for industries (ARAMIS)/layer of protection analysis (LOPA) methodology: a step forward towards convergent practices in risk assessment?  

PubMed

In the last ten years, layer of protection analysis (LOPA) has emerged as a simplified form of quantitative risk assessment (QRA). The European Commission-funded project Accidental Risk Assessment Methodology for Industries in the context of the Seveso II Directive (ARAMIS) has recently been completed. ARAMIS has several modules which give a consistent simplified approach to risk assessment that does not approach the complexity or expense of full QRA. LOPA is potentially a means of carrying out the assessment of barriers required in ARAMIS. This paper attempts to explain the principles of LOPA and the means by which it can be used within ARAMIS. PMID:16139426
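
LOPA's core arithmetic is order-of-magnitude: the mitigated frequency of a scenario is the initiating-event frequency multiplied by the probability of failure on demand (PFD) of each independent protection layer. The frequencies and PFDs below are textbook-style placeholders, not ARAMIS values:

    def mitigated_frequency(initiating_per_yr, pfds):
        """LOPA: f_mitigated = f_initiating * product of layer PFDs."""
        f = initiating_per_yr
        for pfd in pfds:
            f *= pfd
        return f

    # Placeholder scenario: 0.1/yr initiating event, three independent layers.
    f = mitigated_frequency(0.1, [0.1,    # basic process control system
                                  0.01,   # safety instrumented function
                                  0.01])  # relief device
    print("Mitigated frequency: %.1e /yr" % f)  # compare to a tolerable target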

Gowland, Richard

2006-03-31

426

The U-tube sampling methodology and real-time analysis of geofluids  

SciTech Connect

The U-tube geochemical sampling methodology, an extension of the porous cup technique proposed by Wood [1973], provides minimally contaminated aliquots of