Sample records for accident analysis methodology

  1. Accident characterization methodology

    SciTech Connect

    Camp, A.L.; Harper, F.T.

    1986-01-01

    The Nuclear Regulatory Commission (NRC) is preparing NUREG-1150 to examine the risk from a selected group of nuclear power plants. NUREG-1150 will provide technical bases for comparison of NRC research to industry results and resolution of numerous severe accident issues. In support of NUREG-1150, Sandia National Laboratories has directed the production of Level 3 Probabilistic Risk Assessments (PRAs) for the Surry, Sequoyah, Peach Bottom, and Grand Gulf nuclear power plants. The Accident Sequence Evaluation Program (ASEP) at Sandia has been responsible for the Level 1 portion of the analyses, which includes estimation of core damage frequency and characterization of the dominant sequences. The ASEP analyses are being documented in NUREG/CR-4550. The purpose of this paper is to briefly describe and evaluate the methodology utilized in these analyses. The methodology will soon be published in more detail as Reference 5. The results produced for NUREG/CR-4550 using this methodology are summarized in another paper to be presented at this conference.

  2. A DOE-STD-3009 hazard and accident analysis methodology for non-reactor nuclear facilities

    SciTech Connect

    MAHN,JEFFREY A.; WALKER,SHARON ANN

    2000-03-23

    This paper demonstrates the use of appropriate consequence evaluation criteria in conjunction with generic likelihood of occurrence data to produce consistent hazard analysis results for nonreactor nuclear facility Safety Analysis Reports (SARs). An additional objective is to demonstrate the use of generic likelihood of occurrence data as a means for deriving defensible accident sequence frequencies, thereby enabling the screening of potentially incredible events (<10⁻⁶ per year) from the design basis accident envelope. Generic likelihood of occurrence data have been used successfully in performing SAR hazard and accident analyses for two nonreactor nuclear facilities at Sandia National Laboratories. DOE-STD-3009-94 addresses and even encourages use of a qualitative binning technique for deriving and ranking nonreactor nuclear facility risks. However, qualitative techniques invariably lead to reviewer requests for more details on consequence or likelihood of occurrence bin assignments in the text of the SAR. Hazard analysis data displayed in simple worksheet format generally elicit questions about not only the assumptions behind the data, but also the quantitative bases for the assumptions themselves (engineering judgment may not be considered sufficient by some reviewers). This is especially true where the criteria for qualitative binning of likelihood of occurrence involve numerical ranges. Oftentimes reviewers want to see calculations, or at least a discussion of event frequencies or failure probabilities, to support likelihood of occurrence bin assignments. This may become a significant point of contention for events that have been binned as incredible. This paper shows how the use of readily available generic data can avoid many of the reviewer questions that inevitably arise from strictly qualitative analyses, while not significantly increasing the overall burden on the analyst.
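
The frequency-screening idea described in this record can be sketched in a few lines: multiply a generic initiating-event frequency by the conditional probabilities of the enabling failures, then compare the sequence frequency against the 10⁻⁶/yr "incredible event" threshold. All sequence names and numbers below are invented for illustration, not real facility data.

```python
# Hypothetical sketch: screening accident sequences against the
# 1e-6 per year threshold using generic likelihood-of-occurrence data.

INCREDIBLE_THRESHOLD = 1e-6  # events per year

def sequence_frequency(initiator_per_yr, conditional_probabilities):
    """Annual sequence frequency: initiating-event frequency times the
    conditional probabilities of each enabling failure."""
    freq = initiator_per_yr
    for p in conditional_probabilities:
        freq *= p
    return freq

def screen(sequences):
    """Partition sequences into the design basis accident envelope and
    those screened out as incredible (< 1e-6 per year)."""
    retained, screened_out = [], []
    for name, initiator, conditionals in sequences:
        f = sequence_frequency(initiator, conditionals)
        (retained if f >= INCREDIBLE_THRESHOLD else screened_out).append((name, f))
    return retained, screened_out

sequences = [
    ("fire with barrier failure",  1e-2, [1e-2]),        # 1e-4 / yr
    ("spill with HEPA bypass",     1e-1, [1e-2, 1e-2]),  # 1e-5 / yr
    ("explosion, double failure",  1e-3, [1e-2, 1e-3]),  # 1e-8 / yr
]
retained, screened_out = screen(sequences)
```

Only the last sequence falls below the threshold and would be screened out of the design basis accident envelope.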

  3. Accident selection methodology for TA-55 FSAR

    SciTech Connect

    Letellier, B.C.; Pan, P.Y.; Sasser, M.K. [and others]

    1995-07-01

    In the past, the selection of representative accidents for refined analysis from the numerous scenarios identified in hazards analyses (HAs) has involved significant judgment and has been difficult to defend. As part of upgrading the Final Safety Analysis Report (FSAR) for the TA-55 plutonium facility at the Los Alamos National Laboratory, an accident selection process was developed that is mostly mechanical and reproducible in nature and fulfills the requirements of the Department of Energy (DOE) Standard 3009 and DOE Order 5480.23. Among the objectives specified by this guidance are the requirements that accident screening (1) consider accidents during normal and abnormal operating conditions, (2) consider both design basis and beyond design basis accidents, (3) characterize accidents by category (operational, natural phenomena, etc.) and by type (spill, explosion, fire, etc.), and (4) identify accidents that bound all foreseeable accident types. The accident selection process described here in the context of the TA-55 FSAR is applicable to all types of DOE facilities.

  4. A Methodology for Probabilistic Accident Management

    SciTech Connect

    Munteanu, Ion; Aldemir, Tunc [Ohio State University (United States)]

    2003-10-15

    While techniques have been developed to tackle different tasks in accident management, there have been very few attempts to develop an on-line operator assistance tool for accident management, and none found in the literature that uses probabilistic arguments, which are important in today's licensing climate. The state/parameter estimation capability of the dynamic system doctor (DSD) approach is combined with the dynamic event-tree generation capability of the integrated safety assessment (ISA) methodology to address this issue. The DSD uses the cell-to-cell mapping technique for system representation, which models the system evolution in terms of probabilities of transitions in time between sets of user-defined parameter/state variable magnitude intervals (cells) within a user-specified time interval (e.g., the data sampling interval). The cell-to-cell transition probabilities are obtained from the given system model. The ISA follows the system dynamics in tree form and branches every time a setpoint for system/operator intervention is exceeded. The combined approach (a) can automatically account for uncertainties in the monitored system state, inputs, and models through the appropriate choice of the cells, as well as providing a probabilistic measure to rank the likelihood of possible system states in view of these uncertainties; (b) allows flexibility in system representation; (c) yields the lower and upper bounds on the estimated values of state variables/parameters as well as their expected values; and (d) leads to fewer branchings in the dynamic event-tree generation. Using a simple but realistic pressurizer model, the potential use of the DSD-ISA methodology for on-line probabilistic accident management is illustrated.
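
The cell-to-cell mapping idea in this record can be illustrated with a toy example: a state variable is discretized into cells, the system model induces a transition matrix between cells over one sampling interval, and a cell-probability vector is propagated forward. The dynamics, cell edges, and point-mapping approximation below are all invented for illustration and are not the DSD code.

```python
# Toy cell-to-cell mapping: discretize a state variable on [0, 1] into
# 5 cells, build a transition matrix from a simple drift model, and
# propagate the cell-probability vector over several sampling intervals.

import numpy as np

edges = np.linspace(0.0, 1.0, 6)          # 5 cells on [0, 1]
centers = 0.5 * (edges[:-1] + edges[1:])

def model_step(x):
    """Toy one-interval dynamics: the state drifts toward 0.7."""
    return x + 0.3 * (0.7 - x)

# Map each cell center through the model to locate its destination cell
# (a point-mapping approximation; the real method samples within cells).
n = len(centers)
T = np.zeros((n, n))
for i, x in enumerate(centers):
    j = np.clip(np.searchsorted(edges, model_step(x)) - 1, 0, n - 1)
    T[i, j] = 1.0

# Propagate an initial cell-probability vector; the resulting vector
# ranks the likelihood of the system being in each cell.
p = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # start certain in cell 0
for _ in range(4):
    p = p @ T
```

With uncertainty in the initial state or model, the rows of T would spread over several cells, and p would become a genuine ranking rather than a point mass.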

  5. Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis

    Microsoft Academic Search

    C. Mueller; J. Roglans-Ribas; S. Folga; A. Huttenga; R. Jackson; W. TenBrook; J. Russell

    1994-01-01

    A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the

  6. Severe accident analysis using dynamic accident progression event trees

    NASA Astrophysics Data System (ADS)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time consuming, difficult to reproduce, and can also be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APET developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. The work also focuses on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1].
A case study is presented involving station blackout with the loss of auxiliary feedwater system for a pressurized water reactor. The specific plant analyzed is the Zion Nuclear Power Plant, which is a Westinghouse-designed system that has been decommissioned.
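
The branch-probability tracking and truncation described in this record can be sketched as a depth-first tree expansion: each branching point splits a scenario into outcomes with assigned conditional probabilities, and any branch whose cumulative probability falls below a cutoff is pruned. The branch points, probabilities, and the (deliberately coarse) cutoff below are invented for demonstration and do not come from ADAPT or MELCOR.

```python
# Minimal dynamic event-tree expansion with probability truncation.
TRUNCATION = 0.05  # coarse cutoff, chosen large so pruning is visible

# Each branching event: (event name, [(outcome, conditional probability), ...])
branch_points = [
    ("creep rupture of RCS piping", [("ruptures", 0.2), ("holds", 0.8)]),
    ("hydrogen burn in containment", [("burns", 0.1), ("no burn", 0.9)]),
    ("power recovery before vessel failure", [("recovered", 0.6), ("not recovered", 0.4)]),
]

def expand(branch_points, cutoff):
    """Depth-first expansion returning (scenario path, probability) leaves,
    pruning any branch whose cumulative probability drops below cutoff."""
    leaves, pruned = [], []
    def walk(depth, path, prob):
        if depth == len(branch_points):
            leaves.append((tuple(path), prob))
            return
        event, outcomes = branch_points[depth]
        for outcome, p in outcomes:
            q = prob * p
            if q < cutoff:
                pruned.append((tuple(path + [outcome]), q))
            else:
                walk(depth + 1, path + [outcome], q)
    walk(0, [], 1.0)
    return leaves, pruned

leaves, pruned = expand(branch_points, TRUNCATION)
```

Tracking the pruned probability mass alongside the surviving leaves keeps the total accounted probability at 1, so the analyst can see exactly how much likelihood the truncation rule discarded.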

  7. Reactivity Insertion Accident Analysis with Coupled RETRAN

    SciTech Connect

    Kim, Yo-Han; Yang, Chang-Keun; Sung, Chang-Kyung; Lee, Chang-Sup [Korea Electric Power Research Institute, 103-16 Munji-dong Yusung-ku, Taejon (Korea, Republic of)]

    2004-07-01

    As the required scope of safety analysis has grown wider and more complicated, the narrow analysis scope and limited functions of current vendor code systems have become a constraint. To overcome this, KEPRI has developed an in-house safety analysis methodology based on available, well-established codes. For this development, the RETRAN code was modified and coupled to compensate for the lack of capabilities. To assess the feasibility of the methodology and code system, several reactivity insertion accidents were analyzed and the results compared with those reported in the final safety analysis report of the plant. (authors)

  8. Quantification and uncertainty analysis of source terms for severe accidents in light water reactors (QUASAR). Part 1: methodology and program plan. Volume 1

    SciTech Connect

    Park, C.; Khatib-Rahbar, M.

    1986-06-01

    The methodological framework and program plan for systematic quantification and propagation of uncertainties in radiological source terms for light water reactors are presented. The QUASAR methodology is based on detailed sensitivity analysis of the Source Term Code Package (STCP), followed by a systematic uncertainty analysis on the most sensitive parameters/variables, and phenomenological issues related to prediction of radiological releases to the environment.

  9. Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group

    SciTech Connect

    Brereton, S.; Shinn, J. [Lawrence Livermore National Lab., CA (United States)]; Hesse, D. [Battelle Columbus Labs., OH (United States)]; Kaninich, D. [Westinghouse Savannah River Co., Aiken, SC (United States)]; Lazaro, M. [Argonne National Lab., IL (United States)]; Mubayi, V. [Brookhaven National Lab., Upton, NY (United States)]

    1997-08-01

    The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.

  10. Nuclear fuel cycle facility accident analysis handbook

    Microsoft Academic Search

    J. E. Ayer; A. T. Clark; P. Loysen; M. Y. Ballinger; J. Mishima; P. C. Owczarski; W. S. Gregory; B. D. Nichols

    1988-01-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to

  11. Analysis of accidents during flashing operations 

    E-print Network

    Obermeyer, Michael Edward

    1993-01-01

    occurred at intersections under flashing operation compared to those operating in the normal mode. A statistical analysis was conducted to determine the safety of flashing signal operation. No significant increases in accidents or accident severity were...

  12. Behavior Analysis: Methodological Foundations.

    ERIC Educational Resources Information Center

    Owen, James L.

    Behavior analysis provides a unique way of coming to understand intrapersonal and interpersonal communication behaviors, and focuses on control techniques available to a speaker and counter-control techniques available to a listener. "Time-series methodology" is a convenient term because it subsumes under one label a variety of baseline or…

  13. QUASAR: a methodology for quantification of uncertainties in severe-accident source terms

    SciTech Connect

    Khatib-Rahbar, M.; Park, C.; Pratt, W.T.; Bari, R.A.; Ryder, C.; Marino, G.

    1986-01-01

    The radiological consequences of severe nuclear reactor accidents are governed, in large part, by the magnitude and characteristics of the radioactivity release, or radiological source term, from the plant. Over the last decade, substantial development and progress have been made in the state of knowledge concerning the nature of severe accidents and associated fission product release and transport. As part of this continuing effort, the US Nuclear Regulatory Commission has sponsored the development of the source term code package (STCP), which models core degradation, fission product release from the damaged fuel, and the subsequent migration of the fission products from the primary system to the containment and finally the environment. The purpose of the present paper is to describe a methodology that was developed as part of the Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors (QUASAR) program at Brookhaven National Laboratory. QUASAR is a large program that will apply the methodology described in this paper to severe accident sequences in light water reactors using the STCP.
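
The propagation step of a QUASAR-style uncertainty analysis (sample the sensitive parameters from their uncertainty distributions, run them through the model, and summarize the output distribution) can be sketched with a toy stand-in for the source-term model. The "model", parameter names, and ranges below are invented for illustration and have nothing to do with the actual STCP.

```python
# Monte Carlo propagation of parameter uncertainty through a toy
# source-term model (illustrative only; not the STCP).

import random

random.seed(1)  # reproducible sampling

def release_fraction(retention_factor, containment_leak_fraction):
    """Toy model: environmental release fraction of a fission product."""
    return (1.0 - retention_factor) * containment_leak_fraction

# Uncertain inputs sampled uniformly over assumed ranges.
samples = []
for _ in range(10_000):
    retention = random.uniform(0.90, 0.99)
    leak = random.uniform(0.01, 0.20)
    samples.append(release_fraction(retention, leak))

# Summarize the propagated uncertainty as distribution percentiles.
samples.sort()
median = samples[len(samples) // 2]
p95 = samples[int(0.95 * len(samples))]
```

In a real analysis the sensitivity study would first narrow the sampled set to the most influential parameters, and the sampled distributions would come from phenomenological assessment rather than uniform guesses.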

  14. Severe accident analysis using dynamic accident progression event trees

    Microsoft Academic Search

    Aram P. Hakobyan

    2006-01-01

    At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time consuming, difficult to reproduce, and can also be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order

  15. A methodology for the transfer of probabilities between accident severity categories

    SciTech Connect

    Whitlow, J. D.; Neuhauser, K. S.

    1991-01-01

    A methodology has been developed which allows the accident probabilities associated with one accident-severity category scheme to be transferred to another severity category scheme. The methodology requires that the schemes use a common set of parameters to define the categories. The transfer of accident probabilities is based on the relationships between probability of occurrence and each of the parameters used to define the categories. Because of the lack of historical data describing accident environments in engineering terms, these relationships may be difficult to obtain directly for some parameters. Numerical models or expert judgement are often needed to obtain the relationships. These relationships, even if they are not exact, allow the accident probability associated with any severity category to be distributed within that category in a manner consistent with accident experience, which in turn allows the accident probability to be appropriately transferred to a different category scheme.
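
The transfer idea can be sketched concretely: if both schemes are defined by ranges of a common parameter (say, impact speed), each old category's probability is spread over that parameter and the fine-grained mass is re-binned into the new scheme. The uniform within-category distribution and all numbers below are invented for demonstration; a real analysis would use an experience-based distribution instead.

```python
# Transfer of accident probabilities between two severity-category
# schemes defined on a common parameter (illustrative numbers only).

def distribute(old_categories, grid_step=1.0):
    """Spread each category's probability uniformly over its parameter
    range (a stand-in for an experience-based distribution)."""
    mass = {}  # parameter-bin lower edge -> probability
    for (lo, hi), prob in old_categories:
        n_bins = int(round((hi - lo) / grid_step))
        for k in range(n_bins):
            edge = lo + k * grid_step
            mass[edge] = mass.get(edge, 0.0) + prob / n_bins
    return mass

def rebin(mass, new_ranges):
    """Sum the fine-grained probability mass into the new category ranges."""
    out = [0.0] * len(new_ranges)
    for edge, p in mass.items():
        for i, (lo, hi) in enumerate(new_ranges):
            if lo <= edge < hi:
                out[i] += p
                break
    return out

# Old scheme: P(0-30 mph) = 0.7, P(30-60 mph) = 0.3.
old = [((0.0, 30.0), 0.7), ((30.0, 60.0), 0.3)]
# New scheme splits at 20 mph and 40 mph instead.
new_ranges = [(0.0, 20.0), (20.0, 40.0), (40.0, 60.0)]
transferred = rebin(distribute(old), new_ranges)
```

Total probability is conserved by construction; only its allocation across categories changes with the scheme.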

  16. Aircraft Loss-of-Control Accident Analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Foster, John V.

    2010-01-01

    Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

  17. A CANDU Severe Accident Analysis

    SciTech Connect

    Negut, Gheorghe; Catana, Alexandru [Institute for Nuclear Research, 1, Compului Str., Mioveni, PO Box 78, 0300 Pitesti (Romania)]; Prisecaru, Ilie [University Politehnica Bucharest (Romania)]

    2006-07-01

    As interest in severe accident studies has increased in recent years, we have developed a set of simple models to analyze severe accidents for CANDU reactors that should be integrated in the EU codes. The CANDU600 reactor uses natural uranium fuel and heavy water (D2O) as both moderator and coolant, with the moderator and coolant in separate systems. We chose to analyze accident development for a LOCA with simultaneous loss of moderator cooling and loss of the emergency core cooling system (ECCS). This type of accident is likely to modify the reactor geometry and will lead to a severe accident development. When the coolant temperature inside a pressure tube reaches 1000 deg C, contact between the pressure tube and the calandria tube occurs and the residual heat is transferred to the moderator. Due to the lack of cooling, the moderator eventually begins to boil and is expelled, through the calandria vessel relief ducts, into the containment. The calandria tubes (fuel channels) will therefore be uncovered, will then disintegrate, and will fall to the calandria vessel bottom. After all of the moderator is vaporized and expelled, the debris will heat up and eventually boil. The heat accumulated in the molten debris will be transferred through the calandria vessel wall to the shield tank water, which normally surrounds the calandria vessel. The phenomena described above are modelled, analyzed and compared with the existing data. The results are encouraging. (authors)

  18. Integrating Root Cause Analysis Methodologies

    Microsoft Academic Search

    Leith Hitchcock

    Many Root Cause Analysis (RCA) methodologies have specific applications and limitations, and in some cases, for complex machinery investigations, they can be combined and enhanced for better results. Typical methodologies that can be combined effectively are Kepner Tregoe, Causal Tree Analysis (Apollo), Fault Tree Analysis, Logic Tree Analysis, Barrier Analysis, and Human Performance Evaluation, among others. The difficulty with many

  19. A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Apostolakis, G. [and others]

    1998-04-01

    This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs.

  20. FSAR fire accident analysis for a plutonium facility

    SciTech Connect

    Lam, K.

    1997-06-01

    The Final Safety Analysis Report (FSAR) for a plutonium facility, as required by DOE Orders 5480.23 and 5480.22, has recently been completed and approved. The facility processes and stores radionuclides such as Pu-238, Pu-239, enriched uranium, and, to a lesser degree, other actinides. This facility produces heat sources. DOE Order 5480.23 and DOE-STD-3009-94 require analysis of different types of accidents (operational accidents such as fires, explosions, spills, and criticality events, and natural phenomena such as earthquakes). The accidents that were analyzed quantitatively, or the Evaluation Basis Accidents (EBAs), were selected based on a multi-step screening process that makes extensive use of the Hazards Analysis (HA) performed for the facility. In the HA, specific accident scenarios, with estimated frequencies and consequences, were developed for each identified hazard associated with facility operations and activities. Analysis of the EBAs and comparison of their consequences with the evaluation guidelines established the safety envelope for the facility and identified the safety-class structures, systems, and components. This paper discusses the analysis of the fire EBA. This fire accident was analyzed in relatively great detail in the FSAR because its potential off-site consequences are more severe than those of other events. In the following, a description of the scenario is given first, followed by a brief summary of the methodology for calculating the source term. Finally, the author discusses how a key parameter affecting the source term, the leakpath factor, was determined; this determination is the focus of this paper.

  1. Anthropotechnological analysis of industrial accidents in Brazil.

    PubMed Central

    Binder, M. C.; de Almeida, I. M.; Monteau, M.

    1999-01-01

    The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worthwhile if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

  2. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.; Morrisete, J. A.

    1982-01-01

    The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in the trends and cause-effect relationships reported in the earlier study. The accident counts were tied to measures of activity to produce accident rates, which in turn were analyzed in terms of change. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of the involved pilots. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions, and their unique attributes were delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.
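
The rate computation at the heart of this kind of study is simple: raw accident counts are normalized by a measure of activity so that categories (or years) can be compared on rates rather than counts, and anomalously high rates are flagged for closer analysis. The figures and the flagging threshold below are fabricated for illustration only.

```python
# Normalizing accident counts by activity to obtain comparable rates.

def rate_per_100k_hours(accidents, flight_hours):
    """Accidents per 100,000 flight hours."""
    return accidents / flight_hours * 100_000

records = {
    # category: (accidents, flight hours) -- invented numbers
    "day":   (120, 3_000_000),
    "night": (60,  600_000),
}

rates = {k: rate_per_100k_hours(a, h) for k, (a, h) in records.items()}
mean_rate = sum(rates.values()) / len(rates)

# Flag categories whose rate is well above the mean for further analysis.
flagged = [k for k, r in rates.items() if r > 1.25 * mean_rate]
```

Here night flying has half the accidents of day flying but a far higher rate, which is exactly the kind of anomaly that counts alone would hide.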

  3. Probabilistic Approach to Analysis of Death Traffic Accidents

    Microsoft Academic Search

    Evgenia Suzdaleva; Ivan Nagy

    The paper is devoted to analysis of data related to traffic accidents on one of the roads in the Czech Republic. The data sets are available as discrete-valued variables providing the outcome of a traffic accident (fatal or not) as well as the conditions under which the accident happened (weather, visibility, speed, etc.). The situation of a traffic accident is modeled within state-space

  4. A Methodology for Assessing the Effect of Countermeasures Against a Nuclear Accident Using Fuzzy Set Theory

    Microsoft Academic Search

    M. H. Han; W. T. Hwang; E. H. Kim; K. S. Suh; Y. G. Choi

    A methodology for assessing the effectiveness of countermeasures against a nuclear accident has been designed by means of the concept of fuzzy set theory. In most of the existing countermeasure models for actions under radiological emergencies, the large variety of possible features is simplified by a number of rough assumptions. During this simplification procedure, a lot of information is lost

  5. Accident patterns for construction-related workers: a cluster analysis

    NASA Astrophysics Data System (ADS)

    Liao, Chia-Wen; Tyan, Yaw-Yauan

    2012-01-01

    The construction industry has been identified as one of the most hazardous industries. The risk to construction-related workers is far greater than that in a manufacturing-based industry. However, some steps can be taken to reduce worker risk through effective injury prevention strategies. In this article, k-means clustering methodology is employed in specifying the factors related to different worker types and in identifying the patterns of industrial occupational accidents. Accident reports during the period 1998 to 2008 are extracted from case reports of the Northern Region Inspection Office of the Council of Labor Affairs of Taiwan. The results show that the cluster analysis can indicate some patterns of occupational injuries in the construction industry. Inspection plans should be proposed according to the type of construction-related workers. The findings provide a direction for more effective inspection strategies and injury prevention programs.
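
The k-means clustering step described here can be sketched with a hand-rolled implementation on toy accident records coded as two numeric features. The features, data points, and fixed initial centers below are invented for illustration; a real analysis would code many more factors per report.

```python
# Minimal deterministic k-means on toy accident records coded as
# (worker age, hours into shift). Illustrative data only.

def kmeans(points, centers, iters=20):
    """Plain k-means with fixed initial centers (deterministic)."""
    for _ in range(iters):
        # Assignment step: each point joins its nearest center
        # (squared Euclidean distance).
        clusters = [[] for _ in centers]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append(p)
        # Update step: move each center to its cluster mean.
        centers = [
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centers)
        ]
    return centers, clusters

# Toy pattern: young workers injured early in shift vs. older workers
# injured late in shift.
points = [(22, 1), (25, 2), (23, 1), (55, 9), (58, 10), (52, 8)]
centers, clusters = kmeans(points, centers=[(20, 0), (60, 12)])
```

Each resulting cluster is then inspected for the accident pattern it represents, which is what lets inspection plans be tailored to worker types.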

  7. A STAMP ANALYSIS OF THE LEX COMAIR 5191 ACCIDENT

    E-print Network

    Leveson, Nancy

    A STAMP Analysis of the LEX Comair 5191 Accident. Thesis submitted in partial fulfilment by Paul S. Nelson, applying Systems-Theoretic Accident Modeling and Processes (STAMP), which incorporates three basic components: constraints, hierarchical

  8. Minimum Accident of Concern - A Different Basis for CAS Analysis

    SciTech Connect

    Biswas, D.

    2002-01-31

    A Criticality Alarm System is normally designed to detect immediately the minimum accident of concern. This report covers the methodology to establish a different minimum accident of concern developed for shielded facilities and applied to a case of the canyon sump excursion in a Savannah River Site facility.

  9. A review of risk analysis and helicopter air ambulance accidents.

    PubMed

    Nix, Sam; Buckner, Steven; Cercone, Richard

    2014-01-01

    The Federal Aviation Administration announced a final rule in February 2014 that includes a requirement for helicopter air ambulance operators to institute preflight risk analysis programs. This qualitative study examined risk factors that were described in 22 preliminary, factual, and probable cause helicopter air ambulance accident and incident reports that were initiated by the National Transportation Safety Board between January 1, 2011, and December 31, 2013. Insights into the effectiveness of existing preflight risk analysis strategies were gained by comparing these risk factors with the preflight risk analysis guidance that is published by the Federal Aviation Administration in the Flight Standards Information Management System. When appropriate, a deeper understanding of the human factors that may have contributed to occurrences was gained through methodologies that are described in the Human Factors Analysis and Classification System. The results of this study suggest that there are some vulnerabilities in existing preflight risk analysis guidelines that may affect safety in the helicopter air ambulance industry. The results also provided evidence that human factors contributed to most of the helicopter air ambulance accidents and incidents that occurred during the study period. The results of this study suggest that effective risk analysis programs should provide pilots with both preflight and in-flight resources. PMID:25179955

  10. Comparing the Identification of Recommendations by Different Accident Investigators Using a Common Methodology

    NASA Technical Reports Server (NTRS)

    Johnson, Chris W.; Oltedal, H. A.; Holloway, C. M.

    2012-01-01

    Accident reports play a key role in the safety of complex systems. These reports present the recommendations that are intended to help avoid any recurrence of past failures. However, the value of these findings depends upon the causal analysis that helps to identify the reasons why an accident occurred. Various techniques have been developed to help investigators distinguish root causes from contributory factors and contextual information. This paper presents the results from a study into the individual differences that can arise when a group of investigators independently apply the same technique to identify the causes of an accident. This work is important if we are to increase the consistency and coherence of investigations following major accidents.

  11. [Hanggliding accidents. Distribution of injuries and accident analysis].

    PubMed

    Ballmer, F T; Jakob, R P

    1989-12-01

    Paragliding, a relatively new sport in Switzerland, brought 23 patients with 48 injuries (38% lower limb and 29% spinal) to the Inselspital University Hospital in Berne within a period of 8 months. The aim of the study, in characterizing these injuries, is to formulate some guidelines for prevention. With over 90% of accidents occurring at either take-off or landing, better training for the beginner is proposed, along with strict guidelines for the more experienced pilot flying in unfavourable conditions. PMID:2617285

  12. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    Microsoft Academic Search

    E. D. Gorham; R. J. Breeding; T. D. Brown; F. T. Harper; J. C. Helton; W. B. Murfin; S. C. Hora

    1993-01-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of

  13. A POTENTIAL APPLICATION OF UNCERTAINTY ANALYSIS TO DOE-STD-3009-94 ACCIDENT ANALYSIS

    SciTech Connect

    Palmrose, D E; Yang, J M

    2007-05-10

    The objective of this paper is to assess proposed transuranic waste accident analysis guidance and recent software improvements in a Windows-OS version of MACCS2 that allow parameter uncertainty to be input. With this guidance and code capability, there is the potential to perform a quantitative uncertainty assessment of unmitigated accident releases with respect to the 25 rem Evaluation Guideline (EG) of DOE-STD-3009-94 CN3 (STD-3009). Historically, the classification of safety systems in a U.S. Department of Energy (DOE) nuclear facility's safety basis has involved how subject matter experts qualitatively view uncertainty in the STD-3009 Appendix A accident analysis methodology. Specifically, the question is whether consequence uncertainty could be larger than previously evaluated, such that site-specific accident consequences might challenge the EG. This paper assesses whether a potential uncertainty capability for MACCS2 could provide a stronger technical basis as to when the consequences from a design basis accident (DBA) truly challenge the 25 rem EG.

  14. PERSPECTIVES ON A DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    (NOEMAIL), K; Jonathan Lowrie; David Thoman; Austin Keller

    2008-07-30

    Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.
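    The dose-quantification chain the paper examines (source term, atmospheric dilution, breathing rate, inhalation dose conversion factor) multiplies straight through, so the relative effect of revising any single input can be sketched in a few lines. The function name and every numeric value below are illustrative assumptions, not figures from the paper:

```python
# Hypothetical sketch: inhalation dose from an atmospheric release, using the
# standard multiplicative chain dose = Q * (chi/Q) * BR * DCF. All parameter
# values below are illustrative placeholders, not values from the paper.

def inhalation_dose_rem(source_term_ci, chi_over_q, breathing_rate, dcf_rem_per_ci):
    """Dose (rem) = released activity (Ci) x atmospheric dilution (s/m^3)
    x breathing rate (m^3/s) x inhalation dose conversion factor (rem/Ci)."""
    return source_term_ci * chi_over_q * breathing_rate * dcf_rem_per_ci

# Conservative vs. updated breathing rate, as an example of how revising a
# single input assumption changes the dose estimate proportionally.
conservative = inhalation_dose_rem(1.0, 3.5e-3, 3.3e-4, 5.0e5)
updated      = inhalation_dose_rem(1.0, 3.5e-3, 2.5e-4, 5.0e5)
relative_difference = (conservative - updated) / conservative
```

    Because the chain is a simple product, the relative difference equals the relative change in the one revised input, here the breathing rate.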

  15. TMI-2 accident: core heat-up analysis

    SciTech Connect

    Ardron, K.H.; Cain, D.G.

    1981-01-01

    This report summarizes the NSAC study of reactor core thermal conditions during the accident at Three Mile Island, Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events, plant data, post-accident measurements, and interpretation or indirect use of instrument responses to accident conditions.

  16. A methodology for generating dynamic accident progression event trees for level-2 PRA

    SciTech Connect

    Hakobyan, A.; Denning, R.; Aldemir, T. [Ohio State Univ., Nuclear Engineering Program, 650 Ackerman Road, Columbus, OH 43202 (United States); Dunagan, S.; Kunsman, D. [Sandia National Laboratory, Albuquerque, NM 87185 (United States)

    2006-07-01

    Currently, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce, and potentially phenomenologically inconsistent. A software tool (ADAPT) is described for automated APET generation using the concept of dynamic event trees. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. While the software tool could be applied to any systems analysis code, the MELCOR code is used for this illustration. A case study is presented involving station blackout with the loss of the auxiliary feedwater system for a pressurized water reactor. (authors)
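    The branching, probability-tracking, and truncation logic described above can be sketched as a small recursive routine. This is a toy illustration of the dynamic event tree concept, not ADAPT itself; the branch points, probabilities, and threshold are invented:

```python
# Minimal sketch of dynamic event tree branching with probability tracking and
# truncation, in the spirit of the ADAPT approach described above. Branch
# points, probabilities, and the truncation threshold are invented examples.

TRUNCATION = 1e-3  # prune scenarios whose total branch probability falls below this

def expand(path, prob, branch_points):
    """Recursively enumerate scenarios; each branch point offers alternative
    outcomes with user-specified conditional probabilities."""
    if prob < TRUNCATION:           # pruning rule keeps the tree manageable
        return []
    if not branch_points:
        return [(path, prob)]
    outcomes = branch_points[0]     # e.g. {"AFW recovers": 0.7, "AFW fails": 0.3}
    scenarios = []
    for outcome, p in outcomes.items():
        scenarios += expand(path + [outcome], prob * p, branch_points[1:])
    return scenarios

branch_points = [
    {"AFW recovers": 0.7, "AFW fails": 0.3},
    {"depressurize": 0.95, "no action": 0.05},
]
scenarios = expand([], 1.0, branch_points)
```

    In a real tool the branch times come from the running severe accident code rather than a fixed list, but the bookkeeping is the same: multiply conditional probabilities along each path and discard paths below the truncation threshold.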

  17. NRC's environmental analysis of nuclear accidents: is it adequate. Final report

    Microsoft Academic Search

    E. Entwisle; D. Wexler

    1980-01-01

    The report evaluates the adequacy of accident analyses in environmental impact statements for nuclear power plants. It reviews the regulations and policies governing nuclear accident analyses in EISs, surveys the accident analyses in 149 EISs in 10 years, assesses the legal and scientific foundations of NRC's accident analysis policy, discusses the legal and pragmatic reasons for fuller EIS accident analysis,

  18. [Severe parachuting accident. Analysis of 122 cases].

    PubMed

    Krauss, U; Mischkowsky, T

    1993-06-01

    Based on a population of 122 severely injured patients, the causes of paragliding accidents and the patterns of injury are analyzed. A questionnaire is used to establish a sport-specific profile for the paragliding pilot. The lower limbs (55.7%) and the lower parts of the spine (45.9%) are the most frequently injured parts of the body. There is a high risk of multiple injuries after a single accident because of the tremendous axial forces involved. The standard of equipment is good in over 90% of the cases. Insufficient training and failure to take account of geographical and meteorological conditions are the main determinants of accidents sustained by paragliders, most of whom are young. Nevertheless, 80% of our patients want to continue paragliding. Finally, some advice is given on how to prevent paragliding accidents and injuries. PMID:8342057

  19. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  20. ADAM: An Accident Diagnostic, Analysis and Management System - Applications to Severe Accident Simulation and Management

    SciTech Connect

    Zavisca, M.J.; Khatib-Rahbar, M.; Esmaili, H. [Energy Research Inc., P.O. Box 2034, Rockville, MD 20847-2034 (United States); Schulz, R. [Swiss Federal Nuclear Safety Inspectorate, Villigen, 5232 (Switzerland)

    2002-07-01

    The Accident Diagnostic, Analysis and Management (ADAM) computer code has been developed as a tool for on-line applications to accident diagnostics, simulation, management and training. ADAM's severe accident simulation capabilities incorporate a balance of mechanistic, phenomenologically based models with simple parametric approaches for elements including (but not limited to) thermal hydraulics; heat transfer; fuel heatup, meltdown, and relocation; fission product release and transport; combustible gas generation and combustion; and core-concrete interaction. The overall model is defined by a relatively coarse spatial nodalization of the reactor coolant and containment systems and is advanced explicitly in time. The result is to enable much faster than real time (i.e., 100 to 1000 times faster than real time on a personal computer) applications to on-line investigations and/or accident management training. Other features of the simulation module include provision for activation of water injection, including the Engineered Safety Features, as well as other mechanisms for the assessment of accident management and recovery strategies and the evaluation of PSA success criteria. The accident diagnostics module of ADAM uses on-line access to selected plant parameters (as measured by plant sensors) to compute the thermodynamic state of the plant, and to predict various margins to safety (e.g., times to pressure vessel saturation and steam generator dryout). Rule-based logic is employed to classify the measured data as belonging to one of a number of likely scenarios based on symptoms, and a number of 'alarms' are generated to signal the state of the reactor and containment. This paper will address the features and limitations of ADAM with particular focus on accident simulation and management. (authors)
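    The explicit time-advance over a coarse nodalization that lets a simulator like ADAM run much faster than real time can be illustrated, in deliberately simplified form, with a single lumped coolant node. Everything here, the function name, step size, and parameter values, is an invented assumption, not ADAM's actual model:

```python
# Illustrative sketch of the explicit time-advance idea behind a coarse-nodal
# simulator: one lumped coolant node heated by decay power and cooled by a heat
# sink, stepped forward with explicit Euler. All values are invented.

def advance(temp_c, dt_s, decay_power_w, hA_w_per_k, sink_c, heat_cap_j_per_k):
    """One explicit Euler step of the lumped-node energy balance:
    C dT/dt = P_decay - hA (T - T_sink)."""
    net_power = decay_power_w - hA_w_per_k * (temp_c - sink_c)
    return temp_c + dt_s * net_power / heat_cap_j_per_k

temp = 280.0
for _ in range(1000):          # coarse 10 s steps -> far faster than real time
    temp = advance(temp, 10.0, 5.0e6, 1.0e5, 30.0, 1.0e8)
# The node relaxes toward the equilibrium temperature T_sink + P/(hA) = 80 C.
```

    Trading spatial detail for large explicit steps is what buys the 100 to 1000 times real-time speed the abstract cites; the cost is that each node is only a coarse average of the real plant state.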

  1. Perspectives on Nonconventional Job Analysis Methodologies

    Microsoft Academic Search

    Erich P. Prien; Kristin O. Prien; Louis G. Gamble

    2004-01-01

    The nonconventional approaches in this paper can be applied to various job content domains, including work activity data and other job descriptors of job skills/competency data. The nonconventional designs are based on the inverse factor analysis methodology used by Thurstone (1951) in his classic study of Supreme Court judge decisions. Hemphill (1959) and Tucker (1958) also expanded this methodology. Application

  2. FAD: A FUNCTIONAL ANALYSIS AND DESIGN METHODOLOGY

    E-print Network

    Kent, University of

    FAD: A Functional Analysis and Design Methodology. A thesis submitted to The University of Kent presenting the methodology FAD. By functional we mean that it naturally supports software development within the functional paradigm, with various pictorial representations of a system. FAD's modelling language provides the typical elements

  3. Human body modelling for traffic accident analysis

    Microsoft Academic Search

    S. Krašna; I. Prebil; M. Hribernik

    2007-01-01

    A traffic accident is a complex phenomenon with vehicles and human beings involved. During a collision, the vehicle occupant is exposed to substantial loads, which can cause the occupant injuries that depend on the level of passive safety, as well as on the occupant's individual characteristics. Correct estimation of injury severity demands a validated human body model and known impact

  4. A methodology for optimisation of countermeasures for animal products after a nuclear accident and its application

    Microsoft Academic Search

    Won Tae Hwang; Gyuseong Cho; Moon Hee Han

    1999-01-01

    A methodology for the optimisation of the countermeasures associated with the contamination of animal products was designed based on cost–benefit analysis. Results are discussed for the hypothetical deposition of radionuclides on 15 August, when pastures are fully developed in Korean agricultural conditions. A dynamic food chain model, DYNACON, was used to evaluate the effectiveness of the countermeasures for reducing the

  5. Rat sperm motility analysis: methodologic considerations

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  6. Analysis of tritium mission FMEF/FAA fuel handling accidents

    SciTech Connect

    Van Keuren, J.C.

    1997-11-18

    The Fuels Material Examination Facility/Fuel Assembly Area is proposed to be used for fabrication of mixed-oxide fuel to support the Fast Flux Test Facility (FFTF) tritium/medical isotope mission. The plutonium isotope mix for the new mission is different from that analyzed in the FMEF safety analysis report. A reanalysis of three representative accidents was performed for the revised plutonium mix to determine the impact on the safety analysis. Current versions of the computer codes and meteorology data files were used for the analysis. The three accidents analyzed were a criticality, an explosion in a glovebox, and a tornado. The analysis concluded that risk guidelines were met with the revised plutonium mix.

  7. OFFSITE RADIOLOGICAL CONSEQUENCE ANALYSIS FOR THE BOUNDING FLAMMABLE GAS ACCIDENT

    SciTech Connect

    KRIPPS, L.J.

    2005-02-18

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The calculation applies reasonably conservative input parameters in accordance with guidance in DOE-STD-3009, Appendix A. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine whether they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline'' of 25 rem total effective dose equivalent, in order to identify and evaluate safety-class structures, systems, and components. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation rather than a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. A detonation in an SST rather than a double-shell tank (DST) was selected as the bounding accident because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.

  8. The Methodology of Data Envelopment Analysis.

    ERIC Educational Resources Information Center

    Sexton, Thomas R.

    1986-01-01

    The methodology of data envelopment analysis (DEA), a linear programming-based method, is described. Other procedures often used for measuring relative productive efficiency are discussed in relation to DEA, including ratio analysis and multiple regression analysis. The DEA technique is graphically illustrated for the case of two inputs and one output.
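    In the degenerate case of one input and one output, the DEA efficiency of each decision-making unit reduces to its output/input ratio normalized by the best ratio in the set, which makes the core idea easy to sketch. The general CCR model requires solving a linear program per unit; this toy, with invented data, shows only the single-input case:

```python
# Toy sketch: with one input and one output, DEA (CCR) efficiency of each
# decision-making unit reduces to its output/input ratio divided by the best
# ratio in the comparison set. The unit names and numbers are invented.

def dea_one_input_one_output(units):
    """units: dict name -> (input, output). Returns name -> efficiency in (0, 1]."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

units = {"A": (10, 50), "B": (8, 48), "C": (12, 48)}
eff = dea_one_input_one_output(units)
# "B" (ratio 6.0) defines the efficient frontier; "A" and "C" fall below it.
```

    With multiple inputs and outputs the weights on each input and output become decision variables, which is where the linear programming formulation the abstract mentions comes in.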

  9. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records.
This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and consequences.

  10. Analysis Methodology for Industrial Load Profiles

    E-print Network

    Reddoch, T. W.

    ANALYSIS METHODOLOGY FOR INDUSTRIAL LOAD PROFILES. Thomas W. Reddoch, Executive Vice President, Electrotek Concepts, Inc., Knoxville, Tennessee. ABSTRACT: A methodology is provided for evaluating the impact of various demand-side management (DSM) options on industrial customers. The basic approach uses customer metered load profile data as a basis for the customer load shape. DSM technologies are represented as load shapes and are used as a basis for altering the customer's existing...
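    The load-shape arithmetic described in the abstract, representing a demand-side management option as an hourly load shape that alters the customer's metered profile, can be sketched as follows. The profiles and the peak-shaving numbers are made up for illustration:

```python
# Illustrative sketch of the load-shape approach: a DSM option is represented
# as an hourly load shape (kW deltas) applied to the customer's metered hourly
# profile. The 24-hour profiles below are invented numbers.

def apply_dsm(base_profile, dsm_shape):
    """Alter a metered hourly load profile (kW) by a DSM load shape (kW deltas),
    clamping at zero since load cannot go negative."""
    assert len(base_profile) == len(dsm_shape) == 24
    return [max(0.0, b + d) for b, d in zip(base_profile, dsm_shape)]

base = [100.0] * 8 + [250.0] * 10 + [120.0] * 6   # flat night load, daytime peak
dsm  = [0.0] * 8 + [-40.0] * 10 + [0.0] * 6       # a peak-shaving DSM measure
modified = apply_dsm(base, dsm)
peak_reduction = max(base) - max(modified)
```

    Summing the hourly differences gives the energy impact, while the change in the profile maximum gives the demand (peak) impact, the two quantities a DSM evaluation typically reports.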

  11. Human factors review for Severe Accident Sequence Analysis (SASA)

    SciTech Connect

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper will discuss work being conducted during this human factors review, including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate contributions to SASA analyses from human factors data and methods. A descriptive model was developed called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and which is useful for identifying needs/deficiencies in the area of accident management. The assessment of human factors issues related to ATWS required extensive coordination with SASA analysts. The analysis focused primarily on six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitate a knowledge-based emergency response by the operators. The FOAM model provides a functionally-oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid/minimize core degradation. Operators must also respond to potential radiological release beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure.

  12. MELCOR accident analysis for ARIES-ACT

    SciTech Connect

    Paul W. Humrickhouse; Brad J. Merrill

    2012-08-01

    We model a loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with LiPb as coolant and breeder, a helium-cooled steel structural ring and tungsten divertors, a thin-walled, helium-cooled vacuum vessel, and a room-temperature water-cooled shield outside the vacuum vessel. The water heat transfer system is designed to remove heat by natural circulation during a LOFA. The MELCOR model uses time-dependent decay heats for each component determined by 1-D modeling. The MELCOR model shows that, despite periodic boiling of the water coolant, structures are kept adequately cool by the passive safety system.

  13. STAMP-Based Analysis of a Refinery Overflow Accident Nancy Leveson, Margaret Stringfellow, and John Thomas

    E-print Network

    Leveson, Nancy

    As an example of STAMP, we have taken an accident report produced for a real refinery overflow accident and reanalyzed it using STAMP. The original accident report is shown in the Appendix.

  14. Analysis of the 1957-58 Soviet nuclear accident

    Microsoft Academic Search

    J. R. Trabalka; L. D. Eyman; S. I. Auerbach

    1979-01-01

    The occurrence of a Soviet accident in the winter of 1957-58, involving the atmospheric release of reprocessed fission wastes (cooling time approximately 1-2 yrs.), appears to have been confirmed, primarily by an analysis of the USSR radioecology literature. Due to the high population density in the affected region (Cheliabinsk Province in the highly industrialized Urals Region) and the reported level

  15. Corporate cost of occupational accidents: an activity-based analysis

    Microsoft Academic Search

    Pall M. Rikhardsson; Martin Impgaard

    2004-01-01

    The systematic accident cost analysis (SACA) project was carried out during 2001 by The Aarhus School of Business and PricewaterhouseCoopers Denmark with financial support from The Danish National Working Environment Authority. It focused on developing and testing a method for evaluating the occupational accident costs of companies, for use by occupational health and safety professionals. The method was tested in nine Danish

  16. QUASAR: a methodology for quantification of uncertainties in severe-accident source terms

    Microsoft Academic Search

    M. Khatib-Rahbar; W. T. Pratt; R. A. Bari; C. Ryder; G. Marino

    1986-01-01

    The radiological consequences of severe nuclear reactor accidents are governed, in large part, by the magnitude and characteristics of the radioactivity release, or radiological source term, from the plant. Over the last decade, substantial development and progress have been made in the state of knowledge concerning the nature of severe accidents and associated fission product release and transport. As part

  17. Maanshan T{sub p}S{sub L}B accident analysis

    Microsoft Academic Search

    C. C. Chen; T. K. Wang; J. K. Hsueh

    1988-01-01

    A T{sub p}S{sub L}B accident analysis for Taipower's Maanshan Unit 1 plant is reported. The plant is a 2775-MW(thermal) pressurized water reactor with a large dry containment. Based on the Maanshan level-1 probabilistic risk assessment, the T{sub p}S{sub L}B sequence ranks first in accident frequency. The basic definition of T{sub p}S{sub L}B includes loss of off-site power T{sub p}, station blackout, loss of all

  18. Core Disruptive Accident Analysis using ASTERIA-FBR

    NASA Astrophysics Data System (ADS)

    Ishizu, Tomoko; Endo, Hiroshi; Yamamoto, Toshihisa; Tatewaki, Isao

    2014-06-01

    JNES is developing a core disruptive accident analysis code, ASTERIA-FBR, which tightly couples the thermal-hydraulics and the neutronics to simulate the core behavior during core disruptive accidents of fast breeder reactors (FBRs). ASTERIA-FBR consists of the three-dimensional thermal-hydraulics calculation module, CONCORD; the fuel pin behavior calculation module, FEMAXI-FBR; and the space-time neutronics module, Dynamic-GMVP or PARTISN/RKIN. This paper describes a comparison between the characteristics of GMVP and PARTISN and summarizes the challenging issues in applying Dynamic-GMVP to the calculation of an unprotected loss-of-flow (ULOF) event, which is a typical initiator of a core disruptive accident in an FBR. The statistical error included in the calculation results may affect the prediction of super-prompt criticality during a ULOF event and thus the amount of released energy.

  19. Three dimensional effects in analysis of PWR steam line break accident

    E-print Network

    Tsai, Chon-Kwo

    A steam line break accident is one of the possible severe abnormal transients in a pressurized water reactor. It is required to present an analysis of a steam line break accident in the Final Safety Analysis Report (FSAR) ...

  20. The Methodology of Search Log Analysis

    E-print Network

    Jansen, James

    Chapter VI: The Methodology of Search Log Analysis. Bernard J. Jansen, Pennsylvania State University. Abstract: Exploiting the data stored in the search logs of Web search. Review of literature: What is a search log? Not surprisingly, a search log is a file (i.e., a log)

  1. Cold Vacuum Drying facility design basis accident analysis documentation

    SciTech Connect

    CROWE, R.D.

    2000-08-08

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

  2. INDUSTRIAL/MILITARY ACTIVITY-INITIATED ACCIDENT SCREENING ANALYSIS

    SciTech Connect

    D.A. Kalinich

    1999-09-27

    Impacts due to nearby installations and operations were determined in the Preliminary MGDS Hazards Analysis (CRWMS M&O 1996) to be potentially applicable to the proposed repository at Yucca Mountain. This determination was conservatively based on limited knowledge of the potential activities ongoing on or off the Nevada Test Site (NTS). It is intended that the Industrial/Military Activity-Initiated Accident Screening Analysis provided herein will meet the requirements of the ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987) in establishing whether this external event can be screened from further consideration or must be included as a design basis event (DBE) in the development of accident scenarios for the Monitored Geologic Repository (MGR). This analysis only considers issues related to preclosure radiological safety. Issues important to waste isolation as related to impact from nearby installations will be covered in the MGR performance assessment.

  3. APR1400 Reactivity Insertion Accident Analysis Using KNAP

    SciTech Connect

    Chang-Keun, Yang; Yo-Han, Kim; Chang-Kyung, Sung [Korea Electric Power Research Institute, 103-16 Munji-Dong, Yusong-Gu, Daejon, 305-380 (Korea, Republic of)

    2006-07-01

    The Korea Electric Power Research Institute decided to develop a new safety analysis code system for the Optimized Power Reactor 1000 (OPR1000) in Korea, funded by the Ministry of Commerce, Industry and Energy. In this paper, some results for the Advanced Power Reactor 1400 (APR1400) using the RETRAN code for reactivity insertion accidents are introduced, extending the application of safety analysis experience from the OPR1000. (authors)

  4. Fractal analysis: methodologies for biomedical researchers.

    PubMed

    Ristanović, Dušan; Milošević, Nebojša T

    2012-01-01

    Fractal analysis has become a popular method in all branches of scientific investigation, including biology and medicine. Although there is a growing interest in the application of fractal analysis in the biological sciences, questions about the methodology of fractal analysis have partly restricted its wider and comprehensible application. It is a notable fact that fractal analysis is derived from fractal geometry, but there are some unresolved issues that need to be addressed. In this respect, we discuss several related underlying principles for fractal analysis and establish the meaningful relationship between fractal analysis and fractal geometry. Since some concepts in fractal analysis are determined descriptively and/or qualitatively, this paper provides their exact mathematical definitions or explanations. Another aim of this study is to show that nowadays fractal analysis is an independent mathematical and experimental method based on Mandelbrot's fractal geometry, traditional Euclidean geometry and Richardson's coastline method. PMID:23757956
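    One of the concrete computations fractal analysis rests on is box counting: count the boxes a set occupies at several scales and take the slope of log(count) versus log(1/size). A minimal sketch follows; the point set (a straight line, whose expected dimension is 1) and the scales are arbitrary choices for illustration:

```python
# Minimal box-counting sketch: estimate the fractal dimension of a 2-D point
# set by counting occupied grid boxes at several scales and fitting the slope
# of log(count) against log(1/size). The test set is a straight line.

import math

def box_count_dimension(points, sizes):
    """Estimate the box-counting dimension of a set of (x, y) points."""
    logs = []
    for s in sizes:
        # Each point falls in grid cell (floor(x/s), floor(y/s)); count cells.
        boxes = {(math.floor(x / s), math.floor(y / s)) for x, y in points}
        logs.append((math.log(1.0 / s), math.log(len(boxes))))
    # Least-squares slope of log(count) against log(1/size).
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    return (sum((x - mx) * (y - my) for x, y in logs)
            / sum((x - mx) ** 2 for x, _ in logs))

line = [(i / 1000.0, i / 1000.0) for i in range(1000)]   # points on a diagonal
dim = box_count_dimension(line, [0.5, 0.25, 0.125, 0.0625])
```

    For a smooth curve the estimate is close to 1; for a genuinely fractal set such as a coastline trace it falls strictly between 1 and 2, which is the quantity Richardson's coastline method measures.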

  5. An analysis of human factors in traffic accidents using the variation tree method

    Microsoft Academic Search

    Toshiro Ishida; Naoya Kanda

    1999-01-01

    The purpose of our research is to develop a new method for the analysis of traffic accidents and to examine its effectiveness. We modified the variation tree method, which was proposed in cognitive science, so that it could be applied to the analysis of human factors in traffic accidents. In particular, we analyzed accidents that occurred

  6. Gap Analysis Methodology for Business Service Engineering

    Microsoft Academic Search

    Dinh Khoa Nguyen; Willem-jan Van Den Heuvel; Mike P. Papazoglou; Valeria De Castro; Esperanza Marcos

    2009-01-01

    Many of today's service analysis and design techniques rely on ad-hoc and experience-based identification of value-creating business services and implicitly assume a "green-field" situation, focusing on the development of completely new services while offering very limited support for discovering candidate services from pre-existing software assets. In this article, we introduce a novel business service engineering methodology that identifies and conceptualizes

  7. Calculation notes for surface leak resulting in pool, TWRS FSAR accident analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Surface Leaks Resulting in Pool.

  8. Calculation Notes for Subsurface Leak Resulting in Pool, TWRS FSAR Accident Analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Subsurface Leaks Resulting in Pool.

  9. Offsite radiological consequence analysis for the bounding flammable gas accident

    SciTech Connect

    CARRO, C.A.

    2003-03-19

    The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent in order to identify and evaluate safety class structures, systems, and components. The bounding flammable gas accident is a detonation in a single-shell tank (SST). A detonation versus a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. As will be shown, the consequences of a detonation in either an SST or a double-shell tank (DST) are approximately equal. A detonation in an SST was selected as the bounding condition because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are generally greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.

  10. Analysis of PWR RCS Injection Strategy During Severe Accident

    SciTech Connect

    Wang, S.-J. [Institute of Nuclear Energy Research, Taiwan (China); Chiang, K.-S. [Institute of Nuclear Energy Research, Taiwan (China); Chiang, S.-C. [Taiwan Power Company, Taiwan (China)

    2004-05-15

    Reactor coolant system (RCS) injection is an important strategy for severe accident management of a pressurized water reactor (PWR) system. Maanshan is a typical Westinghouse PWR nuclear power plant (NPP) with large, dry containment. The severe accident management guideline (SAMG) of Maanshan NPP is developed based on the Westinghouse Owners Group (WOG) SAMG.The purpose of this work is to analyze the RCS injection strategy of PWR system in an overheated core condition. Power is assumed recovered as the vessel water level drops to the bottom of active fuel. The Modular Accident Analysis Program version 4.0.4 (MAAP4) code is chosen as a tool for analysis. A postulated station blackout sequence for Maanshan NPP is cited as a reference case for this analysis. The hot leg creep rupture occurs during the mitigation action with immediate injection after power recovery according to WOG SAMG, which is not desired. This phenomenon is not considered while developing the WOG SAMG. Two other RCS injection methods are analyzed by using MAAP4. The RCS injection strategy is modified in the Maanshan SAMG. These results can be applied for typical PWR NPPs.

  11. Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs

    NASA Technical Reports Server (NTRS)

    Tuomela, C. H.; Brennan, M. F.

    1980-01-01

    A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

  12. Three Dimensional Analysis of 3-Loop PWR RCCA Ejection Accident for High Burnup

    SciTech Connect

    Marciulescu, Cristian; Sung, Yixing; Beard, Charles L. [Westinghouse Electric Company, LLC (United States)

    2006-07-01

    The Rod Control Cluster Assembly (RCCA) ejection accident is a Condition IV design basis reactivity insertion event for Pressurized Water Reactors (PWR). The event is historically analyzed using a one-dimensional (1D) neutron kinetic code to meet the current licensing criteria for fuel rod burnup to 62,000 MWD/MTU. The Westinghouse USNRC-approved three-dimensional (3D) analysis methodology is based on the neutron kinetics version of the ANC code (SPNOVA) coupled with Westinghouse's version of the EPRI core thermal-hydraulic code VIPRE-01. The 3D methodology provides a more realistic yet conservative analysis approach to meet anticipated reduction in the licensing fuel enthalpy rise limit for high burnup fuel. A rod ejection analysis using the 3D methodology was recently performed for a Westinghouse 3-loop PWR at an up-rated core power of 3151 MWt with reload cores that allow large flexibility in assembly shuffling and a fuel hot rod burnup to 75,000 MWD/MTU. The analysis considered high enrichment fuel assemblies at the control rod locations as well as bounding rodded depletions in the end of life, zero power and full power conditions. The analysis results demonstrated that the peak fuel enthalpy rise is less than 100 cal/g for the transient initiated at the hot zero power condition. The maximum fuel enthalpy is less than 200 cal/g for the transient initiated from the full power condition. (authors)

  13. Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.

    PubMed

    Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

    2014-09-01

    Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method based on the concepts of task and accident mechanisms for an initial risk assessment by taking into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. By using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified by using those variables included in the European Statistics of Accidents at Work methodology. As main results, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low and the only accident mechanisms with medium risk are accidents when lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling for tasks involving movements. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended. PMID:25179119

  14. A DISCIPLINED APPROACH TO ACCIDENT ANALYSIS DEVELOPMENT AND CONTROL SELECTION

    SciTech Connect

    Ortner, T; Mukesh Gupta, M

    2007-04-13

    The development and use of a Safety Input Review Committee (SIRC) process promotes consistent and disciplined Accident Analysis (AA) development to ensure that it accurately reflects facility design and operation; and that the credited controls are effective and implementable. Lessons learned from past efforts were reviewed and factored into the development of this new process. The implementation of the SIRC process has eliminated many of the problems previously encountered during Safety Basis (SB) document development. This process has been subsequently adopted for use by several Savannah River Site (SRS) facilities with similar results and expanded to support other analysis activities.

  15. Analysis of Three Mile Island-Unit 2 accident

    SciTech Connect

    Not Available

    1980-03-01

    The Nuclear Safety Analysis Center (NSAC) of the Electric Power Research Institute has analyzed the Three Mile Island-2 accident. Early results of this analysis were a brief narrative summary, issued in mid-May 1979 and an initial version of this report issued later in 1979 as noted in the Foreword. The present report is a revised version of the 1979 report, containing summaries, a highly detailed sequence of events, a comparison of that sequence of events with those from other sources, 25 appendices, references and a list of abbreviations and acronyms. A matrix of equipment and system actions is included as a folded insert.

  16. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  17. An empirical study for mines safety management through analysis on potential for accident reduction

    Microsoft Academic Search

    S. Mallick; K. Mukherjee

    1996-01-01

    The effective utilization of resources in seeking to reduce accidents in mines requires that the accident experiences of different mines should first be placed on a comparative footing. There could be many characteristics of belowground mines which influence the occurrence of accidents. Depending on the objective of the analysis, some of these characteristics can be treated as fixed, allowing least

  18. An Integrated Accident & Consequence Analysis Approach for Accidental Releases through Multiple Leak Paths

    SciTech Connect

    POLIZZI, LM

    2004-04-28

    This paper presents a consequence analysis for a postulated fire accident in a building containing plutonium when the resulting outside release is partly through the ventilation/filtration system and partly through other pathways such as building access doorways. When analyzing an accident scenario involving the release of radioactive powders inside a building, various pathways for the release to the outside environment can exist. This study is presented to guide the analyst on how the multiple building leak path factors (a combination of filtered and unfiltered releases) can be evaluated in an integrated manner, starting with the source term calculation and proceeding through the receptor consequence determination. The analysis is performed in a two-step process. The first step is to calculate the leak path factor, which represents the fraction of respirable radioactive powder made airborne that leaves the building through the various pathways. The computer code of choice for this determination is MELCOR. The second step is to model the transport and dispersion of powder material released to the atmosphere and to estimate the resulting dose received by the downwind receptors of interest. The MACCS computer code is chosen for this part of the analysis. This work can be used as a model for performing analyses of similar systems where releases can propagate to the outside environment via filtered and unfiltered pathways. The methodology provides guidance to analysts, outlining the essential steps needed to perform a sound and defensible consequence analysis.
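    The two-step flow described in this record (a leak path factor, then dispersion and dose) can be illustrated with a minimal sketch. This is not the paper's calculation: the five-factor source-term form follows common DOE-HDBK-3010 conventions, and every numeric input below (flow fractions, pathway LPFs, MAR, chi/Q, breathing rate, dose conversion factor) is a hypothetical placeholder.

```python
def source_term(mar_g, dr, arf, rf, lpf):
    """Respirable release (g) = MAR * DR * ARF * RF * LPF
    (material at risk, damage ratio, airborne release fraction,
    respirable fraction, leak path factor)."""
    return mar_g * dr * arf * rf * lpf

def combined_lpf(fractions_lpfs):
    """Pathway-weighted leak path factor.

    fractions_lpfs: list of (flow_fraction, pathway_lpf) pairs,
    e.g. a filtered ventilation path and an unfiltered doorway path.
    Flow fractions must sum to 1.
    """
    assert abs(sum(f for f, _ in fractions_lpfs) - 1.0) < 1e-9
    return sum(f * lpf for f, lpf in fractions_lpfs)

# Hypothetical inputs, for illustration only.
lpf = combined_lpf([(0.9, 1e-4),   # 90% of flow through filtered ventilation
                    (0.1, 0.5)])   # 10% through unfiltered doorways
st = source_term(mar_g=1000.0, dr=1.0, arf=1e-3, rf=0.1, lpf=lpf)

# Dose (rem) = release (g) * chi/Q (s/m^3) * breathing rate (m^3/s)
#            * dose conversion factor (rem per g inhaled) -- all placeholders.
dose_rem = st * 1e-4 * 3.3e-4 * 1e3
print(lpf, st, dose_rem)
```

    Note how the small unfiltered fraction dominates the combined leak path factor; that is the point of evaluating the pathways in an integrated manner rather than crediting the filtration system alone.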

  19. Use of Human Factors Analysis for Wildland Fire Accident Investigations

    Microsoft Academic Search

    Michelle Ryerson; Chuck Whitlock

    2005-01-01

    Accident investigators at any level are challenged with identifying causal factors and making preventative recommendations. This task can be particularly complicated considering that 70-80% of accidents are associated with human error. Due to complexities of the wildland fire environment, this is especially challenging when investigating a wildland fire-related accident. Upon reviewing past accident investigations within the United States Federal wildland

  20. The prevention of slipping accidents: a review and discussion of work related to the methodology of measuring slip resistance

    Microsoft Academic Search

    S. Leclercq

    1999-01-01

    The recommendations made after the analysis of accidents following an incident of slipping often include the use of anti-slip footwear and\\/or the installation of an anti-slip floor covering. Such recommendations make it necessary to study biomechanical and tribologic phenomena that occur during slipping, in particular in order to develop criteria for the evaluation of the slip resistance of footwear and

  1. Disposal Criticality Analysis Methodology: BWR Benchmarks

    SciTech Connect

    D.P. Henderson; D.A. Salmon

    1999-08-01

    Computer code benchmarks using commercial reactor critical (CRC) data for boiling water reactor (BWR) fuel assemblies using the SCALE and MCNP code packages have been conducted. Depleted fuel inventories which take into account actinide and fission product concentrations are used to develop reactor critical models and the associated neutron multiplication factors. Bias calculated from this integral benchmark method will be applied to the disposal criticality analysis methodology to ensure the sub-criticality of spent commercial nuclear fuel forecast for emplacement into the proposed geologic repository at Yucca Mountain. Previous CRC benchmark calculations have been performed for startup tests for Cycles 13 and 14 of the Quad Cities Unit 2 BWR. Additional benchmarking activities have been performed and applied to evaluations of beginning-of-cycle (BOC) reactor critical models for Cycles 7 and 8 of the LaSalle Unit 1 BWR. Similar to the methodology used for ensuring sub-critical margin for spent nuclear fuel shipping casks, the proposed criticality analysis approach computes the neutron multiplication factor of arbitrary fuel assemblies placed in spent fuel waste packages that represents a bounding criticality model. This is accomplished by calculating spent fuel inventories with the SAS2H sequence of the SCALE code package and computing the neutron multiplication of the spent fuel assemblies in the waste package with MCNP.
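    The bias step described in this record can be sketched numerically. The k-eff values and the administrative margin below are hypothetical illustrations, not figures from the Quad Cities or LaSalle benchmarks, and the upper-subcritical-limit form is a generic convention rather than the approved methodology's exact expression.

```python
from statistics import mean, stdev

# Hypothetical calculated k-eff values for configurations known to be
# critical (k-eff = 1.0 exactly), e.g. commercial reactor criticals.
k_calc = [0.9968, 0.9975, 0.9981, 0.9957, 0.9990]

bias = mean(k_calc) - 1.0      # negative bias: the code underpredicts k-eff
bias_uncert = stdev(k_calc)    # spread of the benchmark results

# A conservative upper subcritical limit (USL): take no credit for a
# positive bias, and subtract the uncertainty and an administrative
# margin (0.05 here, an assumption).
admin_margin = 0.05
usl = 1.0 + min(bias, 0.0) - bias_uncert - admin_margin
print(round(bias, 5), round(usl, 4))
```

    A waste-package configuration would then be judged acceptable only if its calculated k-eff falls below this limit.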

  2. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  3. A general methodology for population analysis

    NASA Astrophysics Data System (ADS)

    Lazov, Petar; Lazov, Igor

    2014-12-01

    For a given population with N, the current, and M, the maximum number of entities, modeled by a Birth-Death Process (BDP) of size M+1, we introduce a utilization parameter ρ, the ratio of the primary birth and death rates in that BDP, which physically determines the (equilibrium) macrostates of the population, and an information parameter, interpreted as the population information stiffness. The BDP modeling the population is in state n, n=0,1,…,M, if N=n. With these two key metrics, applying the continuity law, the equilibrium balance equations for the probability distribution pn, n=0,1,…,M, of the quantity N, pn=Prob{N=n}, in equilibrium, and the conservation law, and relying on the fundamental concepts of population information and population entropy, we develop a general methodology for population analysis; by definition, population entropy is the uncertainty related to the population. In this approach, and this is its essential contribution, the population information consists of three basic parts: an elastic (Hooke's) or absorption/emission part, a synchronization or inelastic part, and a null part; the first two parts, which uniquely determine the null part (the null part connects them), are the two basic components of the Information Spectrum of the population. Population entropy, as the mean value of population information, follows this division of the information. A given population can function in an information elastic, antielastic, or inelastic regime. In an information linear population, the synchronization part of the information and entropy is absent. The population size, M+1, is the third key metric in this methodology: for a population of infinite size, most of the key quantities and results that emerge in this methodology for populations of finite size vanish.
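    The equilibrium balance equations invoked in this record are the standard birth-death detailed-balance relations. As a generic sketch (not the authors' specific information-spectrum formalism), with birth rates λ_n and death rates μ_n:

```latex
% Detailed balance between neighboring states of a finite BDP:
\lambda_{n-1}\, p_{n-1} = \mu_n\, p_n, \qquad n = 1, \dots, M,

% which yields the equilibrium distribution
p_n = p_0 \prod_{i=1}^{n} \frac{\lambda_{i-1}}{\mu_i},
\qquad
p_0 = \Biggl( \sum_{n=0}^{M} \prod_{i=1}^{n} \frac{\lambda_{i-1}}{\mu_i} \Biggr)^{-1}.

% For constant rates \lambda, \mu with utilization \rho = \lambda/\mu,
% this reduces to a truncated geometric distribution:
p_n = \frac{(1-\rho)\,\rho^{\,n}}{1-\rho^{\,M+1}}, \qquad \rho \neq 1.
```

    The truncated-geometric case shows concretely how a single utilization parameter fixes the equilibrium macrostate of the population.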

  4. Statistical analysis of sudden chemical leakage accidents reported in China between 2006 and 2011.

    PubMed

    Li, Yang; Ping, Hua; Ma, Zhi-Hong; Pan, Li-Gang

    2014-04-01

    Based on data from authoritative sources, 1,400 sudden leakage accidents that occurred in China from 2006 to 2011 were investigated; of these, 666 accidents with no or little damage were used to abstract statistical characteristics. The research results were as follows: (1) Time fluctuation: the yearly number of sudden leakage accidents decreased from 2006 to 2010, with a slight increase in 2011. Sudden leakage accidents occur mainly in summer, and more than half occur from May to September. (2) Regional distribution: the accidents are highly concentrated in the coastal area, and accidents arise more readily from small and medium-sized enterprises than from larger ones. (3) Pollutants: hazardous chemicals account for up to 95% of sudden leakage accidents. (4) Steps: transportation represents almost half of the accidents, followed by production, usage, storage, and discard. (5) Pollution and casualties: such accidents readily cause environmental pollution and casualties. (6) Causes: more than half of the cases were caused by human factors, followed by management failures and equipment failure; sudden chemical leakage may also be caused by high temperature, rain, wet roads, and terrain. (7) Principal component analysis: five factors were extracted, covering pollution, casualties, regional distribution, steps, and month. From this analysis, the characteristics, causes, and damages of sudden leakage accidents can be investigated, and advice for prevention and rescue can be derived. PMID:24407779

  5. Offsite radiological consequence analysis for the bounding aircraft crash accident

    SciTech Connect

    OBERG, B.D.

    2003-03-22

    The purpose of this calculation note is to quantitatively analyze a bounding aircraft crash accident for comparison to the DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', Appendix A, Evaluation Guideline of 25 rem. The potential of aircraft impacting a facility was evaluated using the approach given in DOE-STD-3014-96, ''Accident Analysis for Aircraft Crash into Hazardous Facilities''. The following aircraft crash frequencies were determined for the Tank Farms in RPP-11736, ''Assessment Of Aircraft Crash Frequency For The Hanford Site 200 Area Tank Farms'': (1) The total aircraft crash frequency is ''extremely unlikely.'' (2) The general aviation crash frequency is ''extremely unlikely.'' (3) The helicopter crash frequency is ''beyond extremely unlikely.'' (4) For the Hanford Site 200 Areas, other aircraft type, commercial or military, each above ground facility, and any other type of underground facility is ''beyond extremely unlikely.'' As the potential of aircraft crash into the 200 Area tank farms is more frequent than ''beyond extremely unlikely,'' consequence analysis of the aircraft crash is required.
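    The likelihood categories quoted in this record can be tied to frequency bins. A minimal sketch, assuming the commonly cited DOE-STD-3009-style per-year bin boundaries (verify against the governing standard for any real application):

```python
def likelihood_bin(freq_per_year):
    """Map an event frequency (per year) to a qualitative likelihood bin."""
    if freq_per_year >= 1e-2:
        return "anticipated"
    if freq_per_year >= 1e-4:
        return "unlikely"
    if freq_per_year >= 1e-6:
        return "extremely unlikely"
    return "beyond extremely unlikely"

# Events binned "beyond extremely unlikely" (< 1e-6/yr) may be screened
# from the design basis accident envelope; more frequent events require
# consequence analysis, as concluded for the aircraft crash above.
print(likelihood_bin(3e-6))   # extremely unlikely -> analysis required
print(likelihood_bin(5e-7))   # beyond extremely unlikely -> may be screened
```

    This is why the crash-frequency assessment matters: the bin assignment, not the consequence alone, decides whether a quantitative analysis is required at all.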

  6. Cost analysis methodology: Photovoltaic Manufacturing Technology Project

    SciTech Connect

    Whisnant, R.A. (Research Triangle Inst., Research Triangle Park, NC (United States))

    1992-09-01

    This report describes work done under Phase 1 of the Photovoltaic Manufacturing Technology (PVMaT) Project. PVMaT is a five-year project to support the translation of research and development in PV technology into the marketplace. PVMaT, conceived as a DOE/industry partnership, seeks to advance PV manufacturing technologies, reduce PV module production costs, increase module performance, and expand US commercial production capacities. Under PVMaT, manufacturers will propose specific manufacturing process improvements that may contribute to the goal of the project, which is to lessen cost and thus hasten entry into larger scale, grid-connected applications. Phase 1 of the PVMaT project is to identify obstacles and problems associated with manufacturing processes. This report describes the cost analysis methodology, required under Phase 1, that will allow subcontractors to be ranked and evaluated during Phase 2.

  7. Analysis of Reactivity Induced Accidents for HTR-10

    Microsoft Academic Search

    KOSE Serhat; KILIC Ihsan

    Initiating events for accidents have been classified into five groups for HTR-10 by the Chinese authorities. The fourth group (Class IV) events are considered Design Basis Accident (DBA) cases with regard to their consequences. There are three types of initiating events in Class IV resulting in reactivity accidents: …

  8. Analysis of surface powered haulage accidents, January 1990--July 1996

    SciTech Connect

    Fesak, G.M.; Breland, R.M.; Spadaro, J. [Dept. of Labor, Arlington, VA (United States)

    1996-12-31

    This report addresses surface haulage accidents that occurred between January 1990 and July 1996 involving haulage trucks (including over-the-road trucks), front-end-loaders, scrapers, utility trucks, water trucks, and other mobile haulage equipment. The study includes quarries, open pits and surface coal mines utilizing self-propelled mobile equipment to transport personnel, supplies, rock, overburden material, ore, mine waste, or coal for processing. A total of 4,397 accidents were considered. This report summarizes the major factors that led to the accidents and recommends accident prevention methods to reduce the frequency of these accidents.

  9. Summary of the SRS Severe Accident Analysis Program, 1987--1992

    SciTech Connect

    Long, T.A.; Hyder, M.L.; Britt, T.E.; Allison, D.K.; Chow, S.; Graves, R.D.; DeWald, A.B. Jr.; Monson, P.R. Jr.; Wooten, L.A.

    1992-11-01

    The Severe Accident Analysis Program (SAAP) is a program of experimental and analytical studies aimed at characterizing severe accidents that might occur in the Savannah River Site Production Reactors. The goals of the Severe Accident Analysis Program are: To develop an understanding of severe accidents in SRS reactors that is adequate to support safety documentation for these reactors, including the Safety Analysis Report (SAR), the Probabilistic Risk Assessment (PRA), and other studies evaluating the safety of reactor operation; To provide tools and bases for the evaluation of existing or proposed safety related equipment in the SRS reactors; To provide bases for the development of accident management procedures for the SRS reactors; To develop and maintain on the site a sufficient body of knowledge, including documents, computer codes, and cognizant engineers and scientists, that can be used to authoritatively resolve questions or issues related to reactor accidents. The Severe Accident Analysis Program was instituted in 1987 and has already produced a substantial amount of information, and specialized calculational tools. Products of the Severe Accident Analysis Program (listed in Section 9 of this report) have been used in the development of the Safety Analysis Report (SAR) and the Probabilistic Risk Assessment (PRA), and in the development of technical specifications for the SRS reactors. A staff of about seven people is currently involved directly in the program and in providing input on severe accidents to other SRS activities.

  10. A Review of Citation Analysis Methodologies for Collection Management

    ERIC Educational Resources Information Center

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  11. An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)

    NASA Technical Reports Server (NTRS)

    Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

    2002-01-01

    A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots and those flown by professional pilots. Private pilots, flying low-cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that result from exacting missions or the use of specialized equipment. For both groups, judgement error is more likely to lead to a fatal accident than are other types of causes. Several approaches to improving the rotorcraft accident rate are recommended. These mostly address improvement in the training of new pilots and improving the safety awareness of private pilots.

  12. Analysis of Waste Leak and Toxic Chemical Release Accidents from Waste Feed Delivery (WFD) Diluent System

    SciTech Connect

    WILLIAMS, J.C.

    2000-09-15

    Radiological and toxicological consequences are calculated for 4 postulated accidents involving the Waste Feed Delivery (WFD) diluent addition systems. Consequences for the onsite and offsite receptor are calculated. This analysis contains technical information used to determine the accident consequences for the River Protection Project (RPP) Final Safety Analysis Report (FSAR).

  13. Fault seal analysis: Methodology and case studies

    SciTech Connect

    Badley, M.E.; Freeman, B.; Needham, D.T. [Earth Sciences Limited, Lincolnshire (United Kingdom)

    1996-12-31

    Fault seal can arise from reservoir/non-reservoir juxtaposition or by development of fault rock of high entry-pressure. The methodology for evaluating these possibilities uses detailed seismic mapping and well analysis. A "first-order" seal analysis involves identifying reservoir juxtaposition areas over the fault surface, using the mapped horizons and a refined reservoir stratigraphy defined by isochores at the fault surface. The "second-order" phase of the analysis assesses whether the sand-sand contacts are likely to support a pressure difference. We define two lithology-dependent attributes, "Gouge Ratio" and "Smear Factor". Gouge Ratio is an estimate of the proportion of fine-grained material entrained into the fault gouge from the wall rocks. Smear Factor methods estimate the profile thickness of a ductile shale drawn along the fault zone during faulting. Both of these parameters vary over the fault surface, implying that faults cannot simply be designated "sealing" or "non-sealing". An important step in using these parameters is to calibrate them in areas where across-fault pressure differences are explicitly known from wells on both sides of a fault. Our calibration for a number of datasets shows remarkably consistent results despite their diverse settings (e.g. Brent Province, Niger Delta, Columbus Basin). For example, a Shale Gouge Ratio of c. 20% (volume of shale in the slipped interval) is a typical threshold between minimal across-fault pressure difference and significant seal.
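    The Shale Gouge Ratio referred to in this record has a standard definition in the fault-seal literature; a sketch of the usual form (the authors' exact implementation may differ): for a point on the fault with throw t, summing over the beds of thickness Δz_i and shale volume fraction V_sh,i that have slipped past that point,

```latex
\mathrm{SGR} \;=\; \frac{\sum_i V_{sh,i}\,\Delta z_i}{t} \times 100\%
```

    On this definition, the calibrated threshold quoted in the abstract corresponds to SGR ≈ 20%: below it, across-fault pressure differences are minimal; above it, significant seal becomes likely.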

  14. Fault seal analysis: Methodology and case studies

    SciTech Connect

    Badley, M.E.; Freeman, B.; Needham, D.T. (Earth Sciences Limited, Lincolnshire (United Kingdom))

    1996-01-01

Fault seal can arise from reservoir/non-reservoir juxtaposition or by development of fault rock of high entry-pressure. The methodology for evaluating these possibilities uses detailed seismic mapping and well analysis. A 'first-order' seal analysis involves identifying reservoir juxtaposition areas over the fault surface, using the mapped horizons and a refined reservoir stratigraphy defined by isochores at the fault surface. The 'second-order' phase of the analysis assesses whether the sand-sand contacts are likely to support a pressure difference. We define two lithology-dependent attributes, 'Gouge Ratio' and 'Smear Factor'. Gouge Ratio is an estimate of the proportion of fine-grained material entrained into the fault gouge from the wall rocks. Smear Factor methods estimate the profile thickness of a ductile shale drawn along the fault zone during faulting. Both of these parameters vary over the fault surface, implying that faults cannot simply be designated 'sealing' or 'non-sealing'. An important step in using these parameters is to calibrate them in areas where across-fault pressure differences are explicitly known from wells on both sides of a fault. Our calibration for a number of datasets shows remarkably consistent results despite their diverse settings (e.g. Brent Province, Niger Delta, Columbus Basin). For example, a Shale Gouge Ratio of c. 20% (volume of shale in the slipped interval) is a typical threshold between minimal across-fault pressure difference and significant seal.

  15. Advanced accident sequence precursor analysis level 1 models

    SciTech Connect

Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]

    1996-03-01

INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant-accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. The Revision 2 models incorporate more detailed plant information concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for the selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.
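The conditional core damage probability evaluation performed by GEM can be illustrated with a toy cut-set model: observed precursor events are set to probability 1.0 and the model is requantified. The cut sets, event names, and probabilities below are invented, and the rare-event approximation stands in for SAPHIRE's actual quantification:

```python
import math

# Toy conditional core damage probability (CCDP) from minimal cut sets.
# Cut sets and probabilities are illustrative, not from a real plant model.

def ccdp(cut_sets, probs, observed_failures):
    """Set observed events to probability 1.0, then quantify core damage
    probability with the rare-event approximation (sum over cut sets of
    the product of basic-event probabilities)."""
    p = dict(probs)
    for ev in observed_failures:
        p[ev] = 1.0
    return sum(math.prod(p[ev] for ev in cs) for cs in cut_sets)

cut_sets = [["LOOP", "EDG_A", "EDG_B"],   # blackout: both diesels fail
            ["LOOP", "BATT_DEPLETE"]]     # batteries deplete before recovery
probs = {"LOOP": 0.05, "EDG_A": 0.01, "EDG_B": 0.01, "BATT_DEPLETE": 1e-4}

# Precursor: a loss of offsite power (LOOP) actually occurred.
print(f"CCDP = {ccdp(cut_sets, probs, ['LOOP']):.1e}")  # CCDP = 2.0e-04
```

Setting the initiating event to 1.0 is what turns an unconditional core damage frequency into the conditional probability used to rank precursor events.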

  16. The accident analysis of mobile mine machinery in Indian opencast coal mines.

    PubMed

    Kumar, R; Ghosh, A K

    2014-01-01

This paper presents the analysis of large mining machinery related accidents in Indian opencast coal mines. The trends of coal production, share of mining methods in production, machinery deployment in opencast mines, size and population of machinery, accidents due to machinery, and types and causes of accidents have been analysed for the years 1995 to 2008. The scrutiny of accidents during this period reveals that most of the responsible factors are machine reversal, haul road design, human fault, operator's fault, machine fault, visibility and dump design. Considering the types of machines (dumpers, excavators, dozers and loaders) together, the maximum number of fatal accidents was caused by operator's faults and human faults jointly during the period from 1995 to 2008. The novel finding of this analysis is that large machines with state-of-the-art safety systems did not reduce fatal accidents in Indian opencast coal mines. PMID:23324038

  17. MACCS usage at Rocky Flats Plant for consequence analysis of postulated accidents

    SciTech Connect

    Foppe, T.L.; Peterson, V.L.

    1993-10-01

The MELCOR Accident Consequence Code System (MACCS) has been applied to the radiological consequence assessment of potential accidents from a non-reactor nuclear facility. MACCS has been used in a variety of applications to evaluate radiological dose and health effects to the public from postulated plutonium releases and from postulated criticalities. These applications were conducted to support deterministic and probabilistic accident analyses for safety analysis reports, radiological sabotage studies, and other regulatory requests.

  18. An analysis to determine correlations of freeway traffic accidents with specific geometric design features

    E-print Network

    Smith, Frank Miller

    1960-01-01

fulfillment of the requirements for the degree of MASTER OF SCIENCE, August, 1960. Major Subject: Civil Engineering. AN ANALYSIS TO DETERMINE CORRELATIONS OF FREEWAY TRAFFIC ACCIDENTS WITH SPECIFIC GEOMETRIC DESIGN FEATURES. A Thesis By Frank Miller... of control") were investigated for assignable causes. It was reported that such causes were traced in 86% of the locations having such accidents. Other techniques have been employed as traffic engineers and other accident investigators have made wider...
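A correlation analysis of the kind this thesis describes can be sketched with a plain Pearson coefficient. The sample data pairing a geometric feature (interchange density) with accident rates are invented for illustration, not taken from the thesis:

```python
import math

# Pearson correlation between a geometric design feature and accident
# rates; the sample data below (interchanges per mile vs accidents per
# million vehicle miles) are invented for illustration.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ramp_density = [0.5, 1.0, 1.5, 2.0, 3.0]    # interchanges per mile
accident_rate = [1.1, 1.4, 1.9, 2.2, 3.1]   # accidents per MVM
r = pearson_r(ramp_density, accident_rate)
print(f"r = {r:.3f}")  # close to +1: strong positive association
```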

  19. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    SciTech Connect

Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun [Korea Electric Power Research Institute (Korea, Republic of)]

    2004-10-15

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

  20. Techniques and methodologies for risk analysis in chemical process industries

    Microsoft Academic Search

    Faisal I. Khan; S. A. Abbasi

    1998-01-01

    This paper presents a state-of-art-review of the available techniques and methodologies for carrying out risk analysis in chemical process industries. It also presents a set of methodologies developed by the authors to conduct risk analysis effectively and optimally.

  1. Interferometric data analysis based on Markov nonlinear filtering methodology

    E-print Network

    Arleo, Angelo

Interferometric data analysis based on Markov nonlinear filtering methodology. Igor P. Gurov. The essence of interferometric data analysis is the solution of the nonlinear inverse problem of the phase... We propose a new interferometric data processing methodology based on a recurrent nonlinear procedure

  2. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    SciTech Connect

Su'ud, Zaki; Anshari, Rio [Nuclear and Biophysics Research Group, Dept. of Physics, Bandung Institute of Technology, Jl.Ganesha 10, Bandung, 40132 (Indonesia)]

    2012-06-06

Loss-of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), especially in the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling, at a much smaller level than in normal operation, is needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, it causes the reactor fuel and other core temperatures to rise and can lead to reactor core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulations were performed to calculate the pressure, water level, and temperature distributions in the reactors during this accident. Two coolant regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average mass flow of steam to the IC system in this event was 10 kg/s, which kept the reactor core covered for about 3.2 hours; the core was fully uncovered 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the SRVs. The average coolant mass flow corresponding to this event was 20 kg/s, which kept the core covered for about 73 hours; the core was fully uncovered 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the RCIC system, the High Pressure Coolant Injection (HPCI) system, and the SRVs. The average water mass flow corresponding to this event was 15 kg/s, which kept the core covered for about 37 hours; the core was fully uncovered 40 hours later.
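The core-uncovery timing in such simulations rests on a simple inventory balance: decay heat boils coolant off faster than makeup replaces it. A minimal sketch of that balance follows, with all plant parameters (latent heat, decay power, water inventory) as rough assumptions rather than values from the study:

```python
# Order-of-magnitude estimate of the time to core uncovery when decay
# heat boils off the water above the core. All plant numbers below
# (latent heat, decay power, inventory) are rough assumptions.

H_FG = 1.5e6  # J/kg, approximate latent heat of vaporization at ~7 MPa

def uncovery_time_h(water_mass_kg, decay_power_w, makeup_kg_s=0.0):
    """Hours until the water inventory above the core boils away."""
    boiloff_kg_s = decay_power_w / H_FG          # steam production rate
    net_loss_kg_s = boiloff_kg_s - makeup_kg_s   # net inventory loss
    if net_loss_kg_s <= 0:
        return float("inf")                      # makeup keeps core covered
    return water_mass_kg / net_loss_kg_s / 3600.0

# ~20 MW decay heat shortly after scram, 150 t of water above the core:
print(f"{uncovery_time_h(1.5e5, 20e6):.1f} h to uncovery without makeup")
```

The hours-scale answer this balance gives is consistent with the simulation results above; the full analysis of course also tracks pressure and temperature, which this sketch ignores.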

  3. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Su'ud, Zaki; Anshari, Rio

    2012-06-01

Loss-of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), especially in the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling, at a much smaller level than in normal operation, is needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens in this condition, it causes the reactor fuel and other core temperatures to rise and can lead to reactor core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulations were performed to calculate the pressure, water level, and temperature distributions in the reactors during this accident. Two coolant regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average mass flow of steam to the IC system in this event was 10 kg/s, which kept the reactor core covered for about 3.2 hours; the core was fully uncovered 4.7 hours later. Two coolant regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the SRVs. The average coolant mass flow corresponding to this event was 20 kg/s, which kept the core covered for about 73 hours; the core was fully uncovered 75 hours later. Three coolant regulating systems were operational on reactor unit 3: the RCIC system, the High Pressure Coolant Injection (HPCI) system, and the SRVs. The average water mass flow corresponding to this event was 15 kg/s, which kept the core covered for about 37 hours; the core was fully uncovered 40 hours later.

  4. Effects of road lighting: an analysis based on Dutch accident statistics 1987-2006.

    PubMed

    Wanvik, Per Ole

    2009-01-01

This study estimates the safety effect of road lighting on accidents in darkness on Dutch roads, using data from an interactive database containing 763,000 injury accidents and 3.3 million property damage accidents covering the period 1987-2006. Two estimators of effect are used, and the results are combined by applying techniques of meta-analysis. Injury accidents are reduced by 50%. This effect is larger than the effects found in most of the earlier studies. The effect on fatal accidents is slightly larger than the effect on injury accidents. The effect during twilight is about 2/3 of the effect in darkness. The effect of road lighting is significantly smaller during adverse weather and road surface conditions than during fine conditions. The effects on pedestrian, bicycle and moped accidents are significantly larger than the effects on automobile and motorcycle accidents. The risk of injury accidents was found to increase in darkness. The average increase in risk was estimated at 17% on lit rural roads and 145% on unlit rural roads. The average increase in risk during rainy conditions is about 50% on lit rural roads and about 190% on unlit rural roads. The average increase in risk with respect to pedestrian accidents is about 140% on lit rural roads and about 360% on unlit rural roads. PMID:19114146
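Combining two estimators "by applying techniques of meta-analysis", as above, typically means fixed-effect inverse-variance pooling on the log scale. A minimal sketch follows, with invented effect estimates and variances (not the study's actual numbers):

```python
import math

# Fixed-effect inverse-variance pooling of two log-scale effect
# estimates. The example estimates and variances are invented.

def pool(log_effects, variances):
    """Weight each estimate by 1/variance; return pooled log effect
    and its standard error."""
    w = [1.0 / v for v in variances]
    pooled = sum(wi * e for wi, e in zip(w, log_effects)) / sum(w)
    se = math.sqrt(1.0 / sum(w))
    return pooled, se

# Two hypothetical estimators of the darkness accident ratio:
est = [math.log(0.48), math.log(0.53)]
var = [0.01, 0.02]
logr, se = pool(est, var)
lo, hi = math.exp(logr - 1.96 * se), math.exp(logr + 1.96 * se)
print(f"pooled effect: {math.exp(logr):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The more precise estimator (smaller variance) dominates the pooled value, which is the point of inverse-variance weighting.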

  5. Analysis Methodology for Industrial Load Profiles 

    E-print Network

    Reddoch, T. W.

    1991-01-01

    A methodology is provided for evaluating the impact of various demand-side management (DSM) options on industrial customers. The basic approach uses customer metered load profile data as a basis for the customer load shape. DSM technologies...

  6. Progress in accident analysis of the HYLIFE-II inertial fusion energy power plant design

    SciTech Connect

    Reyes, S; Latkowski, J F; Gomez del Rio, J; Sanz, J

    2000-10-11

The present work continues our effort to perform an integrated safety analysis for the HYLIFE-II inertial fusion energy (IFE) power plant design. Recently we developed a base case for a severe accident scenario in order to calculate accident doses for HYLIFE-II. It consisted of a total loss of coolant accident (LOCA) in which all the liquid flibe (Li{sub 2}BeF{sub 4}) was lost at the beginning of the accident. Results showed that the off-site dose was below the limit given by the DOE Fusion Safety Standards for public protection in case of accident, and that this dose was dominated by the tritium released during the accident.

  7. Analysis of the Three Mile Island accident and alternative sequences

    Microsoft Academic Search

    R. O. Wooton; R. S. Denning; P. Cybulskis

    1980-01-01

    A number of analyses were performed with the MARCH computer code to assist the TMI Special Inquiry Group. The MARCH code predicts the thermal and hydraulic conditions in the reactor primary system and containment building in core meltdown accidents. The purpose of the analyses was to examine a number of variations in system operation in the TMI accident to evaluate

  8. ANALYSIS OF JCO CRITICALITY ACCIDENT FROM VIEWPOINT OF RISK MANAGEMENT

    Microsoft Academic Search

    Satoshi Kurita

    The uranium criticality incident at a JCO plant in 1999 was Japan's worst-ever nuclear accident (INES Level 4). Sixty-nine persons were exposed to radiation, and of those two died. SMM, JCO's holding company, paid out about 11 million dollars in compensation to people and companies in the affected area. The direct cause of the accident was very clear: use of

  9. GPHS-RTG launch accident analysis for Galileo and Ulysses

    SciTech Connect

    Bradshaw, C.T. (General Electric Company, Astro-Space Division, P.O. Box 8555, Philadelphia, Pennsylvania 19101 (US))

    1991-01-01

    This paper presents the safety program conducted to determine the response of the General Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) to potential launch accidents of the Space Shuttle for the Galileo and Ulysses missions. The National Aeronautics and Space Administration (NASA) provided definition of the Shuttle potential accidents and characterized the environments. The Launch Accident Scenario Evaluation Program (LASEP) was developed by GE to analyze the RTG response to these accidents. RTG detailed response to Solid Rocket Booster (SRB) fragment impacts, as well as to other types of impact, was obtained from an extensive series of hydrocode analyses. A comprehensive test program was conducted also to determine RTG response to the accident environments. The hydrocode response analyses coupled with the test data base provided the broad range response capability which was implemented in LASEP.

  10. Aircraft Accident Prevention: Loss-of-Control Analysis Harry G. Kwatny

    E-print Network

    Kwatny, Harry G.

Aircraft Accident Prevention: Loss-of-Control Analysis. Harry G. Kwatny, Jean-Etienne T. Dongmo. NASA Langley Research Center, MS 161, Hampton, VA, 23681. The majority of fatal aircraft accidents... the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft

  11. Analysis of accidents in Greek shipping during the pre- and post-ISM period

    Microsoft Academic Search

    Ernestos Tzannatos; Dimitris Kokotos

    2009-01-01

    In shipping, safety depends on the reliability of the technical and human components of the ship-system, although the marine environment itself may sometimes be so hostile as to give rise to accidents that are beyond technical and human control. The need for a continuous analysis of shipping accidents is dictated by the accumulated evidence for the predominance of the human

  12. RADIS - a regional nuclear accident consequence analysis model for Hong Kong

    Microsoft Academic Search

    Mankit Ray Yeung; E. M. K. Ching

    1993-01-01

    An atmospheric dispersion and consequence model called RADIS has been developed by the University of Hong Kong for nuclear accident consequence analysis. The model uses a two-dimensional plume trajectory derived from wind data for Hong Kong. Dose, health effects, and demographic models are also developed and implemented in RADIS so that accident consequences in 15 major population centers of Greater
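RADIS itself uses a two-dimensional plume trajectory derived from wind data; as an illustration of the dispersion building block only, here is the familiar straight-line Gaussian plume expression for the ground-level centreline concentration, with the release rate, wind speed, and dispersion parameters all assumed:

```python
import math

# Ground-level centreline concentration of a steady Gaussian plume:
#   C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 sigma_z^2))
# Release rate, wind speed, sigmas, and stack height are assumptions.

def plume_concentration(q_bq_s, u_m_s, sigma_y, sigma_z, h_m):
    """Bq/m^3 at ground level on the plume centreline for a release
    rate q_bq_s at effective height h_m."""
    return (q_bq_s / (math.pi * u_m_s * sigma_y * sigma_z)
            * math.exp(-h_m ** 2 / (2.0 * sigma_z ** 2)))

# 1e12 Bq/s release, 4.5 m/s wind, sigmas typical of ~1 km downwind,
# 50 m effective release height:
c = plume_concentration(1e12, 4.5, 80.0, 50.0, 50.0)
print(f"{c:.3e} Bq/m^3")
```

A trajectory model like RADIS replaces the straight-line assumption with a wind-driven plume path, but the per-segment concentration calculation has this same shape.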

  13. Safety and Response-Time Analysis of an Automotive Accident Assistance Service

    E-print Network

    Gilmore, Stephen

Safety and Response-Time Analysis of an Automotive Accident Assistance Service. Ashok Argent... of the service which they provide in terms of both its correctness of function and its speed of response. One way... the on-board diagnostic and communication systems in high-end cars to provide an accident assistance

  14. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

Carbajo, Juan (Oak Ridge National Laboratory, Oak Ridge, TN); Jeong, Hae-Yong (Korea Atomic Energy Research Institute, Daejeon, Korea); Wigeland, Roald (Idaho National Laboratory, Idaho Falls, ID); Corradini, Michael (University of Wisconsin, Madison, WI); Schmidt, Rodney Cannon; Thomas, Justin (Argonne National Laboratory, Argonne, IL); Wei, Tom (Argonne National Laboratory, Argonne, IL); Sofu, Tanju (Argonne National Laboratory, Argonne, IL); Ludewig, Hans (Brookhaven National Laboratory, Upton, NY); Tobita, Yoshiharu (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Ohshima, Hiroyuki (Japan Atomic Energy Agency, Ibaraki-ken, Japan); Serre, Frederic (Centre d'études nucléaires de Cadarache - CEA, France)

    2011-06-01

This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and to identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the experienced user base and the experimental validation base were decaying away quickly.

  15. PWR integrated safety analysis methodology using multi-level coupling algorithm

    NASA Astrophysics Data System (ADS)

    Ziabletsev, Dmitri Nickolaevich

Coupled three-dimensional (3D) neutronics/thermal-hydraulic (T-H) system codes give a unique opportunity for realistic modeling of the plant transients and design basis accidents (DBA) occurring in light water reactors (LWR). Examples of such DBAs are the rod ejection accident (REA) and the main steam line break (MSLB), which constitute the bounding safety problems for pressurized water reactors (PWR). These accidents involve asymmetric 3D spatial neutronic and T-H effects during the course of the transients. The thermal margins (the peak fuel temperature and the departure from nucleate boiling ratio (DNBR)) are the measures of safety in a particular transient and need to be evaluated as accurately as possible. Modern 3D neutronics/T-H coupled codes estimate the safety margins coarsely at the assembly level, i.e. for an average fuel pin. More accurate prediction of the safety margins requires evaluation of the transient fuel rod response involving locally coupled neutronics/T-H calculations. The proposed approach is to perform an on-line hot-channel safety analysis not for the whole core but for a selected local region, for example the highest-power-loaded fuel assembly. This approach becomes feasible if an on-line algorithm capable of extracting the necessary input data for a sub-channel module is available. The necessary input data include the detailed pin-power distributions and the T-H boundary conditions for each sub-channel in the considered problem. Therefore, two potential challenges are faced in the development of a refined methodology for evaluation of local safety parameters. One is the development of an efficient transient pin-power reconstruction algorithm with consistent cross-section modeling.
The second is the development of a multi-level coupling algorithm for the T-H boundary and feedback data exchange between the sub-channel module and the main 3D neutron kinetics/T-H system code, which already uses one level of coupling between the 3D neutronics and core thermal-hydraulics models. The major accomplishment of the thesis is the development of an integrated PWR safety analysis methodology with locally refined safety evaluations. This involved introduction of an improved method capable of efficiently restoring the fine pin-power distribution with a high degree of accuracy. In order to apply the methodology to evaluate the safety margins at the pin level, a refined on-line hot-channel model was developed, accounting for the cross-flow effects. Finally, this methodology was applied to best-estimate safety analysis to more accurately calculate the thermal safety margins occurring during a design basis accident in a PWR.

  16. Incorporation of phenomenological uncertainties in probabilistic safety analysis - application to LMFBR core disruptive accident energetics

    SciTech Connect

    Najafi, B; Theofanous, T G; Rumble, E T; Atefi, B

    1984-08-01

This report describes a method for quantifying the frequency and consequence uncertainty distributions associated with core disruptive accidents (CDAs). The method was developed to estimate the frequency and magnitude of energy impacting the reactor vessel head of the Clinch River Breeder Reactor Plant (CRBRP) given the occurrence of hypothetical CDAs. The methodology is illustrated using the CRBRP example.
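The frequency/consequence uncertainty quantification described above can be sketched as a Monte Carlo propagation: sample a frequency and a consequence from their uncertainty distributions and accumulate the product. The lognormal parameters below are illustrative placeholders, not CRBRP values:

```python
import random

# Monte Carlo propagation of frequency and consequence uncertainty for
# a hypothetical core disruptive accident. The lognormal parameters
# (medians, spreads) are illustrative placeholders.

random.seed(1)  # reproducible sampling

def sample_risk(n=100_000):
    """Draw (frequency, energy) pairs and return the mean annual
    energy release rate: frequency [1/yr] x consequence [MJ]."""
    total = 0.0
    for _ in range(n):
        freq = random.lognormvariate(-14.0, 1.5)   # ~1e-6/yr median
        energy = random.lognormvariate(4.0, 0.8)   # ~55 MJ median
        total += freq * energy
    return total / n

print(f"mean risk = {sample_risk():.2e} MJ/yr")
```

In practice the full joint distribution (not just its mean) is retained, so that percentile curves of energy versus exceedance frequency can be reported.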

  17. Use of inelastic analysis to determine the response of packages to puncture accidents

    SciTech Connect

    Ammerman, D.J.; Ludwigsen, J.S.

    1996-08-01

The accurate analytical determination of the response of radioactive material transportation packages to the hypothetical puncture accident requires inelastic analysis techniques. Use of this improved analysis method reduces the reliance on empirical and approximate methods to determine safety for puncture accidents. This paper will discuss how inelastic analysis techniques can be used to determine the stresses, strains and deformations resulting from puncture accidents for thin-skin materials with different backing materials. A method will be discussed to assure safety for all of these types of packages.

  18. Site-specific meteorology identification for DOE facility accident analysis

    SciTech Connect

    Rabin, S.B.

    1995-09-01

Currently, chemical dispersion calculations performed for safety analysis of DOE facilities assume a Pasquill D stability class with a 4.5 m/s windspeed. These meteorological conditions are assumed to conservatively address the source term generation mechanism as well as the dispersion mechanism, thereby resulting in a net conservative downwind consequence. While choosing this stability class/windspeed combination may result in an overall conservative consequence, the level of conservatism cannot be quantified. The intent of this paper is to document a methodology which incorporates site-specific meteorology to determine a quantifiable consequence of a chemical release. A five-year meteorological database, appropriate for the facility location, is utilized for these chemical consequence calculations, consistent with the approach used for radiological releases. The hourly averages of meteorological conditions have been binned into 21 groups for the chemical consequence calculations. These 21 cases each have a probability of occurrence based on the number of times each case occurred over the five-year sampling period. A code has been developed which automates the running of all the cases with a commercially available air modeling code. The 21 cases are sorted by concentration. A concentration may be selected by the user for a quantified level of conservatism. The methodology presented is intended to improve the technical accuracy and defensibility of chemical source term/dispersion safety analysis work. The result improves the quality of safety analysis products without significantly increasing the cost.
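The binning-and-sorting scheme above can be sketched directly: bin hourly records into (stability class, windspeed) cases, attach occurrence probabilities, then pick the concentration at a chosen percentile. The hourly records and relative concentrations below are stand-ins for the five-year database and the air model output:

```python
from collections import Counter

# Bin hourly meteorology into (stability class, windspeed) cases with
# occurrence probabilities, then select a concentration at a stated
# percentile. Records and concentrations below are illustrative.

hours = [("D", 4.5), ("F", 1.0), ("D", 4.5), ("E", 2.0),
         ("F", 1.0), ("D", 4.5), ("D", 4.5), ("E", 2.0)]

def case_probabilities(records):
    counts = Counter(records)
    n = len(records)
    return {case: c / n for case, c in counts.items()}

def percentile_concentration(case_conc, case_prob, level=0.95):
    """Sort cases by concentration; return the smallest concentration
    whose cumulative probability reaches the requested level."""
    cum = 0.0
    for case, conc in sorted(case_conc.items(), key=lambda kv: kv[1]):
        cum += case_prob[case]
        if cum >= level:
            return conc
    return max(case_conc.values())

probs = case_probabilities(hours)
conc = {("D", 4.5): 1.0, ("E", 2.0): 3.0, ("F", 1.0): 8.0}  # relative
print(percentile_concentration(conc, probs, 0.95))  # 8.0: F/1 m/s governs
```

Choosing the 95th-percentile case, rather than assuming D/4.5 m/s outright, is what makes the stated level of conservatism quantifiable.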

  19. Analysis of Loss-of-Coolant Accidents in the NBSR

    SciTech Connect

    Baek J. S.; Cheng L.; Diamond, D.

    2014-05-23

    This report documents calculations of the fuel cladding temperature during loss-of-coolant accidents in the NBSR. The probability of a pipe failure is small and procedures exist to minimize the loss of water and assure emergency cooling water flows into the reactor core during such an event. Analysis in the past has shown that the emergency cooling water would provide adequate cooling if the water filled the flow channels within the fuel elements. The present analysis is to determine if there is adequate cooling if the water drains from the flow channels. Based on photographs of how the emergency water flows into the fuel elements from the distribution pan, it can be assumed that this water does not distribute uniformly across the flow channels but rather results in a liquid film flowing downward on the inside of one of the side plates in each fuel element and only wets the edges of the fuel plates. An analysis of guillotine breaks shows the cladding temperature remains below the blister temperature in fuel plates in the upper section of the fuel element. In the lower section, the fuel plates are also cooled by water outside the element that is present due to the hold-up pan and temperatures are lower than in the upper section. For small breaks, the simulation results show that the fuel elements are always cooled on the outside even in the upper section and the cladding temperature cannot be higher than the blister temperature. The above results are predicated on assumptions that are examined in the study to see their influence on fuel temperature.

  20. Expert opinion in risk analysis; The NUREG-1150 methodology

    Microsoft Academic Search

    S. C. Hora; R. L. Iman

    1989-01-01

Risk analysis of nuclear power generation often requires the use of expert opinion to provide probabilistic inputs where other sources of information are unavailable or are not cost effective. In the Reactor Risk Reference Document (NUREG-1150), a methodology for the collection of expert opinion was developed. The resulting methodology presented by the author involves a ten-step process: selection of experts,
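The aggregation step of such an elicitation is often a simple equal-weight linear opinion pool over the experts' distributions. A minimal sketch, with three invented expert distributions over coarse outcome bins (not NUREG-1150 data):

```python
# Equal-weight linear pooling of discrete expert probability
# distributions. The three expert distributions are invented.

def linear_pool(expert_pdfs, weights=None):
    """Combine discrete distributions (lists over the same bins) as a
    weighted average; default weights are equal."""
    n = len(expert_pdfs)
    w = weights or [1.0 / n] * n
    bins = len(expert_pdfs[0])
    return [sum(w[i] * expert_pdfs[i][b] for i in range(n))
            for b in range(bins)]

experts = [
    [0.2, 0.5, 0.3],   # expert 1: P(low), P(mid), P(high)
    [0.1, 0.6, 0.3],   # expert 2
    [0.3, 0.4, 0.3],   # expert 3
]
pooled = linear_pool(experts)
print([round(p, 3) for p in pooled])  # [0.2, 0.5, 0.3]
```

Unequal weights can express differential confidence in the experts, though equal weighting is the usual default when performance data are unavailable.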

  1. Social Network Analysis in Human Resource Development: A New Methodology

    Microsoft Academic Search

    John-Paul Hatala

    2006-01-01

Through an exhaustive review of the literature, this article looks at the applicability of social network analysis (SNA) in the field of human resource development. The literature review revealed that a number of disciplines have adopted this unique methodology, which has assisted in the development of theory. SNA is a methodology for examining the structure among actors, groups, and organizations and

  2. A methodology for human factors analysis in office automation systems

    Microsoft Academic Search

    ALEXANDER NIKOV; GIACINTO MATARAZZO; ANTONINO ORLANDO

    1993-01-01

    A methodology for computer-aided human factors analysis in office automation system (OAS) design and implementation process has been developed. It incorporates a fuzzy knowledge-based evaluation mechanism, which is employed to aggregate data measured in scales of different type. The methodology has a high degree of flexibility, which allows it to be adjusted to the individual client situation. A case study

  3. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it by estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economical consequences for different geographical areas and various population groups taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.
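    As an illustration of step (iv), an impact-probability field can be estimated by counting which grid cells each forward trajectory visits at least once. The sketch below is our own toy construction (the function name, grid, and trajectories are assumed for illustration), not the authors' code:

```python
import numpy as np

def probability_field(trajectories, lat_edges, lon_edges):
    """Fraction of trajectories that reach each grid cell at least once.

    Each trajectory is an (n_steps, 2) array of (lat, lon) points
    originating at the risk site; the result approximates the
    probability that a release impacts each cell.
    """
    counts = np.zeros((len(lat_edges) - 1, len(lon_edges) - 1))
    for traj in trajectories:
        hist, _, _ = np.histogram2d(traj[:, 0], traj[:, 1],
                                    bins=[lat_edges, lon_edges])
        counts += (hist > 0)           # count each cell once per trajectory
    return counts / len(trajectories)  # impact probability per cell

# Two toy trajectories on a 2x2 grid
t1 = np.array([[0.5, 0.5], [1.5, 0.5]])   # visits two cells
t2 = np.array([[0.5, 0.5], [0.5, 0.5]])   # stays in one cell
field = probability_field([t1, t2],
                          np.array([0.0, 1.0, 2.0]),
                          np.array([0.0, 1.0, 2.0]))
print(field)
```

    Averaging such fields over months or seasons of meteorological data would give the seasonal impact indicators described above.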

  4. Action Plan for updated Chapter 15 Accident Analysis in the SRS Production Reactor SAR

    SciTech Connect

    Hightower, N.T. III; Burnett, T.W.

    1989-11-15

    This report describes the Action Plan for the upgrade of the Chapter 15 Accident Analysis in the SRS Production Reactor SAR required for K-Restart. This Action Plan will be updated periodically to reflect task accomplishments and issue resolutions.

  5. Analysis of fission product revaporization in a BWR reactor cooling system during a station blackout accident

    SciTech Connect

    Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

    1988-01-01

    A preliminary analysis of the revaporization of volatile fission products from a boiling water reactor (BWR) cooling system following a core meltdown accident in which the core debris penetrates the reactor vessel has been performed. The BWR analyzed has a Mark I containment and the accident sequence was a station blackout transient. This work was performed as part of the phenomenological uncertainty study of the Quantification and Uncertainty Analysis of Source Terms for Severe Accidents program at Brookhaven National Laboratory. Fission product revaporization was identified as one of the important issues in the Reactor Risk Reference Document.

  6. Protein Structure Prediction by Comparative Modeling: An Analysis of Methodology

    E-print Network

    Protein Structure Prediction by Comparative Modeling: An Analysis of Methodology Jennifer Wang, Biochemistry 218 Submitted December 11, 2009 1. Introduction Protein structure determination has become an important area of research in molecular biology and structural genomics. Understanding the tertiary

  7. Protein MAS NMR methodology and structural analysis of protein assemblies

    E-print Network

    Bayro, Marvin J

    2010-01-01

    Methodological developments and applications of solid-state magic-angle spinning nuclear magnetic resonance (MAS NMR) spectroscopy, with particular emphasis on the analysis of protein structure, are described in this thesis. ...

  8. An accident analysis of the physical plant of the Agricultural and Mechanical College of Texas

    E-print Network

    Allen, Gary James

    1963-01-01

    AN ACCIDENT ANALYSIS OF THE PHYSICAL PLANT OF THE AGRICULTURAL AND MECHANICAL COLLEGE OF TEXAS. A Thesis by Gary James Allen. Approved as to style and content by: (Chairman of Committee) (Head of Department or Student Adviser). August 1963. Submitted to the Graduate School of the Agricultural and Mechanical College of Texas in partial fulfillment...

  9. Preliminary Assessment of ICRP Dose Conversion Factor Recommendations for Accident Analysis Applications

    SciTech Connect

    Vincent, A.M.

    2002-03-13

    Accident analysis for U.S. Department of Energy (DOE) nuclear facilities is an integral part of the overall safety basis developed by the contractor to demonstrate facility operation can be conducted safely. An appropriate documented safety analysis for a facility discusses accident phenomenology, quantifies source terms arising from postulated process upset conditions, and applies a standardized, internationally-recognized database of dose conversion factors (DCFs) to evaluate radiological conditions to offsite receptors.

  10. Fast Transient And Spatially Non-Homogenous Accident Analysis Of Two-Dimensional Cylindrical Nuclear Reactor

    SciTech Connect

    Yulianti, Yanti [Dept. of Physics, Universitas Lampung (UNILA), Jl. Sumantri Brojonegor No.1 Bandar Lampung (Indonesia); Dept. of Physics, Institut Teknologi Bandung (ITB), Jl. Ganesha 10 Bandung (Indonesia); Su'ud, Zaki; Waris, Abdul; Khotimah, S. N. [Dept. of Physics, Institut Teknologi Bandung (ITB), Jl. Ganesha 10 Bandung (Indonesia); Shafii, M. Ali [Dept. of Physics, Institut Teknologi Bandung (ITB), Jl. Ganesha 10 Bandung (Indonesia); Dept. of Physics, Universitas Andalas (UNAND), Kampus Limau Manis, Padang, Sumatera Barat (Indonesia)

    2010-12-23

    Fast transient and spatially non-homogeneous accident analysis of a two-dimensional cylindrical nuclear reactor has been performed to predict reactor behavior during an accident. In the present study, the space-time diffusion equation is solved using direct methods that treat the spatial factor in detail during the accident simulation. The set of equations obtained from a fully implicit finite-difference discretization is solved with the iterative ADI (Alternating Direction Implicit) method. The initiating event is a decrease in the macroscopic absorption cross-section, which introduces a large external reactivity. The reactor power reaches a peak value before the reactor settles into a new equilibrium. The change in reactor temperature produces a negative Doppler feedback reactivity that reduces the excess positive reactivity. The reactor temperature during the accident remains below the fuel melting point, so the reactor stays in a safe condition.
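    The ADI idea referenced above (implicit finite differences solved along alternating directions, each sweep reducing to a tridiagonal system) can be sketched for a generic 2D diffusion equation. This is an illustrative toy under our own assumptions (uniform grid, constant diffusivity, zero Dirichlet boundary), not the authors' code:

```python
import numpy as np

def tridiag_solve(a, b, c, d):
    """Thomas algorithm for a tridiagonal system (a: sub-, b: main, c: super-diagonal)."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def adi_step(u, D, dt, h):
    """One ADI time step for du/dt = D*(u_xx + u_yy), zero Dirichlet boundary."""
    r = D * dt / (2.0 * h * h)
    n = u.shape[0]
    a = np.full(n, -r); b = np.full(n, 1 + 2 * r); c = np.full(n, -r)
    a[0] = c[-1] = 0.0
    up = np.pad(u, 1)                  # embed zero boundary values
    half = np.empty_like(u)
    for j in range(n):                 # sweep 1: implicit in x, explicit in y
        rhs = u[:, j] + r * (up[1:-1, j] - 2 * u[:, j] + up[1:-1, j + 2])
        half[:, j] = tridiag_solve(a, b, c, rhs)
    hp = np.pad(half, 1)
    new = np.empty_like(u)
    for i in range(n):                 # sweep 2: implicit in y, explicit in x
        rhs = half[i, :] + r * (hp[i, 1:-1] - 2 * half[i, :] + hp[i + 2, 1:-1])
        new[i, :] = tridiag_solve(a, b, c, rhs)
    return new

# A point source spreads symmetrically and decays in amplitude
u0 = np.zeros((5, 5)); u0[2, 2] = 1.0
u1 = adi_step(u0, D=1.0, dt=0.01, h=1.0)
print(round(u1[2, 2], 4))
```

    The actual reactor problem replaces the scalar diffusivity with group constants and adds kinetics source terms, but each ADI sweep still costs only a set of tridiagonal solves.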

  11. Fast Transient And Spatially Non-Homogenous Accident Analysis Of Two-Dimensional Cylindrical Nuclear Reactor

    NASA Astrophysics Data System (ADS)

    Yulianti, Yanti; Su'ud, Zaki; Waris, Abdul; Khotimah, S. N.; Shafii, M. Ali

    2010-12-01

    The research about fast transient and spatially non-homogenous nuclear reactor accident analysis of two-dimensional nuclear reactor has been done. This research is about prediction of reactor behavior is during accident. In the present study, space-time diffusion equation is solved by using direct methods which consider spatial factor in detail during nuclear reactor accident simulation. Set of equations that obtained from full implicit finite-difference discretization method is solved by using iterative methods ADI (Alternating Direct Implicit). The indication of accident is decreasing macroscopic absorption cross-section that results large external reactivity. The power reactor has a peak value before reactor has new balance condition. Changing of temperature reactor produce a negative Doppler feedback reactivity. The reactivity will reduce excess positive reactivity. Temperature reactor during accident is still in below fuel melting point which is in secure condition.

  12. Pedestrian accident analysis with a silicone dummy block.

    PubMed

    Lee, Youngnae; Park, Sungji; Yoon, Seokhyun; Kong, Youngsu; Goh, Jae-Mo

    2012-07-10

    When a car is parked in an inclined plane in a parking lot, the car can roll down the slope and cause a pedestrian accident, even when the angle of inclination is small. A rolling car on a gentle slope seems to be easily halted by human power to prevent damage to the car or a possible accident. However, even if the car rolls down very slowly, it can cause severe injuries to a pedestrian, especially when the pedestrian cannot avoid the rolling car. In an accident case that happened in our province, a pedestrian was injured by a rolling car, which had been parked on a slope the night before. The accident occurred in the parking lot of an apartment complex. The parking lot seemed almost flat with the naked eye. We conducted a rolling test with the accident vehicle at the site. The car was made to roll down the slope by purely gravitational pull and was made to collide with the silicone block leaning against the retaining wall. Silicone has characteristics similar to those of a human body, especially with respect to stiffness. In the experiment, we measured the shock power quantitatively. The results showed that a rolling car could severely damage the chest of a pedestrian, even if it moved very slowly. PMID:22455985
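    The physics behind the finding is easy to check: a car rolling freely from rest down a slope of angle theta over distance d reaches v = sqrt(2 g d sin(theta)), ignoring rolling resistance. The numbers below are our own illustrative values, not those of the case:

```python
import math

g = 9.81                    # m/s^2
theta = math.radians(2.0)   # ~2 degree grade, nearly flat to the naked eye
d = 10.0                    # metres rolled
m = 1500.0                  # kg, typical passenger car mass (assumed)

v = math.sqrt(2 * g * d * math.sin(theta))   # speed at impact
p = m * v                                    # momentum carried into the collision
print(f"speed = {v:.2f} m/s ({v * 3.6:.1f} km/h), momentum = {p:.0f} kg*m/s")
```

    Even at walking pace, the car's mass gives it a momentum on the order of several thousand kg*m/s, consistent with the severe chest loading measured with the silicone block.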

  13. [Paragliding accidents--a prospective analysis in Swiss mountain regions].

    PubMed

    Lautenschlager, S; Karli, U; Matter, P

    1993-01-01

    During the period from 1.1 to 31.12.90, 86 injuries associated with paragliding were analysed in a prospective study in 12 different Swiss hospitals with reference to causes, patterns, and frequencies. Spine injuries (36%) and lesions of the lower extremities (35%) were diagnosed most frequently. Surprisingly, no neurological complications occurred, which is possibly explained by the solitary axial trauma. In 15 cases very severe malleolar fractures required surgical intervention. One accident was fatal due to a lung rupture. 60% of all accidents happened during the landing phase, 26% during launch and 14% in flight. Half of the pilots were affected in their primary training course. Most accidents were due to an in-flight error of judgement, such as incorrect estimation of wind conditions and a choice of unfavourable landing sites. In contrast to early reports of hang-gliding injuries, only one accident was due to an equipment failure, namely a ruptured steering line. In more than a third of all accidents, the paraglider used was not correctly matched to the pilot's weight and experience. Inspired by the desire for a long flight, gliders with too large surface areas were often used, leading to a more unstable flight. To reduce the frequency of paragliding injuries, an accurate choice of equipment and increased attention to environmental factors are mandatory. Furthermore, education programs should focus more on intensifying pilots' mental and practical skills. PMID:8123342

  14. ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS

    SciTech Connect

    WILLIAMS, J.C.

    2003-11-15

    This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, ''K Basins Safety Analysis Report'' and Revision 4 of HNF-SD-SNF-TSR-001, ''Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins''. These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

  15. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety

    Microsoft Academic Search

    Juan Carbajo; Hae-Yong Jeong; Roald Wigeland; Michael Corradini; Rodney Cannon Schmidt; Justin Thomas; Tom Wei; Tanju Sofu; Hans Ludewig; Yoshiharu Tobita; Hiroyuki Ohshima; Frederic Serre

    2011-01-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University

  16. Analysis of traffic accidents on rural highways using Latent Class Clustering and Bayesian Networks.

    PubMed

    de Oña, Juan; López, Griselda; Mujalli, Randa; Calvo, Francisco J

    2013-03-01

    One of the principal objectives of traffic accident analyses is to identify key factors that affect the severity of an accident. However, the heterogeneity present in the raw data makes the analysis of traffic accidents difficult. In this paper, Latent Class Cluster (LCC) is used as a preliminary tool for segmentation of 3229 accidents on rural highways in Granada (Spain) between 2005 and 2008. Next, Bayesian Networks (BNs) are used to identify the main factors involved in accident severity for both the entire database (EDB) and the clusters previously obtained by LCC. The results of these cluster-based analyses are compared with the results of a full-data analysis. The results show that the combined use of both techniques is very interesting, as it reveals further information that would not have been obtained without prior segmentation of the data. BN inference is used to obtain the variables that best identify accidents in which people were killed or seriously injured. Accident type and sight distance were identified in all the cases analysed; other variables such as time, occupants involved or age are identified in the EDB and in only one cluster; whereas the variables vehicles involved, number of injuries, atmospheric factors, pavement markings and pavement width are identified in only one cluster. PMID:23182777
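    The value of cluster-wise analysis can be seen in a Simpson's-paradox-style toy example: a factor that drives severity within each segment can wash out when all accidents are pooled. The data and names below are invented for illustration, not taken from the paper:

```python
from collections import defaultdict

# (cluster, sight_distance, severe) -- invented toy records
accidents = [
    ("head-on", "restricted", True),  ("head-on", "restricted", True),
    ("head-on", "clear", False),      ("head-on", "clear", False),
    ("run-off", "restricted", False), ("run-off", "restricted", False),
    ("run-off", "clear", True),       ("run-off", "clear", True),
]

def p_severe(rows, sight):
    """Proportion of severe accidents among rows with the given sight distance."""
    sel = [sev for _, s, sev in rows if s == sight]
    return sum(sel) / len(sel)

# Pooled analysis: sight distance looks irrelevant (0.5 either way)
pooled = (p_severe(accidents, "restricted"), p_severe(accidents, "clear"))
print("pooled:", pooled)

by_cluster = defaultdict(list)
for row in accidents:
    by_cluster[row[0]].append(row)

# Per-cluster analysis: sight distance fully determines severity,
# with opposite signs in the two clusters
for name, rows in by_cluster.items():
    print(name, p_severe(rows, "restricted"), p_severe(rows, "clear"))
```

    LCC segmentation followed by per-cluster BN learning serves exactly this purpose on real data: it lets factors with cluster-specific effects surface.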

  17. Sensitivity analysis of radionuclides atmospheric dispersion following the Fukushima accident

    NASA Astrophysics Data System (ADS)

    Girard, Sylvain; Korsakissok, Irène; Mallet, Vivien

    2014-05-01

    Atmospheric dispersion models are used in response to accidental releases with two purposes: minimising the population exposure during the accident, and complementing field measurements for the assessment of short and long term environmental and sanitary impacts. The predictions of these models are subject to considerable uncertainties of various origins. Notably, input data, such as meteorological fields or estimations of emitted quantities as a function of time, are highly uncertain. The case studied here is the atmospheric release of radionuclides following the Fukushima Daiichi disaster. The model used in this study is Polyphemus/Polair3D, from which derives IRSN's operational long distance atmospheric dispersion model ldX. A sensitivity analysis was conducted in order to estimate the relative importance of a set of identified uncertainty sources. The complexity of this task was increased by four characteristics shared by most environmental models: high dimensional inputs; correlated inputs or inputs with complex structures; high dimensional output; and a multiplicity of purposes that require sophisticated and non-systematic post-processing of the output. The sensitivities of a set of outputs were estimated with the Morris screening method. The input ranking was highly dependent on the considered output. Yet, a few variables, such as the horizontal diffusion coefficient or cloud thickness, were found to have a weak influence on most of them and could be discarded from further studies. The sensitivity analysis procedure was also applied to indicators of model performance computed on a set of gamma dose rate observations. This original approach is of particular interest since observations could be used later to calibrate the probability distributions of the input variables. Indeed, only the variables that are influential on performance scores are likely to allow for calibration. An indicator based on matching the timing of emission peaks was developed to complement classical statistical scores, which were dominated by deposit dose rates and almost insensitive to lower-atmosphere dose rates. The substantial sensitivity of these performance indicators is auspicious for future calibration attempts and indicates that the simple perturbations used here may be sufficient to represent an essential part of the overall uncertainty.
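    The Morris screening method mentioned above estimates "elementary effects" of each input by one-at-a-time perturbations along random trajectories; the mean absolute effect (mu*) ranks input influence. A minimal sketch under our own assumptions (unit-hypercube inputs, illustrative names), not the study's implementation:

```python
import numpy as np

def morris_effects(model, n_inputs, n_traj=20, delta=0.5, seed=0):
    """Return mu* (mean absolute elementary effect) for each input."""
    rng = np.random.default_rng(seed)
    effects = [[] for _ in range(n_inputs)]
    for _ in range(n_traj):
        x = rng.uniform(0.0, 1.0 - delta, n_inputs)  # random start in [0,1]^k
        y = model(x)
        for i in rng.permutation(n_inputs):          # one-at-a-time moves
            x2 = x.copy()
            x2[i] += delta
            y2 = model(x2)
            effects[i].append((y2 - y) / delta)      # elementary effect of input i
            x, y = x2, y2
    return np.array([np.mean(np.abs(e)) for e in effects])

# Toy model: output depends strongly on x0, weakly on x1, not at all on x2
mu_star = morris_effects(lambda x: 10 * x[0] + 0.1 * x[1], n_inputs=3)
print(mu_star)
```

    Inputs with small mu* across all outputs of interest are the ones that, as in the study, can be discarded from further analysis.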

  18. Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots

    Microsoft Academic Search

    Steve Jarvis; Don Harris

    2010-01-01

    Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from

  19. Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots

    Microsoft Academic Search

    Steve Jarvis; Don Harris

    2009-01-01

    Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from

  20. Accident information needs

    SciTech Connect

    Hanson, D.J.; Arcieri, W.C.; Ward, L.W.

    1992-12-31

    A five-step methodology has been developed to evaluate information needs for nuclear power plants under accident conditions and the availability of plant instrumentation during severe accidents. Step 1 examines the credible accidents and their relationships to plant safety functions. Step 2 determines the information that personnel involved in accident management will need to understand plant behavior. Step 3 determines the capability of the instrumentation to function properly under severe accident conditions. Step 4 determines the conditions expected during the identified severe accidents. Step 5 compares the instrument capabilities and the severe accident conditions to evaluate the availability of the instrumentation to supply needed plant information.

  1. Accident information needs

    SciTech Connect

    Hanson, D.J.; Arcieri, W.C.; Ward, L.W.

    1992-01-01

    A five-step methodology has been developed to evaluate information needs for nuclear power plants under accident conditions and the availability of plant instrumentation during severe accidents. Step 1 examines the credible accidents and their relationships to plant safety functions. Step 2 determines the information that personnel involved in accident management will need to understand plant behavior. Step 3 determines the capability of the instrumentation to function properly under severe accident conditions. Step 4 determines the conditions expected during the identified severe accidents. Step 5 compares the instrument capabilities and the severe accident conditions to evaluate the availability of the instrumentation to supply needed plant information.

  2. Analysis of electrical accidents in UK domestic properties

    Microsoft Academic Search

    M. Barrett; K. OConnell; Cma Sung; G. Stokes

    2010-01-01

    Electricity is one of the most convenient forms of energy and is used in every building today. However, it causes a number of fatalities every year and has the potential to harm anyone exposed to it. This article investigates the causes and effects of electrical accidents in domestic properties over a 3-year period (2000-2002) in the UK based

  3. An Analysis of Tokaimura Nuclear Criticality Accident: A systems approach

    Microsoft Academic Search

    Shigehisa Tsuchiya; A. Tanabe; T. Narushima; K. Ito; K. Yamazaki

    Except for what are sometimes called 'Acts of God', any problems arising at a nuclear plant originate in some way in human error (1). The IAEA Report also concluded that the accident at the JCO nuclear fuel processing facility at Tokaimura seemed to have resulted primarily from human error and serious breaches of safety principles. However, unless there

  4. INVITED EDITORIAL: Uncertainties in probabilistic nuclear accident consequence analysis

    Microsoft Academic Search

    M. P. Little

    1998-01-01

    National Radiological Protection Board, Chilton, Didcot, Oxon OX11 0RQ, UK For all nuclear installations there is a small probability of an accident occurring which could lead to a release of radionuclides into the environment, despite the design intent to build the nuclear plant in such a way as to reduce that possibility to a low level. It is therefore important

  5. FREADM1; fast reactor core accident analysis. [GE635; FORTRAN IV]

    Microsoft Academic Search

    T. B. Fowler; M. L. Tobias; J. N. Fox; B. E. Lawler; J. U. Koppel; J. R. Triplett; L. L. Lynn; L. A. Waldman; I. Goldberg; P. Greebler; M. D. Kelley; R. A. Davis; C. E. Keck; J. A. Redfield; W. G. Meinhardt

    2008-01-01

    FREADM1 is a fast reactor, multichannel, accident analysis program designed to efficiently simulate a reactor transient from initiation to the point of core disassembly. Models are included for nuclear kinetics (point model), core thermo-hydraulics, voiding, fuel redistribution, failure propagation, programmed reactivity insertion, and the dynamics of primary-system coolant flow. A broad range of assumed accident initiating and propagating activities may

  6. Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion

    SciTech Connect

    Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

    2012-09-30

    Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

  7. RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...

  8. Analysis of containment performance and radiological consequences under severe accident conditions for the Advanced Neutron Source Reactor at the Oak Ridge National Laboratory

    SciTech Connect

    Kim, S.H.; Taleyarkhan, R.P.

    1994-01-01

    A severe accident study was conducted to evaluate conservatively scoped source terms and radiological consequences to support the Advanced Neutron Source (ANS) Conceptual Safety Analysis Report (CSAR). Three different types of severe accident scenarios were postulated with a view of evaluating conservatively scoped source terms. The first scenario evaluates maximum possible steaming loads and associated radionuclide transport, whereas the next scenario is geared towards evaluating conservative containment loads from releases of radionuclide vapors and aerosols with associated generation of combustible gases. The third scenario follows the prescriptions given by the 10 CFR 100 guidelines. It was included in the CSAR for demonstrating site-suitability characteristics of the ANS. Various containment configurations are considered for the study of thermal-hydraulic and radiological behaviors of the ANS containment. Severe accident mitigative design features such as the use of rupture disks were accounted for. This report describes the postulated severe accident scenarios, methodology for analysis, modeling assumptions, modeling of several severe accident phenomena, and evaluation of the resulting source term and radiological consequences.

  9. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    SciTech Connect

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U.S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system.
That is, containment analysis for alternative inputs at fuel-specific conditions and at cask-loading-specific conditions could be performed to demonstrate that release is within the allowable leak rates of the cask.

  10. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H., III; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
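    The first (earthquake-based) approach mirrors the standard PSHA hazard integral, with the ground-motion attenuation function replaced by a displacement attenuation function. Schematically (the notation here is ours, not the paper's):

```latex
\nu(D > d) \;=\; \sum_{n} \alpha_n
  \int_{m} \int_{r} f_n(m)\, f_n(r \mid m)\;
  P(D > d \mid m, r)\; dr\, dm
```

    where \nu(D > d) is the annual rate of displacement exceeding d at the site, \alpha_n is the rate of earthquakes on source n, f_n(m) and f_n(r \mid m) are the magnitude and source-to-site distance densities, and P(D > d \mid m, r) is the fault displacement attenuation function that takes the place of the ground-motion term in ordinary PSHA.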

  11. Integrated Analysis of Mechanical and Thermal Hydraulic Behavior of Graphite Stack in Channel-Type Reactors in Case of a Fuel Channel Rupture Accident

    SciTech Connect

    Soloviev, Sergei L. [MINATOM, Moscow (Russian Federation); Gabaraev, Boris A.; Novoselsky, Oleg Yu.; Filinov, Vladimir N. [Research and Development Institute of Power Engineering, M. Krasnoselskaya ul., build. 2/8, 107140 Moscow (Russian Federation); Parafilo, Leonid M.; Kruchkov, Dmitry V. [Institute of Physics and Power Engineering, 1 Bondarenko sq., RU-249020 Obninsk Kaluga Region (Russian Federation); Melikhov, Oleg I. [Electrogorsk Research and Engineering Center, Saint Constantine st., 6, Electrogorsk, Moscow Region, 142530 (Russian Federation)

    2002-07-01

    The paper discusses the methodology and a computational exercise analyzing the processes taking place in the graphite stack of an RBMK reactor in case of a pressure tube rupture caused by overheating. The methodology of the computational analysis is implemented in the integrated code U_STACK, which models thermal-hydraulic and mechanical processes in the stack with a varying geometry, coupled with the processes going on in the circulation loop and accident localization (confinement) system. Coolant parameters, cladding and pressure tube temperatures, pressure tube ballooning and rupture, and coolant outflow are calculated for a given accident scenario. Fluid parameters, movement of graphite blocks and adjacent pressure tube bending after the tube rupture are calculated for the whole volume of the core. Calculations also cover additional loads on adjacent fuel channels in the rupture zone, the reactor shell, and the upper and lower plates. Impossibility of an induced pressure tube rupture is confirmed. (authors)

  12. 3D analysis of the reactivity insertion accident in VVER-1000

    SciTech Connect

    Abdullayev, A. M.; Zhukov, A. I.; Slyeptsov, S. M. [NSC Kharkov Inst. for Physics and Technology, 1, Akademicheskaya Str., Kharkov 61108 (Ukraine)

    2012-07-01

    Fuel parameters such as peak enthalpy and temperature during a rod ejection accident are calculated. The calculations are performed with the 3D neutron kinetics code NESTLE and the 3D thermal-hydraulic code VIPRE-W. Both hot zero power and hot full power cases were studied for an equilibrium cycle with Westinghouse hex fuel in a VVER-1000. It is shown that the use of a 3D methodology can significantly increase safety margins with respect to current criteria and meet future criteria. (authors)

  13. THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT

    SciTech Connect

    Gupta, N.

    2011-02-14

    Surplus plutonium-bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for long-term storage of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.
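
    The severity gap between the two fire specifications can be illustrated with a first-order lumped-capacitance heat-up model. This is a hedged sketch only: the time constant and initial temperature below are invented placeholders, not 9975 thermal data.

```python
import math

def package_temperature(t_min, T0=70.0, T_fire=1500.0, tau_min=120.0):
    """Temperature (deg F) of a lumped package after t_min minutes in a
    constant-temperature fire, from dT/dt = (T_fire - T) / tau."""
    return T_fire + (T0 - T_fire) * math.exp(-t_min / tau_min)

# Compare the 30-minute HAC exposure with the longer facility fire.
hac_temp = package_temperature(30.0, T_fire=1475.0)   # 10 CFR 71.73 HAC
facility_temp = package_temperature(86.0)             # facility fire
```

    Even with a slightly hotter fire, most of the extra severity in the facility case comes from the longer exposure, which drives the package further toward the fire temperature.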

  14. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    SciTech Connect

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N. [Atomic Energy of Canada Limited, Ontario (Canada)

    2002-07-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototyping application to the power pulse phase of a limiting large loss-of-coolant accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX(R) fuel. The methodology is consistent with and builds on world practice. The analysis is divided into two phases to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)
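
    The statistical side of a BE+UA analysis is commonly a Monte Carlo propagation of the highly ranked parameters through a response surface fitted to code runs. A minimal sketch, with an invented quadratic surface standing in for the paper's fitted margin-parameter surfaces:

```python
import random

random.seed(1)

def response_surface(x1, x2):
    """Hypothetical fitted surface for a margin parameter (e.g. peak clad
    temperature, deg C) as a function of two normalized uncertain inputs."""
    return 900.0 + 60.0 * x1 + 40.0 * x2 + 10.0 * x1 * x2

# Sample the input uncertainty distributions and take an upper percentile of
# the response as the best-estimate-plus-uncertainty result.
samples = sorted(response_surface(random.gauss(0.0, 1.0),
                                  random.uniform(-1.0, 1.0))
                 for _ in range(10000))
best_estimate = samples[len(samples) // 2]   # median response
upper_95 = samples[int(0.95 * len(samples))]  # ~95th percentile
```

    The margin improvement claimed for BE+UA comes from comparing such a percentile against the acceptance limit, rather than stacking conservative assumptions on every input at once.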

  15. Plasma RNA integrity analysis: methodology and validation.

    PubMed

    Wong, Blenda C K; Lo, Y M Dennis

    2006-09-01

    The detection of cell-free RNA in plasma and serum of human subjects has found increasing applications in the field of medical diagnostics. However, many questions regarding the biology of circulating RNA remain to be addressed. One issue concerns the molecular nature of these circulating RNA species. We have recently developed a simple and quantitative method to investigate the integrity of plasma RNA. Our results have suggested that cell-free RNA in plasma is generally present as fragmented molecules instead of intact transcripts, with a predominance of 5' fragments. In this article, we summarize the basic principles in the experimental design for plasma RNA integrity analysis and highlight some of the important technical considerations for this type of investigation. PMID:17108208

  16. Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

    1996-12-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.
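
    Radiological source terms in DOE facility accident analyses of this kind are conventionally built from the five-factor formula of DOE-HDBK-3010. Whether the WM PEIS framework uses exactly this decomposition is an assumption here, and the scenario values below are illustrative only:

```python
def source_term(mar, dr, arf, rf, lpf):
    """Respirable release (same units as MAR): material at risk x damage
    ratio x airborne release fraction x respirable fraction x leak path
    factor."""
    return mar * dr * arf * rf * lpf

# Hypothetical waste-drum fire: 100 g of material at risk, 10% damaged,
# bounding ARF/RF for a heated powder, unmitigated release (LPF = 1).
release_g = source_term(mar=100.0, dr=0.1, arf=5e-4, rf=0.5, lpf=1.0)
```

    Evaluating this product per accident scenario and per site is what populates the source-term database the report describes.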

  17. Combined spatial-temporal analysis of malformation rates in Bavaria after the Chernobyl accident

    Microsoft Academic Search

    Helmut Küchenhoff; Astrid Engelhardt; Alfred Körblein

    Malformation rates in the German state of Bavaria, as a whole, did not increase in 1987, the year following the Chernobyl accident. An analysis of the monthly data also does not show any association between radiation exposure and malformation rates seven months later. However, in a detailed analysis at the district level, taking the spatial structure into account, we

  18. Advanced Power Plant Development and Analysis Methodologies

    SciTech Connect

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame, such as mega-scale fuel-cell-based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term, such as advanced gas turbines, high temperature membranes for separating gas species, and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models, as well as to establish system design constraints. The results of these various investigations will serve as a guide for the U.S. Department of Energy in identifying the research areas and technologies that warrant further support.

  19. Indonesian railway accidents--utilizing Human Factors Analysis and Classification System in determining potential contributing factors.

    PubMed

    Iridiastadi, Hardianto; Ikatrinasari, Zulfa Fitri

    2012-01-01

    The prevalence of Indonesian railway accidents has not been declining, with hundreds of fatalities reported in the past decade. As an effort to help the National Transportation Safety Committee (NTSC), this study aimed at understanding factors that might have contributed to the accidents. The Human Factors Analysis and Classification System (HFACS) was utilized for this purpose. A total of nine accident reports (provided by the Indonesian NTSC) involving fatalities were studied using the technique. The results of this study indicated 72 factors that were closely related to the accidents. Of these, roughly 22% were considered operator acts, while about 39% were related to preconditions for operator acts. Supervisory factors represented 14% of the total, and the remaining (about 25%) were associated with organizational factors. It was concluded that, while train drivers indeed played an important role in the accidents, interventions solely directed toward train drivers may not be adequate. A more comprehensive approach to minimizing the accidents should be adopted that addresses all four levels of HFACS. PMID:22317372
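
    The reported breakdown can be reproduced from raw factor counts. The counts below are back-calculated from the rounded percentages (16 + 28 + 10 + 18 = 72), so treat them as a plausible reconstruction rather than the study's actual tallies:

```python
from collections import Counter

# Factors per HFACS level; counts chosen to match the reported proportions.
factors = Counter({
    "unsafe acts of operators": 16,
    "preconditions for unsafe acts": 28,
    "unsafe supervision": 10,
    "organizational influences": 18,
})
total = sum(factors.values())
shares = {level: round(100.0 * n / total) for level, n in factors.items()}
```

    Tallying contributing factors by HFACS level in this way is what supports the study's conclusion that interventions must go beyond the operator level.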

  20. A User Friendly Phase Detection Methodology for HPC Systems' Analysis

    E-print Network

    Paris-Sud XI, Université de

    A User Friendly Phase Detection Methodology for HPC Systems' Analysis. Ghislain Landry Tsafack et al. Abstract: A wide array of today's high performance computing (HPC ... for detecting phases in the behaviour of a HPC system and determining execution points that correspond

  1. Development of The Purdue Cognitive Job Analysis Methodology

    Microsoft Academic Search

    June Wei; Gavriel Salvendy

    2000-01-01

    The objective of this article is to develop a cognitive job and task analysis methodology that not only analyzes jobs and tasks, but also provides a mechanism for improving cognitive job and task performance. Two phases were used to achieve this objective. The 1st phase developed a human-centered cognitive performance (HCCP) model based on human information processing. To quantitatively

  2. Preliminary accident analysis to support a passive depressurization systems design

    SciTech Connect

    Lenti, R.; Mansani, L.; Saiu, G. [Ansaldo, Genoa (Italy). Nuclear Div.

    1996-05-01

    The new generation of evolutionary nuclear power plants, e.g., the Westinghouse AP600 and the General Electric simplified boiling water reactor, relies on a full reactor coolant system (RCS) depressurization to allow gravity injection from an in-containment tank and thereby assure long-term core cooling. Studies performed to support the licensing process and design of both evolutionary and innovative reactors have shown that cold water injection may, under particular plant conditions, induce a large plant depressurization. Preliminary studies have been performed to support the design of a passive injection and depressurization system (PIDS) based on the idea of depressurizing the RCS by mixing cold water with the RCS hot water and inducing steam condensation in the primary system. The analyses, performed with the RELAP5/MOD3 computer code, show the response of a typical midsize pressurized water reactor plant [two loops, 600 MW (electric)] equipped with the PIDS. Different RCS injection locations including pressurizer, vessel upper head, and hot leg, and actuation at different residual reactor coolant masses have been investigated. The PIDS performance has also been verified against the following reference severe accident scenarios: (a) complete station blackout event, and (b) a small-break loss-of-coolant accident and concomitant station blackout event.

  3. MELCOR code analysis of a severe accident LOCA at Peach Bottom Plant

    SciTech Connect

    Carbajo, J.J. (Oak Ridge National Lab., TN (United States))

    1993-01-01

    A design-basis loss-of-coolant accident (LOCA) concurrent with complete loss of the emergency core cooling systems (ECCSs) has been analyzed for Peach Bottom Atomic Power Station Unit 2 using the MELCOR code, version 1.8.1. The purpose of this analysis is to calculate best-estimate times for the important events of this accident sequence and best-estimate source terms. Calculated pressures and temperatures at the beginning of the transient have been compared to results from the Peach Bottom final safety analysis report (FSAR). MELCOR-calculated source terms have been compared to source terms reported in the NUREG-1465 draft.

  4. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    USGS Publications Warehouse

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  5. Expert opinion in risk analysis; The NUREG-1150 methodology

    SciTech Connect

    Hora, S.C.; Iman, R.L. (Sandia National Labs., Albuquerque, NM (USA))

    1989-08-01

    Risk analysis of nuclear power generation often requires the use of expert opinion to provide probabilistic inputs where other sources of information are unavailable or are not cost effective. In the Reactor Risk Reference Document (NUREG-1150), a methodology for the collection of expert opinion was developed. The resulting methodology involves a ten-step process: selection of experts, selection of issues, preparation of issue statements, elicitation training, preparation of expert analyses by panel members, discussion of analyses, elicitation, recomposition and aggregation, and review by the panel members. These steps were implemented in a multiple-meeting format that brought together experts from a variety of workplaces.
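
    The "recomposition and aggregation" step combines the elicited distributions across panel members. An equal-weight linear opinion pool is the simplest such combination, shown here as an illustration (NUREG-1150's actual aggregation details are not reproduced):

```python
def pool(expert_dists):
    """Equal-weight linear opinion pool over dicts mapping
    outcome -> elicited probability."""
    outcomes = {o for dist in expert_dists for o in dist}
    n = len(expert_dists)
    return {o: sum(d.get(o, 0.0) for d in expert_dists) / n for o in outcomes}

# Hypothetical elicited distributions for a two-outcome issue.
pooled = pool([
    {"low": 0.6, "high": 0.4},   # expert A
    {"low": 0.2, "high": 0.8},   # expert B
    {"low": 0.4, "high": 0.6},   # expert C
])
```

    Equal weighting preserves each expert's influence and keeps the pooled result a valid probability distribution; weighted pools are a common variant when experts differ in relevant experience.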

  6. Accident analysis for transuranic waste management alternatives in the U.S. Department of Energy waste management program

    SciTech Connect

    Nabelssi, B.; Mueller, C.; Roglans-Ribas, J.; Folga, S.; Tompkins, M. [Argonne National Lab., IL (United States); Jackson, R. [Scientific Applications International Corp., Golden, CO (United States)

    1995-03-01

    Preliminary accident analyses and radiological source term evaluations have been conducted for transuranic waste (TRUW) as part of the US Department of Energy (DOE) effort to manage storage, treatment, and disposal of radioactive wastes at its various sites. The approach to assessing radiological releases from facility accidents was developed in support of the Office of Environmental Management Programmatic Environmental Impact Statement (EM PEIS). The methodology developed in this work is in accordance with the latest DOE guidelines, which consider the spectrum of possible accident scenarios in the implementation of the various actions evaluated in an EIS. The radiological releases from potential risk-dominant accidents in the storage and treatment facilities considered in the EM PEIS TRUW alternatives are described in this paper. The results show that significant releases can be predicted for only the most severe and extremely improbable accident sequences.

  7. Reducing commercial vehicle accidents through accident databases

    Microsoft Academic Search

    Will Murray; Tony Whiteing

    1995-01-01

    Commercial vehicle accidents impose very significant costs on industry and society but for a variety of reasons the full costs are often poorly understood. Advocates that vehicle operators should undertake a full and systematic analysis of accident levels, causes and costs. Introduces the CCSM model of vehicle accident reduction. By undertaking analysis based on this approach, most vehicle operators should

  8. Methodological Variability Using Electronic Nose Technology For Headspace Analysis

    SciTech Connect

    Knobloch, Henri; Turner, Claire; Spooner, Andrew [Cranfield University, Cranfield Health, Silsoe (United Kingdom); Chambers, Mark [Veterinary Laboratories Agency (VLA Weybridge) (United Kingdom)

    2009-05-23

    Since the idea of electronic noses was first published, numerous electronic nose (e-nose) developments and applications have been used in analyzing solid, liquid and gaseous samples in the food and automotive industries or for medical purposes. However, little is known about the methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, the use of different filters, and changes in mass flow rates is described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

  9. Risk analysis of releases from accidents during mid-loop operation at Surry

    SciTech Connect

    Jo, J.; Lin, C.C.; Nimnual, S.; Mubayi, V.; Neymotin, L.

    1992-11-01

    Studies and operating experience suggest that the risk of severe accidents during low power operation and/or shutdown (LP/S) conditions could be a significant fraction of the risk at full power operation. Two studies have begun at the Nuclear Regulatory Commission (NRC) to evaluate severe accident progression from a risk perspective during these conditions: one at Brookhaven National Laboratory for the Surry plant, a pressurized water reactor (PWR), and the other at Sandia National Laboratories for the Grand Gulf plant, a boiling water reactor (BWR). Each of the studies consists of three linked, but distinct, components: a Level 1 probabilistic risk analysis (PRA) of the initiating events, systems analysis, and accident sequences leading to core damage; a Level 2/3 analysis of accident progression, fuel damage, releases, containment performance, source terms, and consequences, both off-site and on-site; and a detailed Human Reliability Analysis (HRA) of actions relevant to plant conditions during LP/S operations. This paper summarizes the approach taken for the Level 2/3 analysis at Surry and provides preliminary results on the risk of releases and consequences for one plant operating state, mid-loop operation, during shutdown.

  10. Two methodologies for optical analysis of contaminated engine lubricants

    NASA Astrophysics Data System (ADS)

    Aghayan, Hamid; Bordatchev, Evgueni; Yang, Jun

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in the engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant-object optical appearance, where an a priori known periodic structure of the object is distorted by a contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient are newly proposed. Variations in the contaminant concentration and use of different contaminants lead to the changes of these parameters measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. 
auto- and cross-correlation functions, auto- and cross-spectrums, transfer function, coherence function, etc) are used for the analysis of combined object-lubricant images. Both proposed methodologies utilize the comparison of measured parameters and calculated object shape-based and statistical characteristics for fresh and contaminated lubricants. Developed methodologies are verified experimentally showing an ability to distinguish lubricant with 0%, 3%, 7% and 10% water and coolant contamination. This proves the potential applicability of the developed methodologies for on-line measurement, monitoring and control of the engine lubricant condition.

  11. Intelligent signal analysis methodologies for nuclear detection, identification and attribution

    NASA Astrophysics Data System (ADS)

    Alamaniotis, Miltiadis

    Detection and identification of special nuclear materials can be fully performed with a radiation detector-spectrometer. Due to several physical and computational limitations, development of fast and accurate radioisotope identifier (RIID) algorithms is essential for automated radioactive source detection and characterization. The challenge is to identify individual isotope signatures embedded in an aggregation of spectral signatures. In addition, background and isotope spectra overlap, further complicating the signal analysis. These concerns are addressed, in this thesis, through a set of intelligent methodologies that recognize signature spectra and the background spectrum and, subsequently, identify radionuclides. Initially, a method for detection and extraction of signature patterns is accomplished by means of fuzzy logic. The fuzzy logic methodology is applied to three types of radiation signal processing applications, where it exhibits a high positive detection rate, a low false alarm rate and very short execution time, while outperforming the maximum likelihood fitting approach. In addition, an innovative Pareto-optimal multiobjective fitting of gamma ray spectra using evolutionary computing is presented. The methodology exhibits perfect identification while performing better than single-objective fitting. Lastly, an innovative kernel-based machine learning methodology was developed for estimating the natural background spectrum in gamma ray spectra. The novelty of the methodology lies in the fact that it implements a data-based approach and does not require any explicit physics modeling. Results show that the kernel-based method adequately estimates the gamma background, but the algorithm's performance exhibits a strong dependence on the selected kernel.
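
    The background-estimation idea can be illustrated with a plain Nadaraya-Watson Gaussian-kernel smoother on a synthetic spectrum. This is a stand-in for the thesis's kernel machinery, and the spectrum below is invented:

```python
import math

def kernel_smooth(xs, ys, bandwidth):
    """Nadaraya-Watson regression with a Gaussian kernel."""
    out = []
    for x in xs:
        weights = [math.exp(-0.5 * ((x - xi) / bandwidth) ** 2) for xi in xs]
        norm = sum(weights)
        out.append(sum(w * y for w, y in zip(weights, ys)) / norm)
    return out

# Synthetic gamma spectrum: smooth decaying background plus one photopeak.
channels = list(range(100))
background = [200.0 * math.exp(-x / 80.0) for x in channels]
spectrum = [b + (500.0 if 48 <= x <= 52 else 0.0)
            for x, b in zip(channels, background)]
smoothed = kernel_smooth(channels, spectrum, bandwidth=10.0)
```

    A wide bandwidth suppresses the narrow peak and tracks the slowly varying continuum; as the abstract notes, the quality of such an estimate depends strongly on the kernel chosen.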

  12. Risk Analysis Methodology for Kistler's K-1 Reusable Launch Vehicle

    NASA Astrophysics Data System (ADS)

    Birkeland, Paul W.

    2002-01-01

    Missile risk analysis methodologies were originally developed in the 1940s as the military experimented with intercontinental ballistic missile (ICBM) technology. As the range of these missiles increased, it became apparent that some means of assessing the risk posed to neighboring populations was necessary to gauge the relative safety of a given test. There were many unknowns at the time, and technology was unpredictable at best. Risk analysis itself was in its infancy. Uncertainties in technology and methodology led to an ongoing bias toward conservative assumptions to adequately bound the problem. This methodology ultimately became the Casualty Expectation Analysis that is used to license Expendable Launch Vehicles (ELVs). A different risk analysis approach was adopted by the commercial aviation industry in the 1950s. At the time, commercial aviation technology was more firmly in hand than ICBM technology. Consequently commercial aviation risk analysis focused more closely on the hardware characteristics. Over the years, this approach has enabled the advantages of technological and safety advances in commercial aviation hardware to manifest themselves in greater capabilities and opportunities. The Boeing 777, for example, received approval for trans-oceanic operations "out of the box," where all previous aircraft were required, at the very least, to demonstrate operations over thousands of hours before being granted such approval. This "out of the box" approval is likely to become standard for all subsequent designs. In short, the commercial aircraft approach to risk analysis created a more flexible environment for industry evolution and growth. In contrast, the continued use of the Casualty Expectation Analysis by the launch industry is likely to hinder industry maturation. It likely will cause any safety and reliability gains incorporated into RLV design to be masked by the conservative assumptions made to "bound the problem." 
Consequently, for the launch industry to mature, a different approach to RLV risk analysis must be adopted. This paper will present such a methodology for Kistler's K-1 reusable launch vehicle. This paper will develop an approach to risk analysis that represents an amalgamation of the two approaches. This methodology provides flexibility to the launch industry that will enable the regulatory environment to more efficiently accommodate new technologies and approaches. It will also present a derivation of an appropriate assessment threshold that is the equivalent of the currently accepted 30-in-a-million casualty expectation.
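
    A Casualty Expectation Analysis of the kind discussed here reduces, in its simplest form, to summing P(failure) x casualty area x population density over failure modes and comparing the result against the 30-in-a-million threshold. The failure modes and numbers below are invented for illustration, not K-1 data:

```python
def casualty_expectation(modes):
    """Ec per mission: modes is a list of (failure_probability,
    casualty_area_km2, population_density_per_km2) tuples."""
    return sum(p * area * rho for p, area, rho in modes)

ec = casualty_expectation([
    (1e-4, 0.02, 10.0),  # hypothetical ascent failure over populated land
    (5e-4, 0.01, 0.1),   # hypothetical downrange failure over a sparse area
])
meets_threshold = ec < 30e-6  # 30-in-a-million acceptance criterion
```

    The paper's criticism is that conservative inputs to each of these factors can dominate Ec, masking genuine reliability improvements in the vehicle itself.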

  13. Comparative analysis of EPA cost-benefit methodologies

    SciTech Connect

    Poch, L.; Gillette, J.; Veil, J.

    1998-05-01

    In recent years, reforming the regulatory process has received much attention from diverse groups such as environmentalists, the government, and industry. A cost-benefit analysis can be a useful way to organize and compare the favorable and unfavorable impacts a proposed action might have on society. Since 1981, two Executive Orders have required the U.S. Environmental Protection Agency (EPA) and other regulatory agencies to perform cost-benefit analyses in support of regulatory decision making. At the EPA, a cost-benefit analysis is published as a document called a regulatory impact analysis (RIA). This report reviews cost-benefit methodologies used by three EPA program offices: the Office of Air and Radiation, the Office of Solid Waste, and the Office of Water. These offices were chosen because they promulgate regulations that affect the policies of this study's sponsor (U.S. Department of Energy, Office of Fossil Energy) and the technologies it uses. The study was conducted by reviewing 11 RIAs recently published by the three offices and by interviewing staff members in the offices. To draw conclusions about the EPA cost-benefit methodologies, their components were compared with those of a standard methodology (i.e., those that should be included in a comprehensive cost-benefit methodology). This study focused on the consistency of the approaches as well as their strengths and weaknesses, since differences in the cost-benefit methodologies themselves or in their application can cause confusion and preclude consistent comparison of regulations both within and among program offices.

  14. Fluid-structure interaction analysis of a hypothetical core disruptive accident in LMFBRs

    Microsoft Academic Search

    Chuang Liu; Xiong Zhang; Ming-Wan Lu

    2005-01-01

    To ensure safety, it is necessary to assess the integrity of the reactor vessel of a liquid-metal fast breeder reactor (LMFBR) under a hypothetical core disruptive accident (HCDA). Several important problems in the fluid-structure interaction analysis of an HCDA are discussed in the present paper. Various HCDA loading models are compared and the polytropic process of ideal gas (PPIG) law is recommended.

  15. Analysis of dental materials as an aid to identification in aircraft accidents

    SciTech Connect

    Wilson, G.S.; Cruickshanks-Boyd, D.W.

    1982-04-01

    The failure to achieve positive identification of aircrew following an aircraft accident need not prevent a full autopsy and toxicological examination to ascertain possible medical factors involved in the accident. Energy-dispersive electron microprobe analysis provides morphological, qualitative, and accurate quantitative analysis of the composition of dental amalgam. Wet chemical analysis can be used to determine the elemental composition of crowns, bridges and partial dentures. Unfilled resin can be analyzed by infrared spectroscopy. Detailed analysis of filled composite restorative resins has not yet been achieved in the as-set condition to permit discrimination between manufacturers' products. Future work will involve filler studies and pyrolysis of the composite resins by thermogravimetric analysis to determine percentage weight loss when the sample examined is subjected to a controlled heating regime. With these available techniques, corroborative evidence achieved from the scientific study of materials can augment standard forensic dental results to obtain a positive identification.

  16. Analysis of Reactivity Induced Accident for Control Rods Ejection with Loss of Cooling

    E-print Network

    Saad, Hend Mohammed El Sayed; Wahab, Moustafa Aziz Abd El

    2013-01-01

    Understanding the time-dependent behavior of the neutron population in a nuclear reactor in response to either a planned or unplanned change in the reactor conditions is of great importance to the safe and reliable operation of the reactor. In the present work, the point kinetics equations are solved numerically using the stiffness confinement method (SCM). The solution is applied to the kinetics equations in the presence of different types of reactivities and is compared with different analytical solutions. This method is also used to analyze reactivity induced accidents in two reactors. The first reactor is fueled by uranium and the second is fueled by plutonium. This analysis presents the effect of negative temperature feedback in compensating the positive reactivity added by ejected control rods, so as to prevent a control rod ejection accident from damaging the reactor. Both the power and temperature pulses following the reactivity-initiated accidents are calculated. The results are compared with previous works and...
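
    The point kinetics equations being solved can be sketched with a single delayed-neutron group and a small explicit time step. This is a toy integrator, not the stiffness confinement method, and it uses generic thermal-reactor constants rather than the paper's data:

```python
def point_kinetics(rho, beta=0.0065, lam=0.08, gen_time=1e-4,
                   t_end=1.0, dt=1e-5):
    """Integrate dn/dt = ((rho - beta)/L) n + lam c and
              dc/dt = (beta/L) n - lam c  starting from equilibrium;
    returns the normalized neutron population n(t_end)."""
    n = 1.0
    c = beta * n / (lam * gen_time)  # equilibrium precursor concentration
    for _ in range(int(t_end / dt)):
        dn = ((rho - beta) / gen_time * n + lam * c) * dt
        dc = (beta / gen_time * n - lam * c) * dt
        n += dn
        c += dc
    return n

# +0.1% dk/k step insertion: prompt jump to ~beta/(beta - rho), then a
# slow rise on the stable reactor period.
power = point_kinetics(rho=0.001)
```

    The stiffness the SCM addresses is visible here: the prompt mode decays on a ~(beta - rho)/Lambda timescale, orders of magnitude faster than the delayed-neutron-controlled rise, which is why a naive explicit step must be very small.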

  17. Analysis of Reactivity Induced Accident for Control Rods Ejection with Loss of Cooling

    E-print Network

    Hend Mohammed El Sayed Saad; Hesham Mohammed Mohammed Mansour; Moustafa Aziz Abd El Wahab

    2013-06-05

    Understanding the time-dependent behavior of the neutron population in a nuclear reactor in response to either a planned or unplanned change in the reactor conditions is of great importance to the safe and reliable operation of the reactor. In the present work, the point kinetics equations are solved numerically using the stiffness confinement method (SCM). The solution is applied to the kinetics equations in the presence of different types of reactivities and is compared with different analytical solutions. This method is also used to analyze reactivity induced accidents in two reactors. The first reactor is fueled by uranium and the second is fueled by plutonium. This analysis presents the effect of negative temperature feedback in compensating the positive reactivity added by ejected control rods, so as to prevent a control rod ejection accident from damaging the reactor. Both the power and temperature pulses following the reactivity-initiated accidents are calculated. The results are compared with previous works and satisfactory agreement is found.

  18. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    SciTech Connect

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  19. Risk and protection factors in fatal accidents.

    PubMed

    Dupont, Emmanuelle; Martensen, Heike; Papadimitriou, Eleonora; Yannis, George

    2010-03-01

    This paper addresses the interest and appropriateness of performing accident severity analyses that are limited to fatal accident data. Two methodological issues are specifically discussed, namely the accident-size factors (the number of vehicles in the accident and their level of occupancy) and the comparability of the baseline risk. It is argued that, although these two issues are generally at play in accident severity analyses, their effects on, e.g., the estimation of survival probability are exacerbated if the analysis is limited to fatal accident data. As a solution, it is recommended to control for these effects by (1) including accident-size indicators in the model, and (2) focusing on different sub-groups of road users while specifying the type of opponent in the model, so as to ensure that comparable baseline risks are used. These recommendations are applied to investigate risk and protection factors of car occupants involved in fatal accidents, using data from a recently established European Fatal Accident Investigation database (Reed and Morris, 2009). The results confirm that the estimated survival probability is affected by accident-size factors and by the type of opponent. The car occupants' survival chances are negatively associated with their own age and that of their vehicle. The survival chances are also lower when a seatbelt is not used. Front damage, as compared with other damaged car areas, appears to be associated with increased survival probability, but mostly when the accident opponent was another car. The interest of further investigating accident-size factors and opponent effects in fatal accidents is discussed. PMID:20159090
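    The paper's first recommendation, including accident-size indicators directly in the severity model, can be illustrated with a small sketch. The data, covariates, and coefficients below are entirely synthetic (not from the FAI database); the model is an ordinary logistic regression of survival on occupant, vehicle, and accident-size factors, fitted by plain gradient ascent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic fatal-accident records (illustrative only):
# columns = [occupant_age (scaled), vehicle_age (scaled),
#            n_vehicles, occupancy, belt_used]
n = 2000
X = np.column_stack([
    rng.uniform(0.18, 0.90, n),   # occupant age / 100
    rng.uniform(0.0, 2.0, n),     # vehicle age / 10
    rng.integers(1, 4, n),        # accident size: vehicles involved
    rng.integers(1, 5, n),        # accident size: occupancy
    rng.integers(0, 2, n),        # seatbelt used (0/1)
])
true_w = np.array([-1.5, -0.4, -0.3, -0.2, 1.2])   # belt raises survival odds
logits = X @ true_w + 0.5
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logits))  # 1 = survived

# Fit logistic regression by gradient ascent on the log-likelihood
w = np.zeros(5)
b = 0.0
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted survival probability
    w += 0.2 * (X.T @ (y - p) / n)          # score function, averaged
    b += 0.2 * np.mean(y - p)

print("belt coefficient:", round(w[4], 2))  # positive: belts protect
```

    With the accident-size columns present, the belt and age effects are estimated conditional on accident size rather than confounded with it, which is the point of recommendation (1).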

  20. How Root Cause Analysis Can Improve the Value Methodology

    SciTech Connect

    Wixson, J. R.

    2002-02-05

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether high cost/poor value, poor quality, or poor reliability. Once the most probable causes of these problems have been identified, better solutions for improvement can be developed in the creativity phase, because the team better understands the problems associated with these functions.

  1. How Root Cause Analysis Can Improve the Value Methodology

    SciTech Connect

    Wixson, James Robert

    2002-05-01

    Root cause analysis (RCA) is an important methodology that can be integrated with the VE Job Plan to generate superior results from the VE Methodology. The point at which RCA is most appropriate is after the function analysis and FAST Model have been built and functions for improvement have been chosen. These functions are then subjected to a simple but rigorous RCA to get to the root cause of their deficiencies, whether high cost/poor value, poor quality, or poor reliability. Once the most probable causes of these problems have been identified, better solutions for improvement can be developed in the creativity phase, because the team better understands the problems associated with these functions.

  2. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15-year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight into developing effective solutions for LOC, and form the basis for developing test scenarios for evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios correlated with the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed to generate them.

  3. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    NASA Technical Reports Server (NTRS)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and innovative analysis methodologies must therefore be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes and uses shell elements in place of the solid elements that would otherwise be required to capture the detailed mechanical response of the structure. The shell thicknesses and offsets in this technique are parameterized, and the parameters are adjusted through a heuristic procedure until the model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons among shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though demonstrated here for fluted-core composites, is widely applicable to other concepts.

  4. Uncertainty analysis of preclosure accident doses for the Yucca Mountain repository

    SciTech Connect

    Ma, C.W.; Miller, D.D.; Zavoshy, S.J. [Bechtel National, Inc., San Francisco, CA (USA); Jardine, L.J. [Jardine (L.J.) and Associates, Livermore, CA (USA)

    1990-12-31

    This study presents a generic methodology that can be used to evaluate the uncertainty in the calculated accidental offsite doses at the Yucca Mountain repository during the preclosure period. For demonstration purposes, this methodology is applied to two specific accident scenarios: the first involves a crane dropping an open container of consolidated fuel rods; the second involves container failure during emplacement or removal operations. The uncertainties of thirteen parameters are quantified by various types of probability distributions. The Latin Hypercube Sampling method is used to evaluate the uncertainty of the offsite dose. For the crane-drop scenario with concurrent filter failure, the doses due to the release of airborne fuel particles are calculated to be 0.019, 0.32, and 2.8 rem at confidence levels of 10%, 50%, and 90%, respectively. For the container-failure scenario with concurrent filter failure, the 90% confidence-level dose is 0.21 rem. 20 refs., 4 figs., 3 tabs.
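    As a sketch of the sampling step only, Latin Hypercube Sampling draws stratified uniforms and maps each column through a parameter's uncertainty distribution before evaluating the dose model. The toy dose model and all distribution parameters below are invented for illustration and are not taken from the study:

```python
import numpy as np
from scipy.stats import qmc, lognorm

# Toy dose model (illustrative, not the repository calculation):
# dose = inventory * release_fraction * dispersion * dose_factor
sampler = qmc.LatinHypercube(d=3, seed=7)
u = sampler.random(n=1000)  # stratified uniforms in [0, 1)^3

# Map each column through an assumed lognormal uncertainty distribution
release = lognorm(s=0.8, scale=1e-3).ppf(u[:, 0])   # airborne release fraction
dispers = lognorm(s=0.6, scale=1e-5).ppf(u[:, 1])   # dispersion factor, s/m^3
dosefac = lognorm(s=0.4, scale=50.0).ppf(u[:, 2])   # rem per unit intake
inventory = 1e6                                     # fixed source inventory

dose = inventory * release * dispers * dosefac
p10, p50, p90 = np.percentile(dose, [10, 50, 90])
print(f"dose at 10/50/90% confidence: {p10:.3g}, {p50:.3g}, {p90:.3g} rem")
```

    Reporting the 10/50/90% percentiles of the sampled dose mirrors the confidence-level presentation used in the abstract.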

  5. Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots.

    PubMed

    Jarvis, Steve; Harris, Don

    2009-08-01

    Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation, but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from 2002 to 2006 for their overall causes as well as factors specific to low-hours pilots. Fifty-nine categories of pilot-related accident causation emerged, which were formed into progressively larger categories until four overall human factors groups were arrived at: 'judgement'; 'handling'; 'strategy'; 'attention'. 'Handling' accounted for a significantly higher proportion of injuries than the other categories. Inexperienced pilots had considerably more accidents in all categories except 'strategy'. Approach control (path judgement, airbrake and speed handling) as well as landing flare misjudgement were chiefly responsible for the high accident rate among early solo glider pilots. PMID:19629815

  6. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES.

    SciTech Connect

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C. (BNL); GRAVES, H. (US NRC).

    2005-07-01

    Several of the new generation of nuclear power plant designs have structural configurations that are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model embedment effects when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths remain adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was used in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) a description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) a comparison of the analysis results for the different DOBs between the two methods, and (3) a performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The assessment from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than would normally be allowed by standard practice.

  7. Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.

    2000-01-01

    The NASA Aircraft Structural Integrity Program (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack-tip opening angle (CTOA) fracture criterion to characterize the fracture behavior, and a material and geometrically nonlinear finite-element shell analysis code to perform the structural analyses. This paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thicknesses of 0.04 to 0.09 inches. The critical CTOA, and the corresponding plane-strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane-stress analysis, were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle-crack tension specimens up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch-diameter pressurized cylinders, and full-scale curved stiffened panels subjected to internal pressure and mechanical loads.

  8. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed by including and managing uncertainties are more robust and less prone to poor operations as a result of parameter variability. The quantification, analysis, and mitigation of uncertainties are challenging tasks, as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data are based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered in a multidisciplinary, system-level risk analysis. This research synthesizes an integrated approach to developing a method for risk analysis. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with Latin Hypercube Sampling - Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules cover the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties, or response optimization; it also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center.
The launch vehicle multidisciplinary environment consists of the interface between configuration and sizing analysis outputs and aerodynamic parameter computations. Uncertainties are analyzed for both simulation tools and their associated input parameters. Uncertainties are then propagated across the design environment and a robust design optimization is performed over the range of a critical input parameter. The results of this research indicate that including uncertainties into design processes may require modification of design constraints previously considered acceptable in deterministic analyses.

  9. Limitations of risk analysis in the determination of medical factors in road vehicle accidents.

    PubMed

    Spencer, Michael B; Carter, Tim; Nicholson, Anthony N

    2004-01-01

    The purpose of risk analysis in the determination of medical factors in road vehicle accidents is to evaluate the risks that are associated with different strategies for accident reduction, so that the subsequent decision making process can be based on a best assessment of the likely benefits. However, it is vital to appreciate the limitations of such an approach, especially where the conclusions depend heavily on the accuracy of the assumptions made. In this paper the assumptions used in some recent analyses concerned with incapacitation, epilepsy, hypoglycaemia and psycho-active medication are explored, and the additional information required to reduce the uncertainty in the estimation of risk indicated. The conclusions from this analysis do not invalidate the use of risk assessment, but draw attention to its limitations and show how a sensitivity analysis can help to identify those areas where more precise information is needed before such an approach can be used confidently in a policy setting. PMID:14998267

  10. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon, and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the two methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass, with precise confidence intervals, is of particular interest in energetic biomass applications. PMID:20717532
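    The maximum sampling error of a TG-derived property is, in the usual formulation, the half-width of a Student-t confidence interval on the replicate mean. A minimal sketch with invented replicate ash measurements (not the paper's data):

```python
import numpy as np
from scipy import stats

# Illustrative replicate TG measurements of ash content (wt%), assumed data
ash = np.array([4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 4.9])

n = ash.size
mean = ash.mean()
s = ash.std(ddof=1)                   # sample standard deviation
t = stats.t.ppf(0.975, df=n - 1)      # two-sided 95% quantile
half_width = t * s / np.sqrt(n)       # maximum sampling error of the mean

print(f"ash = {mean:.2f} +/- {half_width:.2f} wt% (95% CI)")
```

    The same calculation applies per property (moisture, volatile matter, fixed carbon, ash), with the half-width shrinking as 1/sqrt(n) when more replicates are taken.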

  11. Empirical Bayesian analysis of accident severity for motorcyclists in large French urban areas.

    PubMed

    de Lapparent, Matthieu

    2006-03-01

    The present article deals with individual probabilities of different levels of injury in the event of a motorcycle accident. The approach uses an empirical Bayesian method based on the Multinomial-Dirichlet model (see Leonard, T., 1977. A Bayesian approach to some multinomial estimation and pretesting problems. J. Am. Stat. Assoc. 72, 869-874) to analyze the probability distributions of accident severity at the level of individuals in large, dense French urban areas during the year 2003. We model accident severity using four levels of injury: material damage only, slight injury, severe injury, and fatal injury. Our application shows that the sociodemographic characteristics of motorcyclists, the factors influencing their speed behavior, the suddenness of their collision, and the vigilance of road users play significant roles in shaping their probability distributions of accident severity. The computation of posterior distributions of the levels of injury for different groups of motorcyclists enables us to rank them with respect to their risk of injury using second-order stochastic dominance orderings. It is found that women motorcyclists between 30 and 50 years old driving powerful motorcycles are the most exposed to risk of injury. PMID:16280119
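    The Multinomial-Dirichlet machinery is conjugate, so the posterior over the four injury-level probabilities is available in closed form. The counts, pooled proportions, and prior strength below are invented for illustration (they are not the 2003 French data):

```python
import numpy as np

# Severity levels: material damage only, slight, severe, fatal
# Illustrative counts for one motorcyclist subgroup (assumed data)
counts = np.array([120, 64, 22, 6])

# Empirical-Bayes Dirichlet prior: concentration built from pooled sample
# proportions, with an assumed prior strength (hypothetical hyperparameters)
pooled = np.array([0.55, 0.30, 0.11, 0.04])
prior_strength = 20.0
alpha = prior_strength * pooled

# Multinomial likelihood + Dirichlet prior -> Dirichlet posterior
posterior_alpha = alpha + counts
posterior_mean = posterior_alpha / posterior_alpha.sum()

print("posterior P(severity level):", np.round(posterior_mean, 3))
```

    Ranking subgroups by stochastic dominance, as in the paper, then amounts to comparing the cumulative sums of these posterior severity distributions across groups.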

  12. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J. [and others

    1995-04-01

    This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  13. Accident Analysis and Prevention 40 (2008) 1244-1248 Short communication

    E-print Network

    McLeod, Ian

    2008-01-01

    Time series analyses for traffic safety interventions (A. Ian McLeod, E.R. Vingilis). Evaluating traffic safety interventions or other policies that can affect road safety often requires the collection of ... Our method is illustrated in a wide variety of traffic safety intervention analysis applications.

  14. Exposure rate response analysis of criticality accident detector at Savannah River Site

    SciTech Connect

    Zino, J.F.

    1995-01-01

    This analysis investigated the exposure response behavior of the gamma-ray ionization chambers used in the criticality accident systems at the Savannah River Site (SRS). The project consisted of performing exposure response measurements with a calibrated {sup 137}Cs source for benchmarking of the MCNP Monte Carlo code. MCNP was then used to extrapolate the ion chamber's response to gamma rays with energies outside the current domain of measured data for criticality fission sources.

  15. Development of reload safety-analysis methodology and code package: uncertainty analysis. Final report. [PWR; BWR

    SciTech Connect

    Goldstein, R.

    1982-09-01

    This report presents the development of a statistical methodology proposed for use with the Electric Power Research Institute (EPRI) Reactor Analysis Support Package (RASP). The EPRI package is an integrated design methodology for use by utilities in reload core design, licensing, and operations support. Existing EPRI codes, including some currently under development, are major elements in the reload safety analysis methodology. As envisioned, the methodology will be applicable to both Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) analyses. The current status of code development and availability has mandated that the scope of the present statistical effort be confined to a basic set of codes for the PWR analysis package. This consists of the neutronics (ARMP), systems analysis (RETRAN), and thermal-hydraulics (VIPRE) codes.

  16. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  17. Human reliability data, human error and accident models—illustration through the Three Mile Island accident analysis

    Microsoft Academic Search

    Pierre Le Bot

    2004-01-01

    Our first objective is to provide a panorama of the Human Reliability data used in EDF's Probabilistic Safety Studies, and then, since these concepts are at the heart of Human Reliability and its methods, to go over the notion of human error and the understanding of accidents. We are not sure today that it is actually possible to provide in this...

  18. Methodology assessment and recommendations for the Mars science laboratory launch safety analysis.

    SciTech Connect

    Sturgis, Beverly Rainwater; Metzinger, Kurt Evan; Powers, Dana Auburn; Atcitty, Christopher B.; Robinson, David B; Hewson, John C.; Bixler, Nathan E.; Dodson, Brian W.; Potter, Donald L.; Kelly, John E.; MacLean, Heather J.; Bergeron, Kenneth Donald (Sala & Associates); Bessette, Gregory Carl; Lipinski, Ronald J.

    2006-09-01

    The Department of Energy has assigned to Sandia National Laboratories the responsibility of producing a Safety Analysis Report (SAR) for the plutonium-dioxide fueled Multi-Mission Radioisotope Thermoelectric Generator (MMRTG) proposed to be used in the Mars Science Laboratory (MSL) mission. The National Aeronautics and Space Administration (NASA) is anticipating a launch in the fall of 2009, and the SAR will play a critical role in the launch approval process. As in past safety evaluations of MMRTG missions, a wide range of potential accident conditions differing widely in probability and severity must be considered, and the resulting risk to the public will be presented in the form of probability distribution functions of health effects in terms of latent cancer fatalities. The basic descriptions of accident cases will be provided by NASA in the MSL SAR Databook for the mission, and on the basis of these descriptions, Sandia will apply a variety of sophisticated computational simulation tools to evaluate the potential release of plutonium dioxide, its transport to human populations, and the consequent health effects. The first step in carrying out this project is to evaluate the existing computational analysis tools (computer codes) for suitability to the analysis and, where appropriate, to identify areas in which modifications or improvements are warranted. The overall calculation of health risks can be divided into three levels of analysis. Level A involves detailed simulations of the interactions of the MMRTG or its components with the broad range of insults (e.g., shrapnel, blast waves, fires) posed by the various accident environments. There are a number of candidate codes for this level; they are typically high-resolution computational simulation tools that capture details of each type of interaction and that can predict damage and plutonium dioxide release for a range of choices of controlling parameters.
    Level B utilizes these detailed results to study many thousands of possible event sequences and to build up a statistical representation of the releases for each accident case. A code to carry out this process will have to be developed or adapted from previous MMRTG missions. Finally, Level C translates the release (or ''source term'') information from Level B into public risk by applying models for atmospheric transport and the health consequences of exposure to the released plutonium dioxide. A number of candidate codes for this level of analysis are available. This report surveys the range of available codes and tools for each of these levels and makes recommendations for which choices are best for the MSL mission. It also identifies areas where improvements to the codes are needed. In some cases a second tier of codes may be identified to provide supporting or clarifying insight about particular issues. The main focus of the methodology assessment is to identify a suite of computational tools that can produce a high-quality SAR that can be successfully reviewed by external bodies (such as the Interagency Nuclear Safety Review Panel) on the schedule established by NASA and DOE.
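    The Level B step described above, sampling many event sequences to build a statistical representation of releases, can be sketched as a simple Monte Carlo over assumed branch probabilities. Every probability and release fraction below is an invented placeholder, not a value from the MSL analysis:

```python
import numpy as np

rng = np.random.default_rng(42)

# Level-B-style sketch (illustrative, not the SAR codes): sample many event
# sequences for one accident case and build the release distribution.
n_trials = 50_000

# Assumed branch probabilities and a conditional release-fraction distribution
p_impact = 0.3                 # module struck in the accident environment
p_breach_given_impact = 0.1    # clad breach given a strike
release_given_breach = rng.lognormal(mean=np.log(1e-4), sigma=1.0,
                                     size=n_trials)

impact = rng.random(n_trials) < p_impact
breach = impact & (rng.random(n_trials) < p_breach_given_impact)
release = np.where(breach, release_given_breach, 0.0)  # source term per trial

print(f"P(any release) = {breach.mean():.3f}")
print(f"99th-percentile release fraction = {np.percentile(release, 99):.2e}")
```

    The resulting empirical distribution of source terms is exactly the kind of statistical representation Level C would consume for transport and health-effects modeling.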

  19. Methodology Of Business Ecosystems Network Analysis: A Field Study In Telecom Italia

    E-print Network

    Di Pillo, Gianni

    Methodology of Business Ecosystems Network Analysis: A Field Study in Telecom Italia Future Centre. The methodology is called methodology of business ecosystem network analysis (MOBENA). ... the same interests have the implicit objective of long-term sustainability of the whole community

  20. Review of accident analysis calculations, 232-Z seismic scenario

    SciTech Connect

    Ballinger, M.Y.

    1993-05-01

    The 232-Z Building houses what was previously the incinerator facility, which is no longer in service. It is constructed of concrete blocks and is approximately 37 ft wide by 57 ft long. The building has a single story over the process areas and two stories over the service areas at the north end of the building. The respective roofs are 15 ft and 19 ft above grade and consist of concrete over a metal decking, with insulation and a built-up asphalt-gravel covering. This facility is assumed to collapse in the seismic event evaluated in the safety analyses, resulting in the release of a portion of the residual plutonium inventory remaining in the building. The seismic scenario for 232-Z assumes that the concrete-block walls collapse, allowing the roof to fall and crushing the contaminated duct and gloveboxes within. This paper is a review of the scenario and the methods used to calculate the source term from the seismic event as presented in the Plutonium Finishing Plant Final Safety Analysis Report (WHC 1991), also referred to as the PFP FSAR. Alternate methods of estimating the source term are presented. Calculation of source terms based on the mechanisms of release expected in a worst-case scenario is recommended.

  1. Development of a bespoke human factors taxonomy for gliding accident analysis and its revelations about highly inexperienced UK glider pilots.

    PubMed

    Jarvis, Steve; Harris, Don

    2010-02-01

    Low-hours solo glider pilots have a high risk of accidents compared to more experienced pilots. Numerous taxonomies for causal accident analysis have been produced for powered aviation, but none of these is suitable for gliding, so a new taxonomy was required. A human factors taxonomy specifically for glider operations was developed and used to analyse all UK gliding accidents from 2002 to 2006 for their overall causes as well as factors specific to low-hours pilots. Fifty-nine categories of pilot-related accident causation emerged, which were formed into progressively larger categories until four overall human factors groups were arrived at: 'judgement'; 'handling'; 'strategy'; 'attention'. 'Handling' accounted for a significantly higher proportion of injuries than the other categories. Inexperienced pilots had considerably more accidents in all categories except 'strategy'. Approach control (path judgement, airbrake and speed handling) as well as landing flare misjudgement were chiefly responsible for the high accident rate among early solo glider pilots. Statement of Relevance: This paper uses extant accident data to produce a taxonomy of underlying human factors causes, to analyse gliding accidents and identify the specific causes associated with low-hours pilots. From these, specific, well-targeted remedial measures can be identified. PMID:20099182

  2. Thermodynamic analysis of cesium and iodine behavior in severe light water reactor accidents

    NASA Astrophysics Data System (ADS)

    Minato, Kazuo

    1991-11-01

    In order to understand the release and transport behavior of cesium (Cs) and iodine (I) in severe light water reactor accidents, chemical forms of Cs and I in steam-hydrogen mixtures were analyzed thermodynamically. In the calculations reactions of boron (B) with Cs were taken into consideration. The analysis showed that B plays an important role in determining chemical forms of Cs. The main Cs-containing species are CsBO2(g) and CsBO2(l), depending on temperature. The contribution of CsOH(g) is minor. The main I-containing species are HI(g) and CsI(g) over the wide ranges of the parameters considered. Calculations were also carried out under the conditions of the Three Mile Island Unit 2 accident.

  3. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best available technology for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  4. Evaluation of potential severe accidents during low power and shutdown operations at Grand Gulf, Unit 1. Volume 2, Part 1C: Analysis of core damage frequency from internal events for plant operational State 5 during a refueling outage, Main report (Sections 11--14)

    SciTech Connect

    Whitehead, D. [Sandia National Labs., Albuquerque, NM (United States); Darby, J. [Science and Engineering Associates, Inc., Albuquerque, NM (United States); Yakle, J. [Science Applications International Corp., Albuquerque, NM (United States)] [and others

    1994-06-01

    This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with the results from two full power analyses performed on Grand Gulf.

  5. Analysis of pedestrian accident costs in Sudan using the willingness-to-pay method.

    PubMed

    Mofadal, Adam I A; Kanitpong, Kunnawee; Jiwattanakulpaisarn, Piyapong

    2015-05-01

    The willingness-to-pay (WTP) with contingent valuation (CV) method has been proven to be a valid tool for the valuation of non-market goods or socio-economic costs of road traffic accidents among communities in developed and developing countries. Research on accident costing tends to estimate the value of statistical life (VOSL) for all road users, providing a basis for the evaluation of road safety interventions in cost-benefit analysis. As in many other developing countries, the economic loss from traffic accidents in Sudan is noticeable; however, analytical research to estimate the magnitude and impact of that loss is lacking. Reports have shown that pedestrians account for more than 40% of the total number of fatalities. In this study, the WTP-CV approach was used to determine the amount of money that pedestrians in Sudan are willing to pay to reduce the risk of their own death. The impact of the socioeconomic factors, risk levels, and walking behaviors of pedestrians on their WTP for fatality risk reduction was also evaluated. Data were collected from two cities, Khartoum and Nyala, using a survey questionnaire that included 1400 respondents. The WTP-CV Payment Card Questionnaire was designed to ensure that Sudanese pedestrians can easily determine the amount of money that would be required to reduce the fatality risk from a pedestrian-related accident. The analysis results show that the estimated VOSL for Sudanese pedestrians ranges from US$0.019 to US$0.101 million. In addition, the willingness-to-pay by Sudanese pedestrians to reduce their fatality risk tends to increase with age, household income, educational level, safety perception, and average time spent on social activities with family and community. PMID:25794921
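    The record does not give the estimation formula used. A common approach in WTP-CV studies divides the mean stated WTP by the offered risk reduction; the sketch below illustrates that convention with hypothetical payment-card responses, not the Sudan survey data.

```python
def vosl(mean_wtp, risk_reduction):
    """Value of a statistical life: mean annual WTP for a small
    reduction in annual fatality risk, scaled to one statistical
    life (standard WTP-CV convention)."""
    return mean_wtp / risk_reduction

# Hypothetical payment-card responses (US$ per year) for an offered
# 4-in-100,000 reduction in pedestrian fatality risk:
responses = [0.5, 1.0, 2.0, 0.8, 1.2]
mean_wtp = sum(responses) / len(responses)
print(round(vosl(mean_wtp, 4e-5)))  # US$ per statistical life
```

    Payment-card formats typically report interval-censored means rather than a simple average, so a real study would fit a distribution to the card intervals before applying this scaling.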

  6. Development of an engineering methodology for thermal analysis of protected structural members in fire 

    E-print Network

    Liang, Hong; Welch, Stephen

    In order to overcome the limitations of existing methodologies for thermal analysis of protected structural members in fire, a novel CFD-based methodology has been developed. This is a generalised quasi-3D approach with ...

  7. Nutrient analysis methodology: a review of the DINE developmental literature.

    PubMed

    Dennison, D; Dennison, K F

    1989-12-01

    In 1986, a collaborative effort among professional associations resulted in the publication of Worksite Nutrition: A Decision Maker's Guide (The American Dietetic Association, 1986). The booklet describes nutrient analysis methodology as a good "promotional gimmick". The development of DINE was an effort to move nutrient analysis from the gimmick level to that of a viable educational component. A few examples of the innovative effects of this methodology: (1) individuals using their own data can learn energy balance by monitoring their food intake and physical activity; (2) individuals can learn the Dietary Goals for the United States (U.S. Senate Select Subcommittee on Nutrition and Human Needs, 1977) and can graphically compare how their diet approximates or differs from these goals; and (3) individuals can also learn, from verification of their own food records, which of their food selections were high in calories, total fat, saturated fat, and cholesterol, and low in complex carbohydrates and dietary fiber. Alternative healthful food choices are identified, and the effects of reducing or increasing portion sizes are described. The DINE development team has been working for the past eight years to decrease nutrient analysis variability so that the procedure can be used as an effective independent measure to improve nutritional behavior. Research has been conducted on database validity and reliability. Formative and process evaluations have been conducted to improve interactive aspects of the software and related manuals and books. DINE procedures have been modified for ease of use, in general, and specifically for elementary students and university students. PMID:2516071

  8. A new methodology of spatial cross-correlation analysis.

    PubMed

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
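    The paper's exact definitions are not reproduced in this abstract. By analogy with Moran's I written as a quadratic form, a global spatial cross-correlation coefficient can be sketched as z_x' W z_y over standardized variables with a unit-total-weight spatial matrix; the normalization below is an assumption for illustration, not necessarily the author's formula.

```python
import numpy as np

def global_cross_corr(x, y, W):
    """Global spatial cross-correlation by analogy with Moran's I in
    quadratic form: standardize both variables, scale W to unit total
    weight, and evaluate z_x' W z_y."""
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    W = W / W.sum()          # unit total weight
    return zx @ W @ zy

# Toy 4-region example: a binary contiguity matrix and two attributes
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)
x = np.array([1.0, 2.0, 3.0, 4.0])   # e.g. urbanization level
y = np.array([1.5, 2.5, 3.0, 4.5])   # e.g. economic output
print(round(global_cross_corr(x, y, W), 3))  # 0.275
```

    A positive value indicates that high values of one attribute tend to neighbor high values of the other, which is the spatial analogue of a positive Pearson correlation.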

  10. SAS4A: A computer model for the analysis of hypothetical core disruptive accidents in liquid metal reactors

    SciTech Connect

    Tentner, A.M.; Birgersson, G.; Cahalan, J.E.; Dunn, F.E.; Kalimullah; Miles, K.J.

    1987-01-01

    To ensure that public health and safety are protected under any accident conditions in a Liquid Metal Fast Breeder Reactor (LMFBR), many accidents are analyzed for their potential consequences. The SAS4A code system, described in this paper, provides such an analysis capability, including the ability to analyze low-probability events such as Hypothetical Core Disruptive Accidents (HCDAs). The SAS4A code system has been designed to simulate all the events that occur in an LMFBR core during the initiating phase of a Hypothetical Core Disruptive Accident. During such postulated accident scenarios as the Loss-of-Flow and Transient Overpower events, a large number of interrelated physical phenomena occur within a relatively short time. These phenomena include transient heat transfer and hydrodynamic events, coolant boiling, and fuel and cladding melting and relocation. Due to the strong neutronic feedback present in a nuclear reactor, these events can significantly influence the reactor power. The SAS4A code system is used in the safety analysis of nuclear reactors to estimate the energetic potential of very low probability accidents. The results of SAS4A simulations are also used by reactor designers to build safer reactors and eliminate the possibility of any accident that could endanger public safety.
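    SAS4A couples detailed thermal-hydraulic, fuel-relocation, and neutronics models; none of that is reproduced here. As a much simpler illustration of how reactivity feedback limits a power transient, the sketch below integrates one-delayed-group point kinetics with a hypothetical linear feedback coefficient (all constants illustrative).

```python
# Toy one-delayed-group point-kinetics step with a simple linear
# feedback term; illustrative only (SAS4A's models are far richer).
def pk_step(P, C, rho, beta=0.0065, Lam=1e-5, lam=0.08, dt=1e-4):
    dP = ((rho - beta) / Lam) * P + lam * C   # prompt + delayed source
    dC = (beta / Lam) * P - lam * C           # precursor balance
    return P + dP * dt, C + dC * dt

P, C = 1.0, (0.0065 / 1e-5) / 0.08  # precursors at equilibrium, P = 1
alpha = -0.002                      # hypothetical feedback coefficient
for _ in range(1000):               # 0.1 s of transient
    rho = 0.001 + alpha * (P - 1.0) # inserted reactivity + feedback
    P, C = pk_step(P, C, rho)
print(P > 1.0)  # power rises, but negative feedback bounds the excursion
```

    With a positive feedback coefficient the same loop diverges, which is the qualitative distinction between benign and energetic transients that codes like SAS4A quantify in detail.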

  11. Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)

    SciTech Connect

    Johnson, E.W.

    1988-10-01

    The purposes of this volume of the LWRHU SAR, the Accident Model Document (AMD), are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; Provide estimates of occurrence probabilities associated with these various accidents; Evaluate the response of the LWRHU (or its components) to the resultant accident environments; and Associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  12. Investigation of Human Factors in UAV Accidents Based on Analysis of Statistical Data

    Microsoft Academic Search

    Manzoor M. Nasir; Qin Shi-Yin

    2011-01-01

    Human errors are held responsible for over 65% of accidents in more than one hundred years of manned aviation history. To evaluate the role of human factors in accidents of unmanned aerial vehicles (UAVs), a sample of 56 US Army UAV accidents was used in this study, out of which 32 were related to accidents of the Hunter UAV

  13. Analysis on the Density Driven Air-Ingress Accident in VHTRs

    SciTech Connect

    Eung Soo Kim; Chang Oh; Richard Schultz; David Petti

    2008-11-01

    Air ingress following a pipe rupture is considered the most serious accident in VHTRs because of its potential consequences, including core heat-up, loss of structural integrity, and toxic gas release. Previously, it was believed that the main air-ingress mechanism in this accident is the molecular diffusion process between the reactor core and the cavity. However, according to some recent studies, there is another, faster air-ingress process that had not been considered before: density-driven stratified flow. The potential for density-driven stratified air ingress into the VHTR following a large-break LOCA was first described in the NGNP Methods Technical Program, based on stratified flow studies performed with liquid. Density-gradient driven stratified flow in advanced reactor systems has been the subject of active research for well over a decade, since such flow is an inherent characteristic of the passive systems used in advanced reactors. Recently, Oh et al. performed a CFD analysis of stratified flow in the VHTR and showed that this effect can significantly accelerate the air-ingress process. They also proposed replacing the original air-ingress scenario, based on molecular diffusion, with one based on stratified flow. This paper focuses on the effect of stratified flow on the results of the air-ingress accident in VHTRs.

  14. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    SciTech Connect

    Reyes, S; Gomez del Rio, J; Sanz, J

    2000-02-23

    Previous studies of the safety and environmental (S and E) aspects of the HYLIFE-II inertial fusion energy (IFE) power plant design used simplistic assumptions to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling of accident conditions and radioactivity mobilization mechanisms. In the present work, a set of computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) has been applied to simulate accident conditions in a simple model of the HYLIFE-II design. The authors consider a severe loss-of-coolant accident (LOCA) producing simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel toward the containment) and of the two barriers surrounding the chamber (the inner shielding and the containment building itself). Even though containment failure would be a very unlikely event, it would be needed to produce significant off-site doses. The CHEMCON code calculates long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR simulates a wide range of physical phenomena, including thermal-hydraulics, heat transfer, aerosol physics, and fusion product release and transport. The results of these calculations show that the estimated off-site dose is less than 6 mSv (0.6 rem), well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from radiation exposure during off-normal conditions.

  15. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    SciTech Connect

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues concerning the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  16. Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident

    SciTech Connect

    Aldrich, D.C.; Blond, R.M.

    1980-01-01

    An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.

  17. Systems thinking, the Swiss Cheese Model and accident analysis: a comparative systemic analysis of the Grayrigg train derailment using the ATSB, AcciMap and STAMP models.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2014-07-01

    The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. PMID:23973170

  18. Marine Propeller Analysis with an Immersed Boundary LES methodology

    NASA Astrophysics Data System (ADS)

    Schroeder, Seth; Balaras, Elias

    2011-11-01

    Modern marine propeller design and analysis techniques employ a wide range of computational methods to balance time constraints and accuracy requirements. Potential flow and RANS based methods have historically comprised the tools necessary for design and global performance analysis. However, for analysis where unsteady flow phenomena are of interest, eddy resolving methodologies are required. In the present study we use the large-eddy simulation (LES) approach coupled with an immersed-boundary (IB) method to perform computations of the flow around a rotating propeller at laboratory Reynolds numbers. Compared to classical boundary conforming strategies our formulation eliminates meshing overhead time and allows for body motion without additional treatments such as overset or dynamic meshing. The structured Cartesian grid also allows for a non-dissipative solver which conserves mass, momentum and energy. We will focus on the Italian Ship Model Basin (INSEAN) E1619 propeller and compare the predictions of our method to experimental results. The E1619 propeller is a 7-bladed submarine stock propeller that has been the subject of extensive experimental testing and previous computational studies. Supported by ONR.

  19. Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document

    SciTech Connect

    Johnson, E.W.

    1985-10-01

    The purposes of this volume (AMD) are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  20. The Multi-fluid Multi-phase Subchannel Analysis Code KAMUI for Subassembly Accident Analysis of an LMFR

    Microsoft Academic Search

    Fumio KASAHARA; Hisashi NINOKATA

    2000-01-01

    A computer program KAMUI has been developed to simulate multi-dimensional thermal-hydraulic behaviors of the coolant and disrupted fuels in a subassembly under hypothetical accident conditions of a Liquid-Metal Fast Reactor (LMFR) where the strong momentum and thermal coupling of the coolant flow dominates the material relocation. The KAMUI code is based on the subchannel analysis approach to model the fuel

  1. Methods for Detector Placement and Analysis of Criticality Accident Alarm Systems

    SciTech Connect

    Peplow, Douglas E. [ORNL] [ORNL; Wetzel, Larry [Babcock & Wilcox Nuclear Operations Group Inc.] [Babcock & Wilcox Nuclear Operations Group Inc.

    2012-01-01

    Determining the optimum placement to minimize the number of detectors for a criticality accident alarm system (CAAS) in a large manufacturing facility is a complex problem. There is typically a target for the number of detectors that can be used over a given zone of the facility. A study to optimize detector placement typically begins with some initial guess at the placement of the detectors and is followed by either predictive calculations of accidents at specific locations or adjoint calculations based on preferred detector locations. Within an area of a facility, there may be a large number of potential criticality accident sites. For any given placement of the detectors, the list of accident sites can be reduced to a smaller number of locations at which accidents may be difficult for detectors to detect. Developing the initial detector placement and determining the list of difficult accident locations are both based on the practitioner's experience. Simulations following fission particles released from an accident location are called 'forward calculations.' These calculations can be used to answer the question 'where would an alarm be triggered?' by an accident at a specified location. Conversely, 'adjoint calculations' start at a detector site using the detector response function as a source and essentially run in reverse. These calculations can be used to answer the question 'where would an accident be detected?' by a specified detector location. If the number of accidents, P, is much less than the number of detectors, Q, then forward simulations may be more convenient and less time-consuming. If Q is large or the detectors are not placed yet, then a mesh tally of dose observed by a detector at any location must be computed over the entire zone. If Q is much less than P, then adjoint calculations may be more efficient. 
Adjoint calculations employing a mesh tally can be even more advantageous because they do not rely on a list of specific difficult-to-detect accident sites, which may not have included every possible accident location. Analog calculations (no biasing) simply follow particles naturally. For sparse buildings and line-of-sight calculations, analog Monte Carlo (MC) may be adequate. For buildings with internal walls or large amounts of heavy equipment (dense geometry), variance reduction may be required. Calculations employing the CADIS method use a deterministic calculation to create an importance map and a matching biased source distribution that optimize the final MC to quickly calculate one specific tally. Calculations employing the FW-CADIS method use two deterministic calculations (one forward and one adjoint) to create an importance map and a matching biased source distribution that are designed to make the MC calculate a mesh tally with more uniform uncertainties in both high-dose and low-dose areas. Depending on the geometry of the problem, the number of detectors, and the number of accident sites, different approaches to CAAS placement studies can be taken. These are summarized in Table I. SCALE 6.1 contains the MAVRIC sequence, which can be used to perform any of the forward-based approaches outlined in Table I. For analog calculations, MAVRIC simply calls the Monaco MC code. For CADIS and FW-CADIS, MAVRIC uses the Denovo discrete ordinates (SN) deterministic code to generate the importance map and biased source used by Monaco. An adjoint capability is currently being added to Monaco and should be available in the next release of SCALE. An adjoint-based approach could be performed with Denovo alone - although fine meshes, large amounts of memory, and long computation times may be required to obtain accurate solutions. Coarse-mesh SN simulations could be employed for adjoint-based scoping studies until the adjoint capability in Monaco is complete. 
CAAS placement studies, especially those dealing with mesh tallies, require some extra utilities to aid in the analysis. Detectors must receive a minimum dose rate in order to alarm; therefore, a simple yes/no plot could be more useful to the analyst.
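    As a toy illustration of the forward-calculation question "where would an alarm be triggered?", the sketch below applies a bare unshielded 1/r^2 dose estimate against an assumed alarm threshold. All geometry and dose numbers are hypothetical; a real placement study would use MC transport (e.g. MAVRIC/Monaco) with shielding and variance reduction as described above.

```python
import math

def alarms(accident, detectors, dose_1m=1.0e4, threshold=0.2):
    """Return, per detector, whether the estimated dose (rad) from a
    minimum accident of concern at `accident` meets the alarm
    threshold, using an unshielded 1/r^2 falloff from a dose at 1 m."""
    out = []
    for d in detectors:
        r = math.dist(accident, d)                 # Euclidean distance (m)
        out.append(dose_1m / max(r, 1.0) ** 2 >= threshold)
    return out

detectors = [(0.0, 0.0), (50.0, 0.0)]    # hypothetical detector layout (m)
sites = [(10.0, 5.0), (500.0, 0.0)]      # candidate accident sites
for s in sites:
    print(s, alarms(s, detectors))       # yes/no coverage per detector
```

    A site for which every entry is False is an undetected accident location, which is exactly what the mesh-tally yes/no plot described above would expose over the whole zone at once.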

  2. A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2007-01-01

    In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, a revision of the first, was published in 1930; and the third, a revision and update of the second, was published in 1936. This paper describes the contents of these reports and compares the method of analysis proposed therein to the methods used today.

  3. Rough set approach for accident chains exploration

    Microsoft Academic Search

    Jinn-Tsai Wong; Yi-Shih Chung

    2007-01-01

    This paper presents a novel non-parametric methodology – rough set theory – for accident occurrence exploration. The rough set theory allows researchers to analyze accidents in multiple dimensions and to model accident occurrence as factor chains. Factor chains are composed of driver characteristics, trip characteristics, driver behavior and environment factors that imply typical accident occurrence. A real-world database (2003 Taiwan

  4. Analysis of the source range monitor during the first four hours of the Three Mile Island Unit 2 accident

    SciTech Connect

    Wu, H.Y.; Bandini, B.R. (Institute of Nuclear Energy Research, Lungtan (TW)); Hsiao, M.Y.; Baratta, A.J.; Bandini, B.R. (Pennsylvania State Univ., University Park, PA (USA). Dept. of Nuclear Engineering); Tolman, E.L. (Idaho National Engineering Lab., Idaho Falls, ID (USA))

    1989-02-01

    The source range monitor (SRM) data recorded during the first 4 h of the Three Mile Island Unit 2 (TMI-2) accident following reactor shutdown were analyzed. An effort to simulate the actual SRM response was made by performing a series of neutron transport calculations. Primary emphasis was placed on simulating the changes in SRM response to various system events during the accident so as to obtain useful information about core conditions at the various stages. Based on the known end-state reactor conditions, the major system events and the actual SRM readings, self-consistent estimates were made of core liquid level, void fraction in the coolant, and locations of core materials. This analysis expands the possible interpretation of the SRM data relative to core damage progression. The results appear to be consistent with other studies of the TMI-2 Accident Evaluation Program, and provide information useful for the development and determination of the TMI-2 accident scenario.

  5. A Risk Analysis Methodology to Address Human and Organizational Factors in Offshore Drilling Safety: With an Emphasis on Negative Pressure Test

    NASA Astrophysics Data System (ADS)

    Tabibzadeh, Maryam

    According to the final report of the Presidential National Commission on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) to accident causation. Yet a comprehensive study of more than 600 well-documented major failures in offshore structures between 1988 and 2005 showed that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method of ascertaining well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. The BP Deepwater Horizon accident and the NPT conducted by its crew are discussed as a case study. The risk analysis methodology consists of three approaches whose integration constitutes the overall methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, against the test conducted by the DWH crew; this analysis identifies the discrepancies between the two test procedures.
The second approach is a conceptual risk assessment framework for analyzing the causal factors behind the mismatches identified in the previous step, as the main contributors to negative pressure test misinterpretation. Finally, a rational decision-making model is introduced to quantify a section of the conceptual framework and to analyze the impact of different decision-making biases on negative pressure test results. Corroborating the findings of previous studies, the analysis of the conceptual framework indicates that organizational factors are the root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the greatest influence on misinterpreting a negative pressure test. Notably, the organizational factors captured in the framework are not specific to the NPT: most of them have been identified as common contributing causes of other offshore drilling accidents, of accidents in other oil and gas operations, and of accidents in high-risk operations in other industries. In addition, the proposed rational decision-making model introduces a quantitative structure for analyzing the results of a conducted NPT. The model provides parametrically derived formulas for determining a cut-off point value that assists personnel in accepting or rejecting an implemented negative pressure test. It also enables analysts to assess the decision-making biases involved in interpreting a conducted test, as well as the root organizational factors behind those biases.
In general, although the proposed integrated methodology is developed for the risk assessment of human and organizational contributions to negative pressure test misinterpretation, it can be generalized and is potentially useful for other well control situations, both offshore and onshore (e.g., fracking). In addition, this methodology can be applied for the analysis
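The dissertation's cut-off formulas are not reproduced in this abstract, but the idea of a rational accept/reject threshold for an NPT can be sketched as a Bayes expected-cost decision. The prior, the Gaussian likelihoods for the observed bleed-back pressure, and the cost ratio below are all hypothetical numbers, not the author's calibrated model.

```python
import math

# Hedged sketch: accept a negative pressure test only when the expected cost of
# accepting (given the posterior chance the well is flowing) is below the expected
# cost of rejecting a sound well. All parameters are invented for illustration.

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def accept_test(pressure_psi, prior_integrity=0.95,
                mu_ok=0.0, sd_ok=50.0,       # expected reading if the well is sealed
                mu_bad=400.0, sd_bad=150.0,  # expected reading if the well is flowing
                cost_false_accept=1000.0,    # accepting a flowing well (blowout exposure)
                cost_false_reject=1.0):      # rejecting a sound well (rig-time cost)
    """Accept iff the expected cost of accepting is below that of rejecting."""
    w_ok = prior_integrity * gauss_pdf(pressure_psi, mu_ok, sd_ok)
    w_bad = (1.0 - prior_integrity) * gauss_pdf(pressure_psi, mu_bad, sd_bad)
    post_bad = w_bad / (w_ok + w_bad)
    return post_bad * cost_false_accept < (1.0 - post_bad) * cost_false_reject

# Scan upward for the pressure at which the decision flips from accept to reject.
cutoff = next(p for p in range(0, 1000) if not accept_test(float(p)))
print("cut-off pressure:", cutoff, "psi")
```

With these assumed numbers the decision flips at a low reading, reflecting the steep asymmetry between the two error costs.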

  6. MELCOR Analysis of Steam Generator Tube Creep Rupture in Station Blackout Severe Accident

    SciTech Connect

    Liao, Y.; Vierow, K. [Purdue University (United States)

    2005-12-15

    A pressurized water reactor steam generator tube rupture (SGTR) is of concern because it represents a containment bypass through which radioactive materials can reach the environment. In a station blackout accident, tube integrity could be threatened by creep rupture, particularly if cracks are present in the tube walls. Methods are developed herein to improve assessment capabilities for SGTR using the severe-accident code MELCOR. Best-estimate assumptions based on recent research and computational fluid dynamics calculations are applied in the MELCOR analysis to simulate two-dimensional natural circulation and to determine the relative creep-rupture timing of the reactor coolant pressure boundary components. A new method is developed to estimate the hottest steam generator (SG) tube wall temperature and the critical crack size at which the SG tubes fail first; this critical crack size is estimated to be larger, by 20% of the wall thickness, than in a previous analysis. Sensitivity studies show that the failure sequence would change if some assumptions are modified. In particular, the uncertainty in the countercurrent flow limit model could reverse the failure sequence of the SG tubes and the surge line.
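Creep-rupture timing of the kind assessed above is often approximated by a life-fraction (damage-integral) rule, with a Larson-Miller correlation giving time-to-rupture at each temperature. The sketch below uses that generic approach with invented constants and an invented temperature history; it is not MELCOR's creep model or actual tube data.

```python
# Generic life-fraction sketch of creep-rupture timing: sum dt / t_rupture(T)
# over the temperature history until the damage fraction reaches 1.

def time_to_rupture_hours(temp_K, lmp=24000.0, c=20.0):
    """Invert LMP = T * (c + log10(t_r)) for the rupture time at temperature T."""
    return 10.0 ** (lmp / temp_K - c)

def creep_failure_time(temp_history_K, dt_hours=0.01):
    """Accumulate life-fraction damage over a temperature history; fail at 1.0."""
    damage, t = 0.0, 0.0
    for temp in temp_history_K:
        damage += dt_hours / time_to_rupture_hours(temp)
        t += dt_hours
        if damage >= 1.0:
            return t
    return None  # no creep failure within the history

# Tube wall heating linearly from 900 K to 1300 K over ten hours.
history = [900.0 + 400.0 * i / 1000 for i in range(1000)]
print("failure after", creep_failure_time(history), "hours")
```

Because time-to-rupture falls exponentially with temperature, nearly all of the damage accrues in the last portion of the heat-up, which is why the relative timing of component temperatures matters so much in such analyses.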

  7. Development of integrated core disruptive accident analysis code for FBR - ASTERIA-FBR

    SciTech Connect

    Ishizu, T.; Endo, H.; Tatewaki, I.; Yamamoto, T. [Japan Nuclear Energy Safety Organization JNES, Toranomon Towers Office, 4-1-28, Toranomon, Minato-ku, Tokyo (Japan); Shirakawa, N. [Inst. of Applied Energy IAE, Shimbashi SY Bldg., 14-2 Nishi-Shimbashi 1-Chome, Minato-ku, Tokyo (Japan)

    2012-07-01

    Evaluating the consequences of a severe accident is a central safety licensing issue for the core of a liquid-metal-cooled fast breeder reactor (LMFBR), since the LMFBR core is not in its optimum condition from the viewpoint of reactivity. This characteristic might induce super-prompt criticality through core geometry changes during a core disruptive accident (CDA). Previous CDA analysis codes were modeled in separate phases according to the mechanism driving super-prompt criticality, with subsequent events calculated by connecting different codes. This scheme, however, introduces uncertainty and/or arbitrariness into the calculation results. To resolve these issues and obtain consistent calculation results without arbitrariness, JNES is developing the ASTERIA-FBR code to provide a cross-check analysis capability, a scheme required to confirm the validity of the evaluation results prepared by applicants in the safety licensing procedure for the planned high-performance core of Monju. ASTERIA-FBR consists of three major calculation modules: CONCORD, dynamic-GMVP, and FEMAXI-FBR. CONCORD is a three-dimensional thermal-hydraulics calculation module with a multi-phase, multi-component, multi-velocity-field model. Dynamic-GMVP is a space-time neutronics calculation module. FEMAXI-FBR calculates fuel pellet deformation and fuel pin failure behavior. This paper describes the need for ASTERIA-FBR, outlines the major modules, and reports the model validation status. (authors)

  8. Bayesian data analysis of severe fatal accident risk in the oil chain.

    PubMed

    Eckle, Petrissa; Burgherr, Peter

    2013-01-01

    We analyze the risk of severe fatal accidents causing five or more fatalities for nine different activities covering the entire oil chain: exploration and extraction, transport by different modes, refining, and final end use in power plants, heating, or gas stations. The risks are quantified separately for OECD and non-OECD countries, and trends are calculated. Risk is analyzed by employing a Bayesian hierarchical model that yields analytical functions for both frequency (Poisson) and severity distributions (Generalized Pareto), as well as frequency trends. This approach addresses a key problem in risk estimation, namely the scarcity of data, which results in high uncertainties, particularly for the risk of extreme events, where the risk is extrapolated beyond the historically most severe accidents. Bayesian data analysis allows the pooling of information from different data sets covering, for example, the different stages of the energy chains or different modes of transportation, and it inherently delivers a measure of uncertainty. The approach provides a framework that comprehensively covers risk throughout the oil chain, allows the allocation of risk in sustainability assessments, and permits the progressive addition of new data to refine the risk estimates. Frequency, severity, and trends show substantial differences between the activities, emphasizing the need for detailed risk analysis. PMID:22642363
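The Poisson-frequency/Generalized-Pareto-severity pairing can be illustrated with a much simpler method-of-moments fit standing in for the full Bayesian hierarchical model. The moment fit is only valid when the fitted shape is below 0.5 (finite variance), and all the counts and fatality excesses below are invented.

```python
# Simplified stand-in for the paper's two ingredients: a Poisson model for annual
# accident frequency and a Generalized Pareto model for severity above a threshold.

def fit_poisson(counts):
    """The Poisson MLE for the rate is simply the mean annual count."""
    return sum(counts) / len(counts)

def fit_gpd_mom(excesses):
    """Method-of-moments fit of a Generalized Pareto to threshold excesses."""
    n = len(excesses)
    mean = sum(excesses) / n
    var = sum((x - mean) ** 2 for x in excesses) / (n - 1)
    shape = 0.5 * (1.0 - mean * mean / var)   # xi
    scale = 0.5 * mean * (1.0 + mean * mean / var)  # sigma
    return shape, scale

annual_counts = [2, 0, 1, 3, 1, 2, 0, 1]         # severe accidents per year (invented)
fatalities_excess = [1, 2, 2, 4, 7, 11, 30, 55]  # fatalities above the 5-death threshold

lam = fit_poisson(annual_counts)
xi, sigma = fit_gpd_mom(fatalities_excess)
print(lam, xi, sigma)
```

A positive fitted shape, as here, signals the heavy severity tail that motivates extrapolating beyond the historically most severe accidents with care.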

  9. Process hazards analysis (PrHA) program, bridging accident analyses and operational safety

    SciTech Connect

    Richardson, J. A. (Jeanne A.); McKernan, S. A. (Stuart A.); Vigil, M. J. (Michael J.)

    2003-01-01

    Recently the Final Safety Analysis Report (FSAR) for the Plutonium Facility at Los Alamos National Laboratory, Technical Area 55 (TA-55), was revised and submitted to the U.S. Department of Energy (DOE). As a part of this effort, over seventy Process Hazards Analyses (PrHAs) were written and/or revised over the six years prior to the FSAR revision. TA-55 is a research, development, and production nuclear facility that primarily supports U.S. defense and space programs. Nuclear fuels and materials research; material recovery, refining, and analyses; and the casting, machining, and fabrication of plutonium components are some of the activities conducted at TA-55. These operations involve a wide variety of industrial, chemical, and nuclear hazards. Operational personnel work as a team with safety analysts to prepare the PrHA. PrHAs describe the process; identify the hazards; and analyze the hazards, including determining hazard scenarios, their likelihood, and their consequences. In addition, the interaction of the process with facility systems, structures, and operation-specific protective features is part of the PrHA. This information is rolled up to determine bounding accidents and mitigating systems and structures. Further detailed accident analysis is performed for the bounding accidents and included in the FSAR. The FSAR is part of the Documented Safety Analysis (DSA) that defines the safety envelope for all facility operations in order to protect the worker, the public, and the environment. The DSA complies with the U.S. Code of Federal Regulations, 10 CFR 830, Nuclear Safety Management, and is approved by DOE. The DSA sets forth the bounding conditions necessary for the safe operation of the facility and is essentially a 'license to operate.' Safety of day-to-day operations is based on Hazard Control Plans (HCPs). Hazards are initially identified in the PrHA for the specific operation and act as input to the HCP.
Specific protective features important to worker safety are incorporated so that workers can readily identify the safety parameters of their work. System safety tools such as Preliminary Hazard Analysis, What-If Analysis, and Hazard and Operability Analysis, as well as other techniques as necessary, provide the groundwork for determining bounding conditions for facility safety, operational safety, and day-to-day worker safety.

  10. An Analysis of "HCR"'s Theoretical and Methodological Evolution.

    ERIC Educational Resources Information Center

    Violanti, Michelle T.

    1999-01-01

    Traces theoretical and methodological contributions this journal has made to the field of communication. Finds that the journal has matured both theoretically and methodologically: 59% of the articles in the first 24 years had a theory or model driving the research; the journal remains almost exclusively quantitative; and no single theory or…

  11. Hazardous materials transportation: a risk-analysis-based routing methodology.

    PubMed

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-01

    This paper introduces a new risk-analysis-based methodology for selecting the best route for transporting a hazardous substance. To perform this optimisation, the network is treated as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; arc costs are then defined to account for both out-of-pocket transportation expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed, and further research developments are proposed. PMID:10677666
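For a single origin-destination shipment, the minimum cost flow problem described above reduces to a cheapest-path search over the arcs that survive the risk-criteria screening. The sketch below assumes a toy network with invented costs and risk values; it is not OPTIPATH's data or algorithm.

```python
import heapq

# Dijkstra over arcs (u, v, transport_cost, risk_cost, individual_risk); arcs whose
# individual risk exceeds the acceptability criterion are excluded, mimicking the
# paper's capacity-style risk screening, and arc cost = expenses + risk-related cost.

def cheapest_route(arcs, origin, dest, individual_risk_limit):
    graph = {}
    for u, v, cost, risk_cost, risk in arcs:
        if risk <= individual_risk_limit:          # risk screening
            graph.setdefault(u, []).append((v, cost + risk_cost))
    heap, settled = [(0.0, origin, [origin])], {}
    while heap:
        d, node, path = heapq.heappop(heap)
        if node == dest:
            return d, path
        if node in settled and settled[node] <= d:
            continue
        settled[node] = d
        for nxt, w in graph.get(node, []):
            heapq.heappush(heap, (d + w, nxt, path + [nxt]))
    return None  # destination unreachable under the risk criteria

arcs = [("A", "B", 4.0, 1.0, 1e-7),  # motorway bypass: longer but low risk
        ("A", "C", 2.0, 0.5, 5e-6),  # short cut through town: risk above criterion
        ("C", "B", 1.0, 0.5, 1e-7),
        ("B", "D", 3.0, 2.0, 1e-7)]
print(cheapest_route(arcs, "A", "D", individual_risk_limit=1e-6))
```

The nominally cheaper route through C is pruned by the risk criterion, so the optimiser returns the longer but acceptable path, which is exactly the trade-off the arc-capacity formulation encodes.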

  12. The methodology of multi-viewpoint clustering analysis

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala; Wild, Chris

    1993-01-01

    One of the greatest challenges facing the software engineering community is the ability to produce large and complex computer systems, such as ground support systems for unmanned scientific missions, that are reliable and cost effective. In order to build and maintain these systems, the knowledge in the system must be suitably abstracted, structured, and otherwise clustered in a manner that facilitates its understanding, manipulation, testing, and utilization. Development of complex mission-critical systems requires the ability to abstract overall concepts in the system at various levels of detail and to consider the system from different points of view. The Multi-ViewPoint Clustering Analysis (MVP-CA) methodology has been developed to provide multiple views of large, complicated systems. MVP-CA discovers significant structures through an automated mechanism that structures the system both hierarchically (from detail to abstraction) and orthogonally (from different perspectives). We propose to integrate MVP-CA into an overall software engineering life cycle to support the development and evolution of complex mission-critical systems.
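MVP-CA's actual algorithm is not specified in this abstract; the toy below illustrates only the underlying idea of clustering knowledge-base rules by shared vocabulary, with a Jaccard similarity threshold standing in for the real mechanism. Rule names and terms are invented.

```python
# Toy single-linkage grouping of rules by term overlap: lowering the threshold
# merges clusters (more abstract view), raising it splits them (more detail).

def jaccard(a, b):
    return len(a & b) / len(a | b)

def cluster(rules, threshold):
    """Greedily attach each rule to the first cluster with a similar-enough member."""
    clusters = []
    for name, terms in rules.items():
        for cl in clusters:
            if any(jaccard(terms, rules[m]) >= threshold for m in cl):
                cl.append(name)
                break
        else:
            clusters.append([name])
    return clusters

rules = {
    "r1": {"telemetry", "downlink", "frame"},
    "r2": {"telemetry", "frame", "checksum"},
    "r3": {"thruster", "burn", "attitude"},
}
print(cluster(rules, 0.4))
```

Sweeping the threshold yields the hierarchy of views (detail to abstract) that the methodology exploits; clustering on different attribute sets would give the orthogonal perspectives.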

  13. Narrative text analysis of accident reports with tractors, self-propelled harvesting machinery and materials handling machinery in Austrian agriculture from 2008 to 2010 - a comparison.

    PubMed

    Mayrhofer, Hannes; Quendler, Elisabeth; Boxberger, Josef

    2014-01-01

    The aim of this study was the identification of accident scenarios and causes by analysing existing accident reports of recognized agricultural occupational accidents with tractors, self-propelled harvesting machinery and materials handling machinery from 2008 to 2010. Following a literature-based evaluation of past accident analyses, narrative text analysis was chosen as an appropriate method. A narrative analysis of the text fields of the accident reports that farmers used to report accidents to insurers was conducted to obtain detailed information about accident scenarios and causes. This narrative analysis was conducted for the first time and yielded initial insights for identifying antecedents of accidents and potential opportunities for technology-based intervention. A literature and internet search was done to discuss and confirm the findings. The narrative text analysis showed that in more than one third of the accidents with tractors and materials handling machinery the vehicle rolled or tipped over. The most relevant accident scenarios with harvesting machinery were being trapped and falling down. A direct comparison of the analysed machinery categories showed that more than 10% of the accidents in each category were caused by technical faults, slippery or muddy terrain, and incorrect or inappropriate operation of the vehicle. Accidents with tractors, harvesting machinery and materials handling machinery showed similarities in terms of causes, circumstances and consequences, so certain technical and communicative measures for accident prevention could be used for all three machinery categories. Nevertheless, some individual solutions for accident prevention, suited to each specific machine type, would be necessary. PMID:24738521

  14. Synchronous machine stability analysis using an efficient time domain methodology: unbalanced operation analysis

    Microsoft Academic Search

    O. Rodriguez; A. Medina

    2002-01-01

    This paper describes a methodology for the efficient stability analysis of the synchronous machine. The state space phase coordinates model of the synchronous machine incorporates the effects of time-varying and inter-spatial harmonic inductances. The dynamics of the synchronous machine are represented by a set of ordinary differential equations (ODEs). The machine behaviour is analyzed under severe unbalanced operation conditions. The

  15. Vehicle-mounted mine detection: test methodology, application, and analysis

    NASA Astrophysics Data System (ADS)

    Hanshaw, Terilee

    1998-09-01

    The Mine/Minefield detection community's maturing technology base has become a developmental resource for worldwide military and humanitarian applications. During the last decade, this community has developed a variety of single- and multi-sensor applications incorporating a diversity of sensor and processor technologies. These diverse developments require appropriate metrics to objectively bound technology and to define applicability to expected military and humanitarian applications. This paper presents a survey of the test methodology, application, and analysis activities conducted by the U.S. Army Communications and Electronics Command's Night Vision and Electronic Sensors Directorate (NVESD) on behalf of the Mine/Minefield detection community. As notable technology-base advances respond to the needs of worldwide military and humanitarian mine detection activities, a diverse pool of knowledge has developed, and the maturity of these advances must be evaluated in a more systematic manner. As these technologies mature, metrics have been developed to support the development process and to define the applicability of the technology-base advances. The author reviews the diversity of mine detection technology and the related testing strategies. Consideration is given to the impact of history and global realism on the U.S. Army's present mine detection testing program. Definitions of testing metrics and analysis are also reviewed. Finally, the paper outlines future U.S. Army testing plans, with special consideration given to the Vehicular Mounted Mine Detection/Ground Standoff Mine Detection System (VMMD/GSTAMIDS) Advanced Technology Demonstration and related issues.
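The standard metrics behind such testing, probability of detection against false alarm rate over a lane of emplaced targets, can be sketched as below. The one-dimensional positions, halo radius, and lane area are invented simplifications of what is in practice a two-dimensional scoring problem.

```python
# Toy lane scoring: a detection within `halo` meters of a ground-truth mine is a
# hit (each truth can be claimed once); everything else is a false alarm per area.

def score_run(detections, truths, halo=0.5, lane_area_m2=100.0):
    hits = set()
    false_alarms = 0
    for d in detections:
        match = next((t for t in truths if abs(d - t) <= halo and t not in hits), None)
        if match is not None:
            hits.add(match)
        else:
            false_alarms += 1
    pd = len(hits) / len(truths)            # probability of detection
    far = false_alarms / lane_area_m2       # false alarm rate per square meter
    return pd, far

truths = [5.0, 20.0, 42.0, 77.0]            # emplaced mine positions (invented)
detections = [5.2, 19.8, 60.0, 76.9, 90.0]  # sensor declarations (invented)
print(score_run(detections, truths))
```

Sweeping the sensor's declaration threshold and re-scoring traces out the Pd-versus-FAR curve on which competing technologies are compared.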

  16. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis

    Microsoft Academic Search

    M. K. Goldhaber; S. L. Staub; G. K. Tokuhata

    1983-01-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected.
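The life table method named in the title can be sketched as the standard actuarial calculation: per-interval event probabilities with withdrawals counted at half weight, compounded into a cumulative incidence. The weekly counts below are invented, not the TMI registry data.

```python
# Actuarial life table: for each gestational-week interval, the conditional
# probability of spontaneous abortion is events / (at_risk - withdrawals/2);
# compounding the survival probabilities gives the cumulative incidence.

def life_table(intervals):
    """intervals: list of (at_risk_entering, events, withdrawals) per week."""
    surviving = 1.0
    for n, d, w in intervals:
        effective = n - w / 2.0        # actuarial correction for withdrawals
        surviving *= 1.0 - d / effective
    return 1.0 - surviving             # cumulative probability of the event

weeks = [(1000, 8, 20), (972, 6, 10), (956, 4, 12)]  # invented weekly counts
print(round(life_table(weeks), 4))
```

The withdrawal correction is what lets such a study compare the observed post-accident incidence with the expected incidence despite incomplete follow-up.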

  17. Analysis of traffic accident size for Korean highway using structural equation models

    Microsoft Academic Search

    Ju-Yeon Lee; Jin-Hyuk Chung; Bongsoo Son

    2008-01-01

    Accident size can be expressed as the number of involved vehicles, the number of damaged vehicles, the number of deaths and\\/or the number of injured. Accident size is the one of the important indices to measure the level of safety of transportation facilities. Factors such as road geometric condition, driver characteristic and vehicle type may be related to traffic accident

  18. Analysis of the SL-1 Accident Using RELAP5-3D

    SciTech Connect

    Francisco, A.D. and Tomlinson, E. T.

    2007-11-08

    On January 3, 1961, at the National Reactor Testing Station in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion that killed three people and destroyed the reactor core. The SL-1, a 3 MW{sub t} boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to the destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are known only with a high level of uncertainty).
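RELAP5-3D couples thermal-hydraulics with neutronics; the bare bones of a reactivity excursion can be illustrated with textbook point kinetics (one delayed-neutron group, no feedback, so the power simply grows until feedback or disassembly would intervene). The parameters are generic textbook-style assumptions, not SL-1 values.

```python
# Point-kinetics toy: Euler-integrate
#   dn/dt = ((rho - beta) / L) * n + lam * c
#   dc/dt = (beta / L) * n - lam * c
# starting from equilibrium. rho > beta means super-prompt-critical growth.

def excursion(rho=0.01, beta=0.0065, lam=0.08, gen_time=1e-4,
              dt=1e-5, t_end=0.2):
    n = 1.0
    c = beta / (lam * gen_time) * n   # precursors start in equilibrium with n
    t = 0.0
    while t < t_end:
        dn = ((rho - beta) / gen_time) * n + lam * c
        dc = (beta / gen_time) * n - lam * c
        n, c, t = n + dn * dt, c + dc * dt, t + dt
    return n

print(excursion())  # relative power after 0.2 s of super-prompt criticality
```

With these assumed numbers the prompt period is roughly 1/35 s, so the power rises by orders of magnitude in a fifth of a second, which is the qualitative behavior a full code must then terminate with feedback and fuel vaporization models.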

  19. Analysis of station blackout accidents for the Bellefonte pressurized water reactor

    SciTech Connect

    Gasser, R D; Bieniarz, P P; Tills, J L

    1986-09-01

    An analysis has been performed for the Bellefonte PWR Unit 1 to determine the containment loading and the radiological releases to the environment from a station blackout accident. A number of issues are addressed in this analysis, including the effects of direct heating on containment loading and the effects of fission product heating and natural convection on releases from the primary system. The results indicate that direct heating involving more than about 50% of the core can fail the Bellefonte containment, but natural convection in the RCS may lead to overheating and failure of the primary system piping before core slump, thus eliminating or mitigating direct heating. Releases from the primary system are significantly increased before vessel breach due to natural circulation, and after vessel breach due to re-evolution of retained fission products by fission product heating of RCS structures.

  20. Analysis of Developmental Sequences within the Structural Approach: Conceptual, Empirical, and Methodological Considerations.

    ERIC Educational Resources Information Center

    Schroder, Eberhard; Edelstein, Wolfgang

    In this paper conceptual and methodological issues in the analysis of developmental sequences are discussed. Conceptually, the reconstruction of the logic of acquisition calls for the use of task or structure analysis. Methodologically, it calls for an individual-oriented approach, the use of statement calculus for formulation of the postulated…

  1. The development and demonstration of integrated models for the evaluation of severe accident management strategies—SAMEM

    Microsoft Academic Search

    M. L Ang; K Peers; E Kersting; W Fassmann; H Tuomisto; P Lundström; M Helle; V Gustavsson; P Jacobsson

    2001-01-01

    This study is concerned with the further development of integrated models for the assessment of existing and potential severe accident management (SAM) measures. This paper provides a brief summary of these models, based on Probabilistic Safety Assessment (PSA) methods and the Risk Oriented Accident Analysis Methodology (ROAAM) approach, and their application to a number of case studies spanning both preventive

  2. Survivors Perceptions of Recovery following Air Medical Transport Accidents.

    PubMed

    Jaynes, Cathy L; Valdez, Anna; Hamilton, Megan; Haugen, Krista; Henry, Colin; Jones, Pat; Werman, Howard A; White, Lynn J

    2015-01-01

    Objective: Air medical transport (AMT) teams play an essential role in the care of the critically ill and injured. Their work, however, is not without risk; since the inception of the industry, numerous AMT accidents have been reported. The objective of this research is to gain a better understanding of the post-accident sequelae for professionals who have survived AMT accidents, in the hope that this understanding will empower the industry to better support survivors and plan for the contingencies of post-accident recovery. Methods: Qualitative methods were used to explore the experience of flight crew members who have survived an AMT accident; "accident" was defined using criteria established by the National Transportation Safety Board. Traditional focus group methodology explored the survivors' experiences following the accident. Results: Seven survivors participated in the focus group. Content analysis revealed themes in four major domains that described the experience of survivors: physical, psychological, relational, and financial. Across the themes, survivors reported that industry and company responses varied greatly, ranging from generous support, understanding, and action to make safety improvements, to little response or action and a lack of attention to survivor needs. Conclusion: Planning for AMT post-accident response was identified as lacking in scope and quality. More focused efforts are needed to assist and support survivors as they regain their personal and professional lives following an accident. This planning should include all stakeholders in safe transport: the individual crew member, air medical transport companies, and the industry at large. PMID:24932568

  3. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
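The "aggregated results" of such an elicitation are commonly produced by equal-weight pooling of the experts' distributions. The sketch below interpolates each expert's elicited quantiles into a piecewise-linear CDF, averages the CDFs, and reads pooled quantiles off the average; the two experts' numbers are invented, and the linear interpolation and clamped tails are simplifying assumptions.

```python
# Equal-weight aggregation of elicited quantiles: average the experts' CDFs and
# invert the average by bisection to recover pooled quantiles.

def linear_cdf(pts):
    """pts: sorted (value, cum_prob) pairs from one expert; linear in between,
    clamped to the first/last elicited probability outside that range."""
    def cdf(x):
        if x <= pts[0][0]:
            return pts[0][1]
        if x >= pts[-1][0]:
            return pts[-1][1]
        for (x0, p0), (x1, p1) in zip(pts, pts[1:]):
            if x0 <= x <= x1:
                return p0 + (p1 - p0) * (x - x0) / (x1 - x0)
    return cdf

def pooled_quantile(experts, p, lo, hi, iters=60):
    """Bisect the equal-weight average CDF for the p-quantile."""
    cdfs = [linear_cdf(e) for e in experts]
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        avg = sum(c(mid) for c in cdfs) / len(cdfs)
        lo, hi = (mid, hi) if avg < p else (lo, mid)
    return (lo + hi) / 2.0

# Two hypothetical experts' 5th/50th/95th percentiles for some consequence parameter.
experts = [[(1.0, 0.05), (3.0, 0.50), (9.0, 0.95)],
           [(2.0, 0.05), (5.0, 0.50), (20.0, 0.95)]]
print(round(pooled_quantile(experts, 0.5, 0.0, 25.0), 3))
```

Averaging CDFs (a mixture of the experts' distributions) preserves each expert's spread, which is why the pooled median here falls between, not at, the two individual medians.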

  4. Analysis of accidents during the mid-loop operating state at a PWR

    SciTech Connect

    Jo, J.; Lin, C.C.; Mufayi, V.; Neymotin, L.; Nimnual, S.

    1992-12-31

    Studies suggest that the risk of severe accidents during low power operation and/or shutdown conditions could be a significant fraction of the risk at full power operation. The Nuclear Regulatory Commission has begun two risk studies to evaluate the progression of severe accidents during these conditions: one for the Surry plant, a pressurized water reactor (PWR), and the other for the Grand Gulf plant, a boiling water reactor (BWR). This paper summarizes the approach taken for the Level 2/3 analysis at Surry for one plant operating state (POS) during shutdown. The current efforts are focussed on evaluating the risk when the reactor is at mid-loop; this particular POS was selected because of the reduced water inventory and the possible isolation of the loops. The Level 2/3 analyses are conditional on core damage having occurred. Initial results indicate that the conditional consequences can indeed be significant; the defense-in-depth philosophy governing the safety of nuclear power plants is to some extent circumvented because the containment provides only a vapor barrier with no capability for pressure holding, during this POS at Surry. However, the natural decay of the radionuclide inventory provides some mitigation. There are essentially no predicted offsite prompt fatalities even for the most severe releases.

  6. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertain assessment. Volume 2: Appendices

    SciTech Connect

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  7. DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS

    SciTech Connect

    Wu, T

    2008-04-30

    Large fuel casks present challenges when evaluating their performance under the Hypothetical Accident Conditions (HAC) specified in Title 10, Part 71 of the Code of Federal Regulations (10CFR71). Testing is often limited by cost, the difficulty of preparing test units, and the limited availability of facilities that can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the hypothetical accident conditions of a 30-foot lateral drop followed by a 40-inch lateral puncture, as specified in 10CFR71. The structural integrity of the containment vessel is justified based on the analytical results in comparison with the stress criteria specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damage caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture is compared with the package test data. The analytical results are in good agreement with the test results.

  8. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  9. Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors

    SciTech Connect

    Pate-Cornell, M.E. (Stanford Univ., CA (United States))

    1993-04-01

    The accident that occurred on board the offshore platform Piper Alpha in July 1988 killed 167 people and cost billions of dollars in property damage. It was caused by a massive fire, which was not the result of an unpredictable 'act of God' but of an accumulation of errors and questionable decisions. Most of them were rooted in the organization, its structure, procedures, and culture. This paper analyzes the accident scenario using the risk analysis framework, determines which human decisions and actions influenced the occurrence of the basic events, and then identifies the organizational roots of these decisions and actions. These organizational factors are generalizable to other industries and engineering systems. They include flaws in design guidelines and design practices (e.g., tight physical couplings or insufficient redundancies), misguided priorities in the management of the tradeoff between productivity and safety, mistakes in the management of the personnel on board, and errors of judgment in the process by which financial pressures are applied to the production sector (i.e., the oil companies' definition of profit centers), resulting in deficiencies in inspection and maintenance operations. This analytical approach allows identification of risk management measures that go beyond the purely technical (e.g., adding redundancies to a safety system) and also include improvements of management practices. 18 refs., 4 figs.

  10. Accident management information needs

    SciTech Connect

    Hanson, D.J.; Ward, L.W.; Nelson, W.R.; Meyer, O.R. (EG and G Idaho, Inc., Idaho Falls, ID (USA))

    1990-04-01

    In support of the US Nuclear Regulatory Commission (NRC) Accident Management Research Program, a methodology has been developed for identifying the plant information needs necessary for personnel involved in the management of an accident to diagnose that an accident is in progress, select and implement strategies to prevent or mitigate the accident, and monitor the effectiveness of these strategies. This report describes the methodology and presents an application of this methodology to a Pressurized Water Reactor (PWR) with a large dry containment. A risk-important severe accident sequence for a PWR is used to examine the capability of the existing measurements to supply the necessary information. The method includes an assessment of the effects of the sequence on measurement availability, including the effects of environmental conditions. The information needs and capabilities identified using this approach are also intended to form the basis for more comprehensive information needs assessments performed during the analysis and development of specific accident management strategies for prevention and mitigation. 3 refs., 16 figs., 7 tabs.

  11. Analysis of Radionuclide Releases from the Fukushima Dai-Ichi Nuclear Power Plant Accident Part I

    NASA Astrophysics Data System (ADS)

    Le Petit, G.; Douysset, G.; Ducros, G.; Gross, P.; Achim, P.; Monfort, M.; Raymond, P.; Pontillon, Y.; Jutier, C.; Blanchard, X.; Taffary, T.; Moulin, C.

    2014-03-01

    Part I of this publication deals with the analysis of fission product releases following the Fukushima Dai-ichi accident. Reactor core damage is assessed relying on radionuclide detections performed by the CTBTO radionuclide network, especially at the particulate station located at Takasaki, 210 km away from the nuclear power plant. On the basis of a comparison between the reactor core inventory at the time of reactor shutdowns and the fission product activities measured in air at Takasaki, especially 95Nb and 103Ru, it was possible to show that the reactor cores were exposed to high temperature for a prolonged time. This diagnosis was confirmed by the presence of 113Sn in air at Takasaki. The assessed 133Xe release at the time of reactor shutdowns (8 × 10^18 Bq) turned out to be on the order of 80% of the amount deduced from the reactor core inventories. This strongly suggests a broad meltdown of the reactor cores.

  12. Conceptual design loss-of-coolant accident analysis for the Advanced Neutron Source reactor

    SciTech Connect

    Chen, N.C.J.; Wendel, M.W.; Yoder, G.L. Jr. (Oak Ridge National Lab., TN (United States))

    1994-01-01

    A RELAP5 system model for the Advanced Neutron Source Reactor has been developed for performing conceptual safety analysis report calculations. To better represent the thermal-hydraulic behavior of the core, three specific changes were implemented in the RELAP5 computer code: a turbulent forced-convection heat transfer correlation, a critical heat flux (CHF) correlation, and an interfacial drag correlation. The model consists of the core region, the heat exchanger loop region, and the pressurizing/letdown system region. Results for three loss-of-coolant accident analyses are presented: (a) an instantaneous double-ended guillotine (DEG) core outlet break with a cavitating venturi installed downstream of the core, (b) a core pressure boundary tube outer wall rupture, and (c) a DEG core inlet break with a finite break-formation time. The results show that the core can survive without exceeding the flow excursion or CHF thermal limits at a 95% probability level if the proper mitigation options are provided.

  13. PTSD symptom severity and psychiatric comorbidity in recent motor vehicle accident victims: a latent class analysis.

    PubMed

    Hruska, Bryce; Irish, Leah A; Pacella, Maria L; Sledjeski, Eve M; Delahanty, Douglas L

    2014-10-01

    We conducted a latent class analysis (LCA) on 249 recent motor vehicle accident (MVA) victims to examine subgroups that differed in posttraumatic stress disorder (PTSD) symptom severity, current major depressive disorder and alcohol/other drug use disorders (MDD/AoDs), gender, and interpersonal trauma history 6-weeks post-MVA. A 4-class model best fit the data with a resilient class displaying asymptomatic PTSD symptom levels/low levels of comorbid disorders; a mild psychopathology class displaying mild PTSD symptom severity and current MDD; a moderate psychopathology class displaying severe PTSD symptom severity and current MDD/AoDs; and a severe psychopathology class displaying extreme PTSD symptom severity and current MDD. Classes also differed with respect to gender composition and history of interpersonal trauma experience. These findings may aid in the development of targeted interventions for recent MVA victims through the identification of subgroups distinguished by different patterns of psychiatric problems experienced 6-weeks post-MVA. PMID:25124501
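The class-assignment step of such a latent class model can be sketched as follows. The class weights and per-class symptom-endorsement probabilities below are hypothetical illustrations, not estimates from this study, and only three of the four reported classes are mocked up.

```python
import math

# Hypothetical latent class model with three binary symptom indicators.
weights = [0.5, 0.3, 0.2]          # assumed class prevalences
profiles = [                        # assumed P(symptom present | class)
    [0.05, 0.10, 0.05],             # "resilient"-like class
    [0.40, 0.50, 0.30],             # "mild"-like class
    [0.90, 0.85, 0.80],             # "severe"-like class
]

def posterior(responses):
    """Posterior class-membership probabilities for one respondent's 0/1 answers."""
    joint = []
    for w, p in zip(weights, profiles):
        lik = w
        for r, pj in zip(responses, p):
            lik *= pj if r else (1.0 - pj)
        joint.append(lik)
    total = sum(joint)
    return [j / total for j in joint]
```

Fitting an LCA amounts to alternating this posterior computation (E-step) with re-estimation of the weights and profiles (M-step); here only the assignment step is shown.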

  14. Modeling & analysis of criticality-induced severe accidents during refueling for the Advanced Neutron Source Reactor

    SciTech Connect

    Georgevich, V.; Kim, S.H.; Taleyarkhan, R.P.; Jackson, S.

    1992-10-01

    This paper describes work done at the Oak Ridge National Laboratory (ORNL) for evaluating the potential for, and resulting consequences of, a hypothetical criticality accident during refueling of the 330-MW Advanced Neutron Source (ANS) research reactor. The development of an analytical capability is described. Modeling and problem formulation were conducted using concepts of reactor neutronic theory for determining power level escalation, coupled with ORIGEN and MELCOR code simulations for radionuclide buildup and containment transport. Gaussian plume transport modeling was done for determining off-site radiological consequences. Nuances associated with modeling this blast-type scenario are described. Analysis results for ANS containment response under a variety of postulated scenarios and containment failure modes are presented. It is demonstrated that individuals at the reactor site boundary will not receive doses beyond regulatory limits for any of the containment configurations studied.

  15. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
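The validation step described above, sampling from elicited input distributions and propagating them through the GPM, can be sketched as follows. The source term, wind speed, receptor location, and the lognormal spreads on the dispersion widths are illustrative assumptions, not the study's elicited values.

```python
import math
import random

def gaussian_plume(Q, u, sigma_y, sigma_z, y, z, H):
    """Ground-reflecting Gaussian plume concentration at crosswind offset y,
    height z, for source strength Q, wind speed u, and effective stack height H."""
    return (Q / (2 * math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * (math.exp(-(z - H)**2 / (2 * sigma_z**2))
               + math.exp(-(z + H)**2 / (2 * sigma_z**2))))

random.seed(1)
# Hypothetical elicited uncertainty on the dispersion widths (lognormal):
samples = [gaussian_plume(Q=1.0, u=5.0,
                          sigma_y=random.lognormvariate(math.log(50), 0.3),
                          sigma_z=random.lognormvariate(math.log(20), 0.3),
                          y=0.0, z=0.0, H=30.0)
           for _ in range(10_000)]
samples.sort()
# Summarize the propagated concentration distribution by three quantiles:
q05, q50, q95 = (samples[int(p * len(samples))] for p in (0.05, 0.5, 0.95))
```

Comparing such propagated quantiles against the experts' directly elicited output quantiles is the sense in which the report "validates" the input distributions.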

  16. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The two new probabilistic accident consequence codes MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  17. Defense In-Depth Accident Analysis Evaluation of Tritium Facility Bldgs. 232-H, 233-H, and 234-H

    SciTech Connect

    Blanchard, A.

    1999-05-10

    The primary purpose of this report is to document a Defense-in-Depth (DID) accident analysis evaluation for Department of Energy (DOE) Savannah River Site (SRS) Tritium Facility Buildings 232-H, 233-H, and 234-H. The purpose of a DID evaluation is to provide a more realistic view of facility radiological risks to the offsite public than the bounding deterministic analysis documented in the Safety Analysis Report, which credits only Safety Class items in the offsite dose evaluation.

  18. Regulatory analysis for the resolution of Generic Issue 82, ''Beyond design basis accidents in spent fuel pools''

    Microsoft Academic Search

    Throm

    1989-01-01

    Generic Issue 82, ''Beyond Design Basis Accidents in Spent Fuel Pools,'' addresses the concerns with the use of high density storage racks for the storage of spent fuel, and is applicable to all Light Water Reactor spent fuel pools. This report presents the regulatory analysis for Generic Issue 82. It includes (1) a summary of the issue, (2) a summary

  19. TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions

    Microsoft Academic Search

    N. J. Lombardo; T. J. Marseille; M. D. White; P. S. Lowery

    1990-01-01

    TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes

  20. Analysis of the crush environment for lightweight air-transportable accident-resistant containers

    SciTech Connect

    McClure, J.D.; Hartman, W.F.

    1981-12-01

    This report describes the longitudinal dynamic crush environment for a Lightweight Air-Transportable Accident-Resistant Container (LAARC, now called PAT-2) that can be used to transport small quantities of radioactive material. The analysis of the crush environment involves evaluation of the forces imposed upon the LAARC package during the crash of a large, heavily loaded, cargo aircraft. To perform the analysis, a cargo load column was defined which consisted of a longitudinal prism of cargo of cross-sectional area equal to the projected area of the radioactive-material package and length equal to the longitudinal extent of the cargo compartment in a commercial cargo jet aircraft. To bound the problem, two analyses of the cargo load column were performed, a static stability analysis and a dynamic analysis. The results of these analyses can be applied to other packaging designs and suggest that the physical limits or magnitude of the longitudinal crush forces, which are controlled in part by the yield strength of the cargo and the package size, are much smaller than previously estimated.

  1. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    ERIC Educational Resources Information Center

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that though narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  2. Methodology for the survival analysis of urban building stocks

    Microsoft Academic Search

    Patrick Erik Bradley; Niklaus Kohler

    2007-01-01

    A new methodology is presented for estimating the age distribution and survival functions of an urban building stock. It allows a random sample of the undemolished stock, together with a complete inventory count of the demolished part, to be used to construct a Kaplan–Meier estimator for the survival function. This method can be applied to any building stock with difficult access to data.
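The estimator the abstract names can be sketched compactly. This is a hedged illustration with made-up data, not the paper's dataset: demolition is the event of interest, and still-standing buildings are right-censored at their age on the survey date.

```python
# Minimal Kaplan-Meier sketch for a building stock (hypothetical data).
def kaplan_meier(ages, demolished):
    """ages: building age at demolition or at survey; demolished: 1 = event, 0 = censored."""
    pairs = sorted(zip(ages, demolished))
    survival = 1.0
    curve = []  # (age, estimated survival probability just after that age)
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = sum(1 for a, d in pairs if a == t and d)   # demolitions at age t
        at_risk = sum(1 for a, _ in pairs if a >= t)        # buildings still in stock
        if deaths:
            survival *= 1.0 - deaths / at_risk
            curve.append((t, survival))
        i += sum(1 for a, _ in pairs if a == t)             # skip over ties
    return curve

# Hypothetical stock: demolitions at ages 10, 20, 30; censored at ages 20 and 40.
curve = kaplan_meier([10, 20, 20, 30, 40], [1, 1, 0, 1, 0])
```

Censored observations only shrink the at-risk set; they never contribute a drop in the curve, which is exactly why a sample of the undemolished stock plus a complete count of demolitions suffices.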

  3. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    ERIC Educational Resources Information Center

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  4. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    Microsoft Academic Search

    C. Y. Kimura; R. E. Glaser; R. W. Mensing; T. Lin; T. A. Haley; A. B. Barto; M. A. Stutzke

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE\\/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams, the data development team,

  5. Cost-Effectiveness Analysis of Prenatal Diagnosis: Methodological Issues and Concerns

    Microsoft Academic Search

    Aaron B. Caughey

    2005-01-01

    With increasing concerns regarding rapidly expanding health care costs, cost-effectiveness analysis (CEA) provides a methodology to assess whether marginal gains from new technology are worth the increased costs. In the arena of prenatal diagnosis, particular methodological and ethical concerns include whether the effects of such testing on individuals other than the patient are included, how termination of pregnancy is included

  6. International Workshop on Analysis Tools and Methodologies for Embedded and Real-time

    E-print Network

    Lipari, Giuseppe

    Research in the field of real-time and embedded systems would greatly benefit from the availability of analysis tools and methodologies. The 1st International Workshop on Analysis Tools and Methodologies for Embedded and Real-time Systems (WATERS

  7. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  8. An exploratory multinomial logit analysis of single-vehicle motorcycle accident severity

    Microsoft Academic Search

    Venkataraman Shankar; Fred Mannering

    1996-01-01

    Most previous research on motorcycle accident severity has focused on univariate relationships between severity and an explanatory variable of interest (e.g., helmet use). The potential ambiguity and bias that univariate analyses create in identifying the causality of severity has generated the need for multivariate analyses in which the effects of all factors that influence accident severity are considered. This paper
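The multivariate model class this paper argues for can be sketched in a few lines. The covariates and coefficient values below are purely illustrative assumptions, not the paper's estimated parameters; the point is the structure: each severity category gets a linear utility, and category probabilities are the softmax of those utilities.

```python
import math

severities = ["property damage only", "injury", "fatality"]
# Hypothetical coefficients (intercept, helmet_use, speed/100); the base
# category's utility is normalized to zero, as in a standard multinomial logit.
coefs = {
    "property damage only": (0.0, 0.0, 0.0),
    "injury":               (-0.5, -0.8, 1.5),
    "fatality":             (-2.0, -1.2, 3.0),
}

def severity_probabilities(helmet_use, speed):
    """Return P(severity category) for one accident observation."""
    x = (1.0, float(helmet_use), speed / 100.0)
    utilities = {k: sum(b * xi for b, xi in zip(c, x)) for k, c in coefs.items()}
    z = sum(math.exp(u) for u in utilities.values())       # softmax normalizer
    return {k: math.exp(u) / z for k, u in utilities.items()}
```

Because all covariates enter every category's utility, the model captures exactly the joint effects whose omission the authors argue biases univariate severity analyses.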

  9. Insights from review and analysis of the Fukushima Dai-ichi accident

    Microsoft Academic Search

    Masashi Hirano; Taisuke Yonomoto; Masahiro Ishigaki; Norio Watanabe; Yu Maruyama; Yasuteru Sibamoto; Tadashi Watanabe; Kiyofumi Moriyama

    2012-01-01

    An unprecedented earthquake and tsunami struck the Fukushima Dai-ichi Nuclear Power Plants on 11 March 2011. Although extensive efforts have been continuing on investigations into the causes and consequences of the accident, and the Japanese Government has presented a comprehensive report on the accident in the IAEA Ministerial Conference held in June 2011, there is still much to be clarified

  10. An analysis of Three Mile Island: the accident that shouldn't have happened

    Microsoft Academic Search

    E. Rubinstein; J. F. Mason

    1979-01-01

    The sequence of events in the nuclear reactor accident at Three Mile Island on March 28, 1979, is reported. Three problems thought to trigger the reactor accident were a persistent leak of reactor coolant, a closing of two valves in the auxillary feedwater system, and an apparent resin blockage in the transfer line that forced water back into the condensate

  11. Building Energy Performance Analysis of an Academic Building Using IFC BIM-Based Methodology

    E-print Network

    Aziz, Z.; Arayici, Y.; Shivachev, D.

    2012-01-01

    This paper discusses the potential to use an Industry Foundation Classes (IFC)/Building Information Modelling (BIM) based method to undertake Building Energy Performance analysis of an academic building. BIM/IFC based methodology provides a...

  12. Computational Methodologies for Transcript Analysis in the Age of Next-Generation DNA Sequencing

    E-print Network

    Gerstein, Mark

    New computational methods are required to take advantage of the burgeoning volumes of data

  13. Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology

    SciTech Connect

    J. Scaglione

    1999-09-09

    This report, ''Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology'', contains a summary of the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The objective of this report is to present a summary of the LCE analyses' results. These results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel with different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation have been discussed in the ''Disposal Criticality Analysis Methodology Topical Report'' (CRWMS M&O 1998a).

  14. A methodological procedure for the analysis of the Wenxian covenant texts

    E-print Network

    Williams, Crispin; ???

    2005-01-01

    AS/EA LIX•1•2005, S. 61–114. Crispin Williams, Dartmouth College. This article introduces a systematic methodological procedure for the analysis of Chinese… The Institute of Cultural Relics and Archaeology in Zhengzhou, where the tablets are housed, has provided ongoing support for the project, as has the Cultural Relics Bureau at both the provincial and national level.

  15. Transient analysis for thermal margin with COASISO during a severe accident

    SciTech Connect

    Kim, Chan S.; Chu, Ho S.; Suh, Kune Y.; Park, Goon C.; Lee, Un C. [Seoul National University, San 56-1, Sillim-Dong, Kwanak-Gu, Seoul, 151-742 (Korea, Republic of); Yoon, Ho J. [Purdue University, West Lafayette, IN 47907 (United States)

    2002-07-01

    As an IVR-EVC (in-vessel retention through external vessel cooling) design concept, external cooling of the reactor vessel was suggested to protect the lower head from being overheated by material relocated from the core during a severe accident. The COASISO (Corium Attack Syndrome Immunization Structure Outside the vessel) adopts an external vessel cooling strategy of flooding the reactor vessel inside the thermal insulator. Its advantage is a quick response time, so that the initial heat removal mechanism of the EVC is nucleate boiling from the downward-facing lower head. The efficiency of the COASISO may be estimated by the thermal margin, defined as the ratio of the actual heat flux from the reactor vessel to the critical heat flux (CHF). In this study the thermal margin for a large power reactor, the APR1400 (Advanced Power Reactor 1400 MWe), was determined by means of a transient analysis of the local coolant conditions and the temperature distributions within the reactor vessel. The heat split fraction in the oxide pool and the metal layer focusing effect were considered in calculating the angular thermal load at the inner wall of the lower head. The temperature distributions in the reactor vessel yielded the actual heat flux on the outer wall. The local quality was obtained by solving the simplified transient energy equation. The unheated section of the reactor vessel decreases the thermal margin by means of two-dimensional conduction heat transfer. The peak temperature of the reactor vessel was estimated in the film boiling region, where the thermal margin equals unity. Sensitivity analyses were performed for the time of corium relocation after the reactor trip, the coolant flow rate, and the initial subcooled condition of the coolant. No vessel failure is predicted at the worst EVC condition when stratification between the metal layer and the oxidic pool is not taken into account. The present predictive tool may be implemented in a severe accident analysis code such as MAAP4 for external vessel cooling with the COASISO. (authors)
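The thermal-margin criterion the abstract defines can be illustrated numerically. The angular heat-flux and CHF values below are assumptions for illustration only, not results from the APR1400 analysis.

```python
# Thermal margin = local wall heat flux / local critical heat flux (CHF).
# Nucleate boiling is preserved while the margin stays below unity; film
# boiling is first predicted where it reaches 1.0.
def thermal_margin(q_wall, q_chf):
    return q_wall / q_chf

# Illustrative angular profile on the lower head (angle in degrees from the
# bottom center; fluxes in MW/m^2, values assumed):
profile = [(0, 0.3, 1.2), (45, 0.6, 1.4), (90, 1.5, 1.5)]
margins = {angle: thermal_margin(q, chf) for angle, q, chf in profile}
```

In this made-up profile the margin grows toward the vessel's equator, mirroring the focusing-effect concern the abstract describes: the limiting location is where the margin first touches unity.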

  16. The Role of Materials Degradation and Analysis in the Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

    The efforts following the loss of the Space Shuttle Columbia included debris recovery, reconstruction, and analysis. The debris was subjected to myriad quantitative and semiquantitative chemical analysis techniques, ranging from examination via the scanning electron microscope (SEM) with energy dispersive spectrometer (EDS) to X-ray diffraction (XRD) and electron probe micro-analysis (EPMA). The results from the work with the debris helped the investigators determine the location where a breach likely occurred in the leading edge of the left wing during liftoff of the Orbiter from the Kennedy Space Center. Likewise, the information evidenced by the debris was also crucial in ascertaining the path of impinging plasma flow once it had breached the wing. After the Columbia Accident Investigation Board (CAIB) issued its findings, the major portion of the investigation was concluded. However, additional work remained to be done on many pieces of debris from portions of the Orbiter which were not directly related to the initial impact during ascent. This subsequent work was not only performed in the laboratory, but was also performed with portable equipment, including examination via portable X-ray fluorescence (XRF) and Fourier transform infrared spectroscopy (FTIR). Likewise, acetate and silicone-rubber replicas of various fracture surfaces were obtained for later macroscopic and fractographic examination. This paper will detail the efforts and findings from the initial investigation, as well as present results obtained by the later examination and analysis of debris from the Orbiter including its windows, bulkhead structures, and other components which had not been examined during the primary investigation.

  17. Participatory Analysis of Synchronous Collaborative Problem Solving using the OCAF methodology and tools

    Microsoft Academic Search

    Nikos Avouris; Angelique Dimitracopoulou; Vassilis Komis; Meletis Margaritis

    2003-01-01

    This interactive event aims at introducing the participants to the analysis of collaborative problem-solving activities, in which they will first be involved themselves, using the OCAF (Object-oriented Collaboration Analysis Framework) methodology and collaboration analysis tools. Overview: This interactive event is planned to take place in a computer laboratory. It evolves in three phases: During the first stage, the participants

  18. An Improvement on Horn's Parallel Analysis Methodology for Selecting the Correct Number of Factors to Retain

    Microsoft Academic Search

    Louis W. Glorfeld

    1995-01-01

    One of the most important decisions that can be made in the use of factor analysis is the number of factors to retain. Numerous studies have consistently shown that Horn's parallel analysis is the most nearly accurate methodology for determining the number of factors to retain in an exploratory factor analysis. Although Horn's procedure is relatively accurate, it still tends
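
    The retention rule behind Horn's procedure, including the percentile refinement this record discusses, can be sketched as follows. The synthetic data, simulation count, and 95th-percentile cutoff are illustrative assumptions, not values from the paper:

```python
import numpy as np

def parallel_analysis(data, n_sims=200, percentile=95, seed=0):
    """Horn's parallel analysis with a percentile retention criterion.

    Retain factors whose observed correlation-matrix eigenvalue exceeds the
    chosen percentile of eigenvalues from random normal data of the same
    shape (the 95th percentile follows Glorfeld's refinement; Horn's
    original rule uses the mean of the simulated eigenvalues).
    """
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    sims = np.empty((n_sims, p))
    for i in range(n_sims):
        r = rng.standard_normal((n, p))
        sims[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresholds = np.percentile(sims, percentile, axis=0)
    return int(np.sum(obs > thresholds))

# Synthetic check: six variables driven by two latent factors plus noise
rng = np.random.default_rng(1)
factors = rng.standard_normal((500, 2))
loadings = np.array([[0.9, 0.8, 0.7, 0.0, 0.0, 0.0],
                     [0.0, 0.0, 0.0, 0.9, 0.8, 0.7]])
data = factors @ loadings + 0.5 * rng.standard_normal((500, 6))
print(parallel_analysis(data))  # → 2
```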

  19. Methodology for Multifractal Analysis of Heart Rate Variability: From LF/HF Ratio to Wavelet Leaders

    E-print Network

    Gonçalves, Paulo

    An introduction to the practical use of wavelet-Leader-based multifractal analysis to study heart rate variability, with comparison to other standard characterizations of heart rate variability: (mono)fractal analysis, Hurst exponent

  20. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.

  1. Behavior of an heterogeneous annular FBR core during an unprotected loss of flow accident: Analysis of the primary phase with SAS-SFR

    SciTech Connect

    Massara, S.; Schmitt, D.; Bretault, A.; Lemasson, D.; Darmet, G.; Verwaerde, D. [EDF R and D, 1, Avenue du General de Gaulle, 92141 Clamart (France); Struwe, D.; Pfrang, W.; Ponomarev, A. [Karlsruher Institut fuer Technologie KIT, Institut fuer Neutronenphysik und Reaktortechnik INR, Hermann-von-Helmholtz-Platz 1, Gebaude 521, 76344 Eggenstein-Leopoldshafen (Germany)

    2012-07-01

    In the framework of a substantial improvement of FBR core safety connected to the development of a new Gen IV reactor type, heterogeneous cores with innovative features have been carefully analyzed in France since 2009. At EDF R and D, the main goal is to understand whether a strong reduction of the Na-void worth - possibly attaining a negative value - allows a significant improvement of the core behavior during an unprotected loss-of-flow accident. The physical behavior of such a core, before and beyond the (possible) onset of Na boiling, is also of interest. Hence, a cutting-edge heterogeneous design, featuring an annular shape, Na plena with a B{sub 4}C plate, and a stepwise modulation of fissile core heights, was developed at EDF by means of the SDDS methodology, with a total Na-void worth of -1 $. The behavior of such a core during the primary phase of a severe accident, initiated by an unprotected loss of flow, is analyzed by means of the SAS-SFR code. This study is carried out at KIT and EDF, in the framework of a scientific collaboration on innovative FBR severe accident analyses. The results show that the reduction of the Na-void worth is very effective, but is not by itself sufficient to avoid Na boiling and, hence, to prevent the core from entering the primary phase of a severe accident. Nevertheless, the grace time up to boiling onset is greatly enhanced in comparison with a more traditional homogeneous core design, and only an extremely low fraction of the fuel (<0.1%) melts by the end of this phase. A sensitivity analysis shows that, due to the inherent neutronic characteristics of such a core, the gagging scheme plays a major role in the core behavior: indeed, an improved 4-zone gagging scheme, associated with an enhanced control-rod drive-line expansion feedback effect, finally prevents the core from entering sodium boiling.
This major conclusion highlights both the progress already accomplished and the need for more detailed future analyses, particularly concerning the neutronic burn-up scheme, the modeling of the diagrid effect and the control-rod drive-line expansion feedbacks, and the primary/secondary systems' thermal-hydraulic behavior. (authors)

  2. Methodology for Computer-Aided Fault Tree Analysis

    Microsoft Academic Search

    R. Ferdous; F. I. Khan; B. Veitch; P. R. Amyotte

    2007-01-01

    Fault tree analysis is a systematic, deductive and probabilistic risk assessment tool which elucidates the causal relations leading to a given undesired event. Quantitative fault tree (failure) analysis requires a fault tree and failure data of basic events. Development of a fault tree and subsequent analysis require a great deal of expertise, which may not be available all the time.
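
    The quantitative step the abstract refers to — propagating basic-event failure probabilities through the gate logic up to the top event — can be sketched as follows. The tree shape and the probabilities are hypothetical, chosen only to illustrate the arithmetic:

```python
def or_gate(probs):
    # P(at least one event) = 1 - prod(1 - p_i), assuming independence
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

def and_gate(probs):
    # P(all events) = prod(p_i), assuming independence
    out = 1.0
    for p in probs:
        out *= p
    return out

# Hypothetical tree: TOP = OR(AND(pump_a, pump_b), valve)
pump_a, pump_b, valve = 1e-2, 1e-2, 1e-3
top = or_gate([and_gate([pump_a, pump_b]), valve])
print(top)  # ≈ 1.1e-3: dominated by the single valve failure
```

The same gate functions extend to any tree by nesting; for rare events the OR gate reduces to the familiar sum-of-probabilities approximation.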

  3. Advanced neutron source reactor conceptual safety analysis report, three-element-core design: Chapter 15, accident analysis

    SciTech Connect

    Chen, N.C.J.; Wendel, M.W.; Yoder, G.L.; Harrington, R.M.

    1996-02-01

    In order to utilize reduced enrichment fuel, the three-element-core design for the Advanced Neutron Source has been proposed. The proposed core configuration consists of inner, middle, and outer elements, with the middle element offset axially beneath the inner and outer elements, which are axially aligned. The three-element-core RELAP5 model assumes that the reactor hardware is changed only within the core region, so that the loop piping, heat exchangers, and pumps remain as assumed for the two-element-core configuration. To assess the impact of changes in the core region configuration and the thermal-hydraulic steady-state conditions, the safety analysis has been updated. This report gives the safety margins for the loss-of-off-site power and pressure-boundary fault accidents based on the RELAP5 results. All margins are greater for the three-element-core simulations than those calculated for the two-element core.

  4. The effect of gamma-ray transport on afterheat calculations for accident analysis

    SciTech Connect

    Reyes, S.; Latkowski, J.F.; Sanz, J.

    2000-05-01

    Radioactive afterheat is an important source term for the release of radionuclides in fusion systems under accident conditions. Heat transfer calculations are used to determine time-temperature histories in regions of interest, but the true source term needs to be the effective afterheat, which accounts for the transport of penetrating gamma rays. Without consideration of photon transport, accident temperatures may be overestimated in some regions and underestimated in others. The importance of this effect is demonstrated for a simple, one-dimensional problem. The significance of this effect depends strongly on the accident scenario being analyzed.
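
    The effect — locally deposited afterheat falling short of the total because penetrating gammas escape the region — can be illustrated with an uncollided exponential-attenuation estimate. The attenuation coefficient, thicknesses, and gamma source term below are assumptions for illustration only, and buildup is neglected:

```python
import math

mu = 0.5            # 1/cm, assumed gamma attenuation coefficient
gamma_heat = 100.0  # W, assumed gamma contribution to decay heat

for t_cm in (1.0, 5.0, 10.0):
    # Effective afterheat deposited locally; the remainder escapes the region
    deposited = gamma_heat * (1.0 - math.exp(-mu * t_cm))
    escaped = gamma_heat - deposited
    print(f"{t_cm:5.1f} cm: deposited {deposited:6.2f} W, escaped {escaped:5.2f} W")
```

For thin regions most of the gamma energy escapes, so using the raw afterheat as a local source overestimates the temperature there and underestimates it wherever that energy is actually absorbed.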

  5. Lower head creep rupture failure analysis associated with alternative accident sequences of the Three Mile Island Unit 2

    SciTech Connect

    Sang Lung, Chan [Swiss Federal Institute of Technology Zurich and Swiss Federal Nuclear Safety Inspectorate, Zurich, Switzerland, 8001 (Switzerland)

    2004-07-01

    The objective of this lower head creep rupture analysis is to assess the current version of MELCOR 1.8.5-RG against SCDAP/RELAP5 MOD 3.3kz. The purpose of this assessment is to investigate the current MELCOR in-vessel core damage progression phenomena, including the model for the formation of a molten pool. The model for stratified molten pool natural heat transfer will be included in the next MELCOR release. Presently, MELCOR excludes the gap heat-transfer model for the cooling associated with the narrow gap between the debris and the lower head vessel wall. All these phenomenological models are already treated in SCDAP/RELAP5, which uses the COUPLE code to model the heat transfer of the relocated debris with the lower head based on a two-dimensional finite-element method. The assessment should determine if current MELCOR capabilities adequately cover core degradation phenomena appropriate for the consolidated MELCOR code. Inclusion of these features should bring MELCOR much closer to a state of parity with SCDAP/RELAP5 and is a currently underway element of the MELCOR code consolidation effort. This assessment deals with the following analysis of the Three Mile Island Unit 2 (TMI-2) alternative accident sequences. The TMI-2 alternative accident sequence-1 includes the continuation of the base case of the TMI-2 accident with the Reactor Coolant Pumps (RCP) tripped and the High Pressure Injection System (HPIS) throttled after approximately 6000 s accident time, while in the TMI-2 alternative accident sequence-2, the reactor coolant pumps are tripped after 6000 s and the HPIS is activated after 12,012 s. The lower head temperature distributions calculated with SCDAP/RELAP5 are visualized and animated with the open-source visualization freeware 'OpenDX'. (author)

  6. Analysis of criticality accident alarm system coverage in the X-700, X-705, and X-720 facilities at the Portsmouth Gaseous Diffusion plant

    SciTech Connect

    Skapik, C.W.; Dobelbower, M.C.; Woollard, J.E. [and others]

    1995-12-01

    Additional services for the uranium enrichment cascade process, such as maintenance and decontamination operations, are provided by several ancillary facilities at the PORTS site. These facilities include the X-700 Maintenance Facility, the X-705 Decontamination Facility, and the X-720 Maintenance and Stores Facility. As uranium operations are performed within these facilities, the potential for a criticality accident exists. In the event of a criticality accident within one of these facilities at PORTS, a Criticality Accident Alarm System (CAAS) is in place to detect the criticality accident and sound an alarm. In this report, an analysis was performed to provide verification that the existing CAAS at PORTS provides complete criticality accident coverage in the X-700, X-705, and X-720 facilities. The analysis has determined that the X-705 and X-720 facilities have complete CAAS coverage; the X-700 facility has not been shown to have complete CAAS coverage at this time.

  7. Task analysis of nuclear-power-plant control-room crews: project approach methodology

    Microsoft Academic Search

    D. Burgy; C. Lempges; A. Miller; L. Schroeder; H. Van Cott; B. Paramore

    1983-01-01

    A task analysis of nuclear-power-plant control-room crews was performed by General Physics Corporation and BioTechnology, Inc., for the Office of Nuclear Regulatory Research. The task-analysis methodology used in the project is discussed and compared to traditional task-analysis and job-analysis methods. The objective of the project was to conduct a crew task analysis that would provide data for evaluating six areas:

  8. TRAC-BF1/MOD1: An advanced best-estimate computer program for BWR accident analysis: User's guide

    SciTech Connect

    Rettig, W.H.; Wade, N.L. (eds.) (EG and G Idaho, Inc., Idaho Falls, ID (United States))

    1992-06-01

    The TRAC-BWR code development program at the Idaho National Engineering Laboratory has developed versions of the Transient Reactor Analysis Code (TRAC) for the US Nuclear Regulatory Commission and the public. The TRAC-BF1/MOD1 version of the computer code provides a best-estimate analysis capability for analyzing the full range of postulated accidents in boiling water reactor (BWR) systems and related facilities. This version provides a consistent and unified analysis capability for analyzing all areas of a large- or small-break loss-of-coolant accident (LOCA), beginning with the blowdown phase and continuing through heatup, reflood with quenching, and, finally, the refill phase of the accident. Also provided is a basic capability for the analysis of operational transients up to and including anticipated transients without scram (ATWS). The TRAC-BF1/MOD1 version produces results consistent with previous versions. Assessment calculations using the two TRAC-BF1 versions show overall improvements in agreement with data and computation times as compared to earlier versions of the TRAC-BWR series of computer codes.

  9. Severe accident analysis of TMI-1 seal LOCA scenario using MELCOR 1.8.2

    SciTech Connect

    Alammar, M.A. [GPU Nuclear Co., Parsippany, NJ (United States)

    1994-12-31

    The pump seal LOCA scenario for the Three Mile Island Unit 1 Nuclear Power Plant is analyzed using the NRC's MELCOR 1.8.2 code. This scenario was a major contributor to containment failure for the LOCA group as a result of the IPE Level 2 analysis, which was done using EPRI's MAAP3B code. The main purpose of this paper is to see if conclusions would have been different with regard to the impact of this scenario on containment performance had MELCOR been used instead of MAAP3B. The major areas addressed were in-vessel and ex-vessel phenomena. For the in-vessel part, three major stages of a severe accident were investigated, namely (1) thermal-hydraulic behavior before core uncovery; (2) core heatup, relocation, and hydrogen generation; and (3) lower head failure. For the ex-vessel part, the following were addressed: (1) corium-concrete interaction; (2) containment failure; and (3) source term release. It is shown that the same conclusions are reached with regard to containment performance and its impact on Level 2 results.

  10. Quantifying Drosophila food intake: comparative analysis of current methodology

    PubMed Central

    Deshpande, Sonali A.; Carvalho, Gil B.; Amador, Ariadna; Phillips, Angela M.; Hoxha, Sany; Lizotte, Keith J.; Ja, William W.

    2014-01-01

    Food intake is a fundamental parameter in animal studies. Despite the prevalent use of Drosophila in laboratory research, precise measurements of food intake remain challenging in this model organism. Here, we compare several common Drosophila feeding assays: the Capillary Feeder (CAFE), food-labeling with a radioactive tracer or a colorimetric dye, and observations of proboscis extension (PE). We show that the CAFE and radioisotope-labeling provide the most consistent results, have the highest sensitivity, and can resolve differences in feeding that dye-labeling and PE fail to distinguish. We conclude that performing the radiolabeling and CAFE assays in parallel is currently the best approach for quantifying Drosophila food intake. Understanding the strengths and limitations of food intake methodology will greatly advance Drosophila studies of nutrition, behavior, and disease. PMID:24681694

  11. Aerodynamic configuration design using response surface methodology analysis

    NASA Technical Reports Server (NTRS)

    Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; Mcmillin, Mark M.; Unal, Resit

    1993-01-01

    An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.
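
    The RSM workflow the abstract describes — evaluate a response on a central composite design, fit a second-order surface, and locate its stationary point — can be sketched for two factors. The "dry weight" surface below is a stand-in analytic function with a known minimum, not the study's actual vehicle weight model:

```python
import numpy as np

# Two-factor central composite design: factorial, axial, and center points
alpha = np.sqrt(2.0)
design = np.array(
    [[-1, -1], [1, -1], [-1, 1], [1, 1],                # factorial points
     [-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha],  # axial points
     [0, 0], [0, 0], [0, 0]], dtype=float)              # center replicates

def true_response(x1, x2):
    # Illustrative "dry weight" surface with its minimum at (0.5, -0.25)
    return 3.0 + (x1 - 0.5) ** 2 + 2.0 * (x2 + 0.25) ** 2

x1, x2 = design[:, 0], design[:, 1]
y = true_response(x1, x2)

# Least-squares fit of y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted quadratic: solve grad = 0
b1, b2, b11, b22, b12 = coef[1:]
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
xs = np.linalg.solve(H, -np.array([b1, b2]))
print(xs)  # ≈ [0.5, -0.25]
```

Because the stand-in response is exactly quadratic, the fit recovers the optimum exactly; with real analysis data the fitted surface is an approximation and the stationary point is a candidate design to be verified.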

  12. Methodologic research needs in environmental epidemiology: data analysis.

    PubMed Central

    Prentice, R L; Thomas, D

    1993-01-01

    A brief review is given of data analysis methods for the identification and quantification of associations between environmental exposures and health events of interest. Data analysis methods are outlined for each of the study designs mentioned, with an emphasis on topics in need of further research. Particularly noted are the need for improved methods for accommodating exposure assessment measurement errors in analytic epidemiologic studies and for improved methods for the conduct and analysis of aggregate data (ecologic) studies. PMID:8206041

  13. Preclosure radiological safety analysis for accident conditions of the potential Yucca Mountain Repository: Underground facilities; Yucca Mountain Site Characterization Project

    SciTech Connect

    Ma, C.W.; Sit, R.C.; Zavoshy, S.J.; Jardine, L.J. [Bechtel National, Inc., San Francisco, CA (United States); Laub, T.W. [Sandia National Labs., Albuquerque, NM (United States)

    1992-06-01

    This preliminary preclosure radiological safety analysis assesses the scenarios, probabilities, and potential radiological consequences associated with postulated accidents in the underground facility of the potential Yucca Mountain repository. The analysis follows a probabilistic-risk-assessment approach. Twenty-one event trees resulting in 129 accident scenarios are developed. Most of the scenarios have estimated annual probabilities ranging from 10{sup {minus}11}/yr to 10{sup {minus}5}/yr. The study identifies 33 scenarios that could result in offsite doses over 50 mrem and that have annual probabilities greater than 10{sup {minus}9}/yr. The largest offsite dose is calculated to be 220 mrem, which is less than the 500 mrem value used to define items important to safety in 10 CFR 60. The study does not include an estimate of uncertainties; therefore, conclusions or decisions made as a result of this report should be made with caution.
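
    The event-tree quantification and frequency screening described above can be sketched in miniature: each sequence frequency is the initiating-event frequency times the success/failure split at each branch point, and sequences below a cutoff are screened out. The initiator, the three mitigating systems, and all probabilities below are hypothetical, not the report's values:

```python
from itertools import product

init_freq = 1.0e-3  # per year, hypothetical initiating event
branch_fail = {"confinement": 1e-2, "filtration": 5e-2, "interlock": 1e-4}

sequences = []
for outcome in product((False, True), repeat=len(branch_fail)):
    freq = init_freq
    for (name, p_fail), failed in zip(branch_fail.items(), outcome):
        freq *= p_fail if failed else (1.0 - p_fail)
    sequences.append((outcome, freq))

# Sanity check: branch probabilities sum to 1, so sequence frequencies
# must sum back to the initiator frequency
total = sum(freq for _, freq in sequences)

# Screening step: retain only sequences above a 1e-9/yr cutoff
credible = [(o, f) for o, f in sequences if f > 1e-9]
print(len(sequences), len(credible), total)
```

With these numbers, two of the eight sequences (those requiring the rare interlock failure together with another failure) fall below the cutoff and are screened out.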

  14. Safety Analysis of Small Break Loss of Coolant Accident for 1200 MWe Simplified Boiling Water Reactor (SBWR-1200 BDLB)

    SciTech Connect

    Xu, Y.; Revankar, S.T.; Ishii, M. [School of Nuclear Engineering, Purdue University, West Lafayette, IN 47907-1290 (United States)

    2002-07-01

    The objective of this research is to assess the performance of the safety systems during a small break loss of coolant accident (SBLOCA) transient in the full-size SBWR. RELAP5/MOD3 was used to simulate the blow-down and long-term cooling responses of the various safety systems during the accident transient. An integral test for long-term cooling under low pressure was conducted in a scaled facility with the initial conditions given by the code simulation. The code applicability and the facility scalability were evaluated by comparing the test data with the code simulations. The scaling analysis was done by comparing the prototype code predictions with the scaled-up test data, using the proper scaling multipliers and time shifting. The good agreement between the major safety parameters demonstrates the applicability of the RELAP5/MOD3 code and the scalability of the facility for SBWR-1200 safety analysis applications. (authors)

  15. Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology

    NASA Technical Reports Server (NTRS)

    Atkins, H. L.

    1997-01-01

    The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods of degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.
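
    The shock-capturing setting can be reproduced in miniature: for degree 0 the DG method reduces to a finite-volume scheme, and a generic local Lax-Friedrichs flux (not the paper's new flux formulas) already captures a Burgers shock. The grid size, final time, and step data below are arbitrary choices:

```python
import numpy as np

# Degree-0 discontinuous Galerkin (equivalent to finite volume) for Burgers'
# equation u_t + (u^2/2)_x = 0 with a local Lax-Friedrichs numerical flux.
n, T = 200, 0.4
dx = 1.0 / n
x = (np.arange(n) + 0.5) * dx
u = np.where(x < 0.25, 1.0, 0.0)  # step data: exact shock speed is 1/2

def llf(ul, ur):
    # central flux 0.5*(f(ul)+f(ur)) plus dissipation scaled by the wave speed
    c = np.maximum(np.abs(ul), np.abs(ur))
    return 0.25 * (ul**2 + ur**2) - 0.5 * c * (ur - ul)

t, dt = 0.0, 0.4 * dx  # CFL-limited time step for |u| <= 1
while t < T:
    up = np.concatenate(([1.0], u, [0.0]))  # inflow/outflow ghost cells
    f = llf(up[:-1], up[1:])                # n+1 interface fluxes
    u -= (dt / dx) * (f[1:] - f[:-1])
    t += dt

# shock should sit near x = 0.25 + T/2 = 0.45, captured over a few cells
print(u[int(0.35 * n)], u[int(0.55 * n)])
```

Higher-degree DG adds per-cell polynomial modes and a volume integral on top of these interface fluxes; the flux choice at the interfaces is exactly where the paper's local analysis operates.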

  16. Angular VGA and Cellular VGA An exploratory study for spatial analysis methodology based on human movement behavior

    Microsoft Academic Search

    Minseok Kim; Jaepil Choi

    2009-01-01

    In the Space Syntax field, moving distance and turning angle are the two main factors of human movement behavior that have received attention. Several spatial analysis methodologies applying these factors to existing graph-based methodologies have been published during the past ten years. One of the most remarkable models is the Angular Segment Analysis (ASA) methodology for successfully

  17. Prospective Analysis of Factors Associated with Work Reentry in Patients with Accident-Related Injuries

    Microsoft Academic Search

    Corinna Lange; Markus Burgmer; Michael Braunheim; Gereon Heuft

    2007-01-01

    Introduction: The objective of this study was to investigate the influence of accidents, their physical and psychological consequences, the patient's predisposition, as well as work-related cognitions on return to work (RTW) post accident. Despite the costs of time off from work after accidental injuries, very few investigations have been carried out so far. Method: In a consecutive sample, 163 patients were

  18. Sleep, watchkeeping and accidents: a content analysis of incident at sea reports

    Microsoft Academic Search

    Richard Phillips

    2000-01-01

    The unique profession of seafaring involves rest and sleep in a 24-h-a-day work environment that usually involves time-zone crossings, noise, heat, cold and motion. Sleep under such conditions is often difficult to obtain, and sleeping and sleep loss are often related to fatigue and contributory to accidents. This study aims to determine how accident investigators report sleep in Incident at

  19. The first steps towards a standardized methodology for CSP electricity yield analysis.

    SciTech Connect

    Wagner, Michael (National Renewable Energy Laboratories, Golden, CO); Hirsch, Tobias (German Aerospace Center (DLR), Institute of Technical Thermodynamics, Stuttgart,Germany); Benitez, Daniel (Flagsol, Cologne, Germany); Eck, Markus (German Aerospace Center (DLR), Institute of Technical Thermodynamics, Stuttgart,Germany); Ho, Clifford Kuofei

    2010-08-01

    The authors have founded a temporary international core team to prepare a SolarPACES activity aimed at the standardization of a methodology for electricity yield analysis of CSP plants. This core team has drafted a structural framework for a standardized methodology and the standardization process itself. The structural framework has to assure that the standardized methodology is applicable to all conceivable CSP systems, can be used on all levels of the project development process and covers all aspects affecting the electricity yield of CSP plants. Since the development of the standardized methodology is a complex task, the standardization process has been structured in work packages, and numerous international experts covering all aspects of CSP yield analysis have been asked to contribute to this process. These experts have teamed up in an international working group with the objective to develop, document and publish standardized methodologies for CSP yield analysis. This paper summarizes the intended standardization process and presents the structural framework of the methodology for CSP yield analysis.

  20. Biomechanical analysis of occupant kinematics in rollover motor vehicle accidents: dynamic spit test.

    PubMed

    Sances, Anthony; Kumaresan, Srirangam; Clarke, Richard; Herbst, Brian; Meyer, Steve

    2005-01-01

    A better understanding of occupant kinematics in rollover accidents helps to advance biomechanical knowledge and to enhance the safety features of motor vehicles. While many rollover accident simulation studies have adopted the static approach to delineate the occupant kinematics in rollover accidents, very few studies have attempted the dynamic approach. The present work was designed to study the biomechanics of restrained occupants during rollover accidents using the steady-state dynamic spit test and to address the importance of keeping the lap belt fastened. Experimental tests were conducted using an anthropometric 50% Hybrid III dummy in a vehicle. The vehicle was rotated at 180 degrees/second and the dummy was restrained using a standard three-point restraint system. The lap belt of the dummy was fastened either by using the cinching latch plate or by locking the retractor. Three configurations of shoulder belt harness were simulated: shoulder belt loose on chest with cinch plate, shoulder belt under the left arm and shoulder belt behind the chest. In all tests, the dummy stayed within the confinement of the vehicle indicating that the securely fastened lap belt holds the dummy with dynamic movement of 3 1/2" to 4". The results show that occupant movement in rollover accidents is least affected by various shoulder harness positions with a securely fastened lap belt. The present study forms a first step in delineating the biomechanics of occupants in rollover accidents. PMID:15850090

  1. Methodology for computer aided fuzzy fault tree analysis

    Microsoft Academic Search

    Refaul Ferdous; Faisal Khan; Brian Veitch; Paul R. Amyotte

    2009-01-01

    Probabilistic risk assessment (PRA) is a comprehensive, structured and logical analysis method aimed at identifying and assessing risks of complex process systems. PRA uses fault tree analysis (FTA) as a tool to identify basic causes leading to an undesired event, to represent logical dependency of these basic causes in leading to the event, and finally to calculate the probability of
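
    When basic-event probabilities are too uncertain for crisp values, fuzzy FTA represents them as fuzzy numbers. A common engineering approximation for triangular fuzzy numbers (low, modal, high) propagates each component through the crisp gate formulas; the tree and the numbers below are hypothetical:

```python
# Triangular fuzzy numbers as (low, modal, high) probability triples.
def f_and(a, b):
    # component-wise product approximation for a fuzzy AND gate
    return tuple(x * y for x, y in zip(a, b))

def f_or(a, b):
    # component-wise 1-(1-x)(1-y) approximation for a fuzzy OR gate
    return tuple(1.0 - (1.0 - x) * (1.0 - y) for x, y in zip(a, b))

pump = (0.008, 0.010, 0.012)     # hypothetical basic-event probabilities
backup = (0.040, 0.050, 0.060)
valve = (0.0008, 0.0010, 0.0012)

top = f_or(f_and(pump, backup), valve)  # TOP = (pump AND backup) OR valve
print(top)  # modal value ≈ 1.5e-3, with the low/high spread carried along
```

The width of the resulting triple expresses how basic-event uncertainty accumulates at the top event, which is the information a crisp FTA discards.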

  2. On the Application of Syntactic Methodologies in Automatic Text Analysis.

    ERIC Educational Resources Information Center

    Salton, Gerard; And Others

    1990-01-01

    Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…

  3. Development of methodology for horizontal axis wind turbine dynamic analysis

    NASA Technical Reports Server (NTRS)

    Dugundji, J.

    1982-01-01

    Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbine; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.

  4. APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

  5. Numerical estimation methodology for RFID/Active Implantable Medical Device-EMI based upon FDTD analysis

    Microsoft Academic Search

    Takashi Hikage; Yoshifumi Kawamura; Toshio Nojima

    2011-01-01

    A numerical estimation methodology for RFID/Active Implantable Medical Device (AIMD) EMI based upon FDTD analysis is presented. This methodology can be applied to low-band RFID. In this paper, an example for an RFID interrogator in the frequency band of 13.56 MHz (ISO/IEC 18000-3 MODE 1) is shown. It assumes that RFID interrogators operating in the low or HF frequency bands
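
    A full RFID/AIMD interference model is well beyond a snippet, but the FDTD core such a methodology rests on — leapfrog updates of electric and magnetic fields on staggered grids — can be sketched in one dimension. Normalized units, grid size, and the Gaussian source are arbitrary choices here, unrelated to the paper's model:

```python
import numpy as np

# 1-D FDTD leapfrog sketch in normalized units (Courant number = 1):
# Ez and Hy live on staggered grids and are updated alternately in time.
n_cells, n_steps = 400, 250
ez = np.zeros(n_cells)
hy = np.zeros(n_cells - 1)

for step in range(n_steps):
    hy += np.diff(ez)        # update H from the spatial difference of E
    ez[1:-1] += np.diff(hy)  # update E from the spatial difference of H
    ez[200] += np.exp(-((step - 30.0) / 10.0) ** 2)  # soft Gaussian source

print(float(np.abs(ez).max()))
```

In the real methodology the same leapfrog kernel runs in three dimensions over a discretized interrogator antenna and body/implant model, with material coefficients replacing the normalized unit factors.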

  6. The impact of methodological factors on child psychotherapy outcome research: A meta-analysis for researchers

    Microsoft Academic Search

    Bahr Weiss; John R. Weisz

    1990-01-01

    Two recent meta-analyses have generated evidence for child and adolescent psychotherapy effects. However, critics note that such meta-analyses often include studies with methodological shortcomings which might invalidate their results. In the present study, we explored whether the results of the most extensive child/adolescent meta-analysis might have been influenced by such methodological variables, focusing on internal validity and external validity factors.

  7. Progressive Failure Analysis Methodology for Laminated Composite Structures

    NASA Technical Reports Server (NTRS)

    Sleight, David W.

    1999-01-01

    A progressive failure analysis method has been developed for predicting the failure of laminated composite structures under geometrically nonlinear deformations. The progressive failure analysis uses C(exp 1) shell elements based on classical lamination theory to calculate the in-plane stresses. Several failure criteria, including the maximum strain criterion, Hashin's criterion, and Christensen's criterion, are used to predict the failure mechanisms, and several options are available to degrade the material properties after failures. The progressive failure analysis method is implemented in the COMET finite element analysis code and can predict the damage and response of laminated composite structures from initial loading to final failure. The different failure criteria and material degradation methods are compared and assessed by performing analyses of several laminated composite structures. Results from the progressive failure method indicate good correlation with the existing test data except in structural applications where interlaminar stresses are important, which may cause failure mechanisms such as debonding or delaminations.
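
    One ingredient of such an analysis — the maximum strain criterion paired with a ply-discount degradation of material properties — can be shown in a toy form. The strain allowables and the 0.01 knockdown factor below are illustrative assumptions, not the paper's values:

```python
# Maximum-strain ply failure check with a simple ply-discount degradation.
ALLOWABLES = {"eps1": 0.010, "eps2": 0.004, "gamma12": 0.015}  # assumed limits

def check_ply(eps1, eps2, gamma12, props, knockdown=0.01):
    """Flag failure modes by comparing ply strains to allowables, then
    degrade the corresponding stiffnesses (ply-discount approach)."""
    modes = []
    if abs(eps1) > ALLOWABLES["eps1"]:
        modes.append("fiber")
    if abs(eps2) > ALLOWABLES["eps2"]:
        modes.append("matrix")
    if abs(gamma12) > ALLOWABLES["gamma12"]:
        modes.append("shear")
    degraded = dict(props)
    if "fiber" in modes:
        degraded["E1"] *= knockdown
    if "matrix" in modes or "shear" in modes:
        degraded["E2"] *= knockdown
        degraded["G12"] *= knockdown
    return modes, degraded

modes, props = check_ply(0.002, 0.005, 0.001,
                         {"E1": 140e9, "E2": 10e9, "G12": 5e9})
print(modes)  # only the transverse (matrix) strain exceeds its allowable
```

In the full method this check runs ply by ply at each nonlinear load step, with the degraded properties fed back into the stiffness assembly until final failure.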

  8. Methodology for statistical analysis of SENCAR mouse skin assay data.

    PubMed Central

    Stober, J A

    1986-01-01

    Various response measures and statistical methods appropriate for the analysis of data collected in the SENCAR mouse skin assay are examined. The characteristics of the tumor response data do not readily lend themselves to the classical methods for hypothesis testing. The advantages and limitations of conventional methods of analysis and methods recommended in the literature are discussed. Several alternative response measures that were developed specifically to answer the problems inherent in the data collected in the SENCAR bioassay system are described. These measures take into account animal survival, tumor multiplicity, and tumor regression. Statistical methods for the analysis of these measures to test for a positive dose response and a dose-response relationship are discussed. Sample data from representative initiation/promotion studies are used to compare the response measures and methods of analysis. PMID:3780632

  9. Analysis and Design of Fuselage Structures Including Residual Strength Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Knight, Norman F.

    1998-01-01

    The goal of this research project is to develop and assess methodologies for the design and analysis of fuselage structures accounting for residual strength. This research activity includes two primary objectives: development of a structural analysis methodology for predicting the residual strength of fuselage shell-type structures, and development of accurate, efficient analysis, design, and optimization tools for fuselage shell structures. Assessment of these tools for robustness, efficiency, and usability in a fuselage shell design environment will be integrated with these two primary research objectives.

  10. A Rayleigh-Ritz analysis methodology for cutouts in composite structures

    NASA Technical Reports Server (NTRS)

    Russell, Steven G.

    1991-01-01

    A new Rayleigh-Ritz stress analysis methodology that was developed for composite panels containing cutouts is described. The procedure, which makes use of a general assumed displacement field, accommodates circular and elliptical cutouts in biaxially loaded rectangular composite panels. Symmetric integral padups around the cutout can be included in the analysis. Benchmark results are presented to demonstrate the accuracy of the technique. Strength predictions based on the average stress criterion are generated and compared with experimental data. Finally, the stress analysis methodology is integrated into a design procedure for sizing integral padups around circular cutouts, and a sample problem is solved to illustrate its use.

  11. Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis

    NASA Technical Reports Server (NTRS)

    Babcock, P.; Schor, A.; Rosch, G.

    1998-01-01

    This document is an adjunct to the final report, An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.

  12. Breath Analysis in Disease Diagnosis: Methodological Considerations and Applications

    PubMed Central

    Lourenço, Célia; Turner, Claire

    2014-01-01

    Breath analysis is a promising field with great potential for non-invasive diagnosis of a number of disease states. The concentrations of volatile organic compounds (VOCs) in breath can be assessed with acceptable accuracy by analytical techniques offering high sensitivity, accuracy, precision, fast response time, and low detection limits, which are desirable characteristics for the detection of VOCs in human breath. “Breath fingerprinting”, indicative of a specific clinical status, relies on the use of multivariate statistical methods with powerful in-built algorithms. The lack of standardisation of sample collection and analysis is the main issue concerning breath analysis, blocking the introduction of breath tests into clinical practice. This review describes recent scientific developments in basic research and clinical applications, namely issues concerning sampling and biochemistry, highlighting the diagnostic potential of breath analysis for disease diagnosis. Several considerations that need to be taken into account in breath analysis are documented here, including the growing need for metabolomics to deal with breath profiles. PMID:24957037

  13. Survival analysis methodology to predict the shelf-life of probiotic flavored yogurt

    Microsoft Academic Search

    Adriano G. Cruz; Eduardo H. M. Walter; Rafael Silva Cadena; José A. F. Faria; Helena M. A. Bolini; Hidelte P. Pinheiro; Anderson S. Sant’Ana

    2010-01-01

    The feasibility of using survival analysis methodology to determine the shelf-life of probiotic strawberry-flavored yogurt supplemented with Bifidobacterium animalis DN 173010 was investigated. The quality parameters of the probiotic yogurts were related to the storage conditions to which they were subjected. Consumers were shown to be sensitive to the changes in sensory characteristics introduced into the products. Using the survival analysis and
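    The core of the survival approach is estimating an acceptance curve from consumer rejection data, with panelists who never rejected the product treated as right-censored. A minimal pure-Python Kaplan-Meier sketch, with entirely hypothetical storage times (days) and rejection flags, not data from the study:

    ```python
    # Minimal Kaplan-Meier sketch for shelf-life estimation from consumer
    # acceptance data. "rejected" marks non-censored observations; data are
    # hypothetical.

    def kaplan_meier(times, rejected):
        """Return (time, survival) pairs for the product acceptance curve."""
        survival, curve = 1.0, []
        events = sorted(zip(times, rejected))
        at_risk = len(events)
        for t, r in events:
            if r:                      # consumer rejected the product at time t
                survival *= (at_risk - 1) / at_risk
                curve.append((t, survival))
            at_risk -= 1               # censored observations leave the risk set
        return curve

    times    = [7, 14, 14, 21, 21, 28, 28, 35, 35, 42]
    rejected = [0,  0,  1,  1,  0,  1,  1,  1,  0,  1]

    curve = kaplan_meier(times, rejected)
    # Shelf-life estimate: first time acceptance drops to 50% or below
    shelf_life = next(t for t, s in curve if s <= 0.5)
    ```

    In practice a parametric model (e.g. Weibull) is often fitted instead of the raw estimator, and the shelf-life is read off at a chosen rejection probability.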

  14. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    ERIC Educational Resources Information Center

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  15. A methodology for the structural and functional analysis of signaling and regulatory networks

    Microsoft Academic Search

    Steffen Klamt; Julio Saez-rodriguez; Jonathan A. Lindquist; Luca Simeoni; Ernst Dieter Gilles

    2006-01-01

    Background: Structural analysis of cellular interaction networks contributes to a deeper understanding of network-wide interdependencies, causal relationships, and basic functional capabilities. While the structural analysis of metabolic networks is a well-established field, similar methodologies have been scarcely developed and applied to signaling and regulatory networks. Results: We propose formalisms and methods, relying on adapted and partially newly introduced approaches, which

  16. HACCP methodology implementation of meat pâté hazard analysis in pork butchery

    Microsoft Academic Search

    G. Poumeyrol; P. Rosset; V. Noel; E. Morelli

    2010-01-01

    This paper sets out a bacterial hazard analysis methodology, based on the ISO 22000 standard, which could be adopted by small food manufacturers. The paper provides a practical example: meat pâté prepared by pork butchers. The results of the hazard analysis showed that many bacterial hazards, particularly Listeria monocytogenes, Salmonella and Staphylococcus aureus could be effectively controlled by good hygiene

  17. SEM-based methodology for root cause analysis of wafer edge and bevel defects

    Microsoft Academic Search

    Ronnie Porat; Kfir Dotan; Shirley Hemar; Lior Levin; Ken Li; George Sung

    2008-01-01

    Monitoring defectivity of the wafer edge, bevel and apex - the areas beyond the pattern - is becoming increasingly important in the yield enhancement efforts of high-end fabs. In this paper we present a methodology for root cause analysis of edge and bevel defects, based on inline SEM review and EDX-based material analysis.

  18. A Fast TCAD-based Methodology for Variation Analysis of Emerging Nano-Devices

    E-print Network

    Candea, George

    Variability analysis of nanoscale transistors and circuits is emerging as a necessity at advanced technology nodes. Technology Computer Aided Design (TCAD) tools are powerful ways to get an accurate insight of Process

  19. Expert opinion in risk analysis: The NUREG-1150 methodology

    SciTech Connect

    Hora, S.C.; Iman, R.L.

    1988-01-01

    The Reactor Risk Reference Document (US Nuclear Regulatory Commission, 1987) is the most comprehensive study and application of probabilistic risk analysis and uncertainty analysis methods for nuclear power generation safety since the Reactor Safety Study (US Nuclear Regulatory Commission, 1975). Many of the issues addressed in PRA work such as NUREG-1150 involve phenomena that have not been studied through experiment or observation to an extent that makes possible a definitive analysis. In many instances, the rarity or severity of the phenomena make resolution impossible at this time. In these instances, the best available information resides with experts who have studied the phenomena in question. This paper is about a reasoned approach to the acquisition of expert opinion for use in PRA work and other public policy areas.

  20. Accident analysis of large-scale technological disasters applied to an anaesthetic complication.

    PubMed

    Eagle, C J; Davies, J M; Reason, J

    1992-02-01

    The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of failures, both of which must be sought. The first group, active failures, consists of mistakes made by practitioners in the provision of care. The second group, latent failures, represents flaws in the administrative and productive system. The model emphasizes the search for latent failures and shows that prevention of active failures alone is insufficient to avoid further accidents if latent failures persist unchanged. These key features and the utility of this model are illustrated by application to a case of aspiration of gastric contents. While four active failures were recognized, an equal number of latent failures also became apparent. The identification of both types of failures permitted the formulation of recommendations to avoid further occurrences. Thus this model of accident causation can provide a useful mechanism to investigate and possibly prevent anaesthetic accidents. PMID:1544192

  1. Analysis of 121 fatal passenger car-adult pedestrian accidents in China.

    PubMed

    Zhao, Hui; Yin, Zhiyong; Yang, Guangyu; Che, Xingping; Xie, Jingru; Huang, Wei; Wang, Zhengguo

    2014-10-01

    To study the characteristics of fatal vehicle-pedestrian accidents in China, a team was established, and passenger car-pedestrian crash cases occurring between 2006 and 2011 in Beijing and Chongqing, China were collected. A total of 121 fatal passenger car-adult pedestrian collisions were sampled and analyzed. The pedestrian injuries were scored according to the Abbreviated Injury Scale (AIS) and the Injury Severity Score (ISS). The demographic distributions of fatal pedestrian accidents differed from those of other pedestrian accidents. Among the victims, no significant discrepancy in the distribution of ISS and AIS in head, thorax, abdomen, and extremities by pedestrian age was found, while pedestrian behaviors prior to the crashes may affect the ISS. The distributions of AIS in head, thorax, and abdomen among the fatalities did not show any association with impact speeds or vehicle types, whereas there was a strong relationship between the ISS and impact speeds. Whether pedestrians died at the accident scene or not was not associated with the ISS or AIS. The present results may be useful not only for forensic experts but also for vehicle safety researchers. More investigations regarding fatal pedestrian accidents need to be conducted in greater detail. PMID:25287805
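    The ISS used in this scoring is computed from the AIS by a fixed rule: the sum of squares of the three highest AIS values taken from different body regions, with ISS set to 75 whenever any region carries AIS 6. A small sketch with hypothetical region scores:

    ```python
    # Injury Severity Score (ISS) from Abbreviated Injury Scale (AIS) values:
    # sum of squares of the three highest region scores; any AIS 6 forces 75.
    # The example scores below are hypothetical.

    def injury_severity_score(region_ais):
        """region_ais maps each ISS body region to its highest AIS (0-6)."""
        if any(a == 6 for a in region_ais.values()):
            return 75                  # maximal score by convention
        worst_three = sorted(region_ais.values(), reverse=True)[:3]
        return sum(a * a for a in worst_three)

    pedestrian = {"head": 4, "thorax": 3, "abdomen": 2, "extremities": 3,
                  "face": 1, "external": 0}
    print(injury_severity_score(pedestrian))  # 4^2 + 3^2 + 3^2 = 34
    ```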

  2. Landscape Equivalency Analysis: Methodology for Estimating Spatially Explicit Biodiversity Credits

    E-print Network

    Lupi, Frank

    Keywords: metapopulation genetic theory; Habitat Equivalency Analysis; conservation banking; sprawl. Conservation banking is a means to manage this balance, and we argue for its use to mitigate the effects conservation banking credits so that habitat trades do not exacerbate regional ecological effects of local

  3. Multifractal analysis and α-stable processes: a methodological contribution

    Microsoft Academic Search

    Pierre Chainais; Patrice Abry; Darryl Veitch

    2000-01-01

    This work is a contribution to the analysis of the procedure, based on wavelet coefficient partition functions, commonly used to estimate the Legendre multifractal spectrum. The procedure is applied to two examples, a fractional Brownian motion in multifractal time and a self-similar α-stable process, whose sample paths exhibit irregularities that by eye appear very close. We observe that, for the

  4. Social Judgment Analysis: Methodology for Improving Interpersonal Communication and Understanding.

    ERIC Educational Resources Information Center

    Rohrbaugh, John; Harmon, Joel

    Research has found the Social Judgment Analysis (SJA) approach, with its focus on judgment policy and cognitive feedback, to be a significant factor in developing group member agreement and improving member performance. A controlled experiment was designed to assess the relative quality of the judgment making process provided by SJA.…

  5. A fault injection analysis of Virtex FPGA TMR design methodology

    Microsoft Academic Search

    F. Lima; C. Carmichaell; J. Fabula; R Padovanil; R. Reis

    2001-01-01

    This paper presents the meaningful results of a single bit upset fault injection analysis performed in a Virtex FPGA triple modular redundancy (TMR) design. Each programmable bit upset able to cause an error in the TMR design has been investigated. Final conclusion using the TMR
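    The masking principle such a TMR fault injection campaign exercises can be shown in a toy behavioral sketch: a single-event upset in one of three redundant copies is outvoted by the other two. This is an illustration of the voting logic only, not the FPGA design flow used in the paper.

    ```python
    # Behavioral sketch of TMR masking a single-event upset (SEU).

    def majority(a, b, c):
        """Bitwise majority voter, as in a TMR design."""
        return (a & b) | (a & c) | (b & c)

    def inject_seu(word, bit):
        """Flip a single bit to model a single-event upset."""
        return word ^ (1 << bit)

    value = 0b1011_0010
    copies = [value, value, value]
    copies[1] = inject_seu(copies[1], 4)   # upset one redundant copy

    voted = majority(*copies)
    print(voted == value)  # True: the upset is masked by the voter
    ```

    A fault injection analysis sweeps the injected bit over every configuration bit and flags those upsets that defeat the voter, e.g. upsets in the voting logic itself or in shared routing.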

  6. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models.

  7. The effects of aircraft certification rules on general aviation accidents

    NASA Astrophysics Data System (ADS)

    Anderson, Carolina Lenz

    The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed methods methodology which involves both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. 
The Chi-Square test indicated that there was no significant difference in the number of accidents among the different certification categories when either Controlled Flight into Terrain or Structural Failure was listed as the cause. However, there was a significant difference in the frequency of accidents with regard to Loss of Control and Engine Failure accidents. The results of the ANCOVA test indicated that there was no significant difference in the accident rate with regard to Loss of Control, Controlled Flight into Terrain, or Structural Failure accidents. There was, however, a significant difference in Engine Failure accidents between Experimental-Amateur Built and the other categories. The text mining analysis of the narrative causes of Loss of Control accidents indicated that only the Civil Air Regulations 3 category airplanes had clusters of words associated with visual flight into instrument meteorological conditions. Civil Air Regulations 3 airplanes were designed and manufactured prior to the 1960s and in most cases have not been retrofitted to take advantage of newer technologies that could help prevent Loss of Control accidents. The study indicated that general aviation aircraft certification rules do not have a statistically significant effect on aircraft accidents except for Loss of Control and Engine Failure. According to the literature, government oversight could have become an obstacle to the implementation of safety-enhancing equipment that could reduce Loss of Control accidents. Oversight should focus on ensuring that Experimental-Amateur Built aircraft owners perform a functional test that could prevent some of the Engine Failure accidents.
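    The Chi-Square test of homogeneity used here can be sketched in a few lines. The accident counts below are hypothetical, not the study's data; 7.815 is the standard 0.05 critical value for 3 degrees of freedom (a 2x4 table).

    ```python
    # Pure-Python Chi-Square test of homogeneity for accident counts across
    # certification categories. The counts are hypothetical.

    def chi_square_statistic(table):
        """table: rows = accident cause present/absent, cols = categories."""
        row_totals = [sum(row) for row in table]
        col_totals = [sum(col) for col in zip(*table)]
        grand = sum(row_totals)
        stat = 0.0
        for i, row in enumerate(table):
            for j, observed in enumerate(row):
                expected = row_totals[i] * col_totals[j] / grand
                stat += (observed - expected) ** 2 / expected
        return stat

    # Hypothetical Loss of Control accidents vs. all other accidents, for
    # four certification categories (Part 23, CAR 3, LSA, E-AB).
    table = [[120, 95, 30, 140],
             [300, 310, 60, 220]]

    stat = chi_square_statistic(table)
    print(stat > 7.815)  # True: reject homogeneity at alpha = 0.05
    ```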

  8. Analysis of main steam isolation valve leakage in design basis accidents using MELCOR 1.8.6 and RADTRAD.

    SciTech Connect

    Salay, Michael (United States Nuclear Regulatory Commission, Washington, D.C.); Kalinich, Donald A.; Gauntt, Randall O.; Radel, Tracy E.

    2008-10-01

    Analyses were performed using MELCOR and RADTRAD to investigate main steam isolation valve (MSIV) leakage behavior under design basis accident (DBA) loss-of-coolant accident (LOCA) conditions that are presumed to have led to a significant core melt accident. Doses to the control room, site boundary, and LPZ are examined using both the approaches described in current regulatory guidelines and analyses based on a best-estimate source term and system response. At issue is the current practice of using containment airborne aerosol concentrations as a surrogate for the in-vessel aerosol concentration that exists in the near vicinity of the MSIVs. This study finds that the current practice of using AST-based containment aerosol concentrations for assessing MSIV leakage is non-conservative and conceptually in error. A methodology is proposed that scales the containment aerosol concentration to the expected vessel concentration in order to preserve the simplified use of the AST in assessing containment performance under assumed DBA conditions. This correction is required during the first two hours of the accident, while the gap and early in-vessel source terms are present. It is general practice to assume that at approximately two hours, recovery actions to reflood the core will have been successful and that further core damage can be avoided. The analyses performed in this study determine that, after two hours, assuming vessel reflooding has taken place, the containment aerosol concentration can then conservatively be used as the effective source to the leaking MSIVs. Recommendations are provided concerning typical aerosol removal coefficients that can be used in the RADTRAD code to predict source attenuation in the steam lines, and on robust methods of predicting MSIV leakage flows based on measured MSIV leakage performance.

  9. Methodology for Sensitivity Analysis, Approximate Analysis, and Design Optimization in CFD for Multidisciplinary Applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1996-01-01

    An incremental iterative formulation, together with the well-known spatially split approximate-factorization algorithm, is presented for solving the large, sparse systems of linear equations that are associated with aerodynamic sensitivity analysis. This formulation is also known as the 'delta' or 'correction' form. For the smaller two-dimensional problems, a direct method can be applied to solve these linear equations in either the standard or the incremental form, in which case the two are equivalent. However, iterative methods are needed for larger two-dimensional and three-dimensional applications because direct methods require more computer memory than is currently available. Iterative methods for solving these equations in the standard form are generally unsatisfactory due to an ill-conditioned coefficient matrix; this problem is overcome when these equations are cast in the incremental form. The methodology is successfully implemented and tested using an upwind cell-centered finite-volume formulation applied in two dimensions to the thin-layer Navier-Stokes equations for external flow over an airfoil. In three dimensions this methodology is demonstrated with a marching-solution algorithm for the Euler equations to calculate supersonic flow over the High-Speed Civil Transport configuration (HSCT 24E). The sensitivity derivatives obtained with the incremental iterative method from a marching Euler code are used in a design-improvement study of the HSCT configuration that involves thickness, camber, and planform design variables.
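    The incremental ("delta" or "correction") form amounts to repeatedly solving M dx = b - A x with an approximate operator M and updating x += dx, so the iteration works on the residual rather than on the ill-conditioned standard form directly. A toy sketch on a 2x2 system, using the diagonal of A as the approximate operator (a Jacobi-like stand-in for the approximate factorization in the paper):

    ```python
    # Defect-correction ("delta" form) iteration: solve M dx = b - A x,
    # then update x += dx. The small system here is illustrative only.

    def solve_incremental(A, b, M_inverse_apply, iterations=50):
        """Iterate x += M^{-1} (b - A x) with an approximate solver M ~ A."""
        n = len(b)
        x = [0.0] * n
        for _ in range(iterations):
            residual = [b[i] - sum(A[i][j] * x[j] for j in range(n))
                        for i in range(n)]
            dx = M_inverse_apply(residual)
            x = [x[i] + dx[i] for i in range(n)]
        return x

    # Diagonally dominant test system; the diagonal of A serves as M,
    # which converges for this matrix. Exact solution: x = (1/11, 7/11).
    A = [[4.0, 1.0], [1.0, 3.0]]
    b = [1.0, 2.0]
    jacobi = lambda r: [r[0] / A[0][0], r[1] / A[1][1]]

    x = solve_incremental(A, b, jacobi)
    ```

    The key property, as in the abstract, is that only the application of M^{-1} need be cheap and approximate; the converged x still satisfies the exact equations because the update is driven by the true residual.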

  10. Probabilistic design analysis using Composite Loads Spectra (CLS) coupled with Probabilistic Structural Analysis Methodologies (PSAM)

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Rajagopal, K. R.; Ho, H.

    1989-01-01

    Composite loads spectra (CLS) were applied to generate probabilistic loads for use in the PSAM nonlinear evaluation of stochastic structures under stress (NESSUS) finite element code. The CLS approach allows for quantifying loads as mean values and distributions around a central value rather than maximum or enveloped values typically used in deterministic analysis. NESSUS uses these loads to determine mean and perturbation responses. These results are probabilistically evaluated with the distributional information from CLS using a fast probabilistic integration (FPI) technique to define response distributions. The main example discussed describes a method of obtaining load descriptions and stress response of the second-stage turbine blade of the Space Shuttle Main Engine (SSME) high-pressure fuel turbopump (HPFTP). Additional information is presented on the on-going analysis of the high pressure oxidizer turbopump discharge duct (HPOTP) where probabilistic dynamic loads have been generated and are in the process of being used for dynamic analysis. Example comparisons of load analysis and engine data are furnished for partial verification and/or justification for the methodology.

  11. The failure analysis of composite material flight helmets as an aid in aircraft accident investigation.

    PubMed

    Caine, Y G; Bain-Ungerson, O; Schochat, I; Marom, G

    1991-06-01

    Understanding why a flying helmet fails to maintain its integrity during an accident can contribute to an understanding of the mechanism of injury and even of the accident itself. We performed a post-accident evaluation of failure modes in glass and aramid fibre-reinforced composite helmets. Optical and microscopic (SEM) techniques were employed to identify specific fracture mechanisms. They were correlated with the failure mode. Stress and energy levels were estimated from the damage extent. Damage could be resolved into distinct impact, flexure and compression components. Delamination was identified as a specific mode, dependent upon the matrix material and bonding between the layers. From the energy dissipated in specific fracture mechanisms we calculated the minimum total energy imparted to the helmet-head combination and the major injury vector (MIV) direction and magnitude. The level of protection provided by the helmet can also be estimated. PMID:1859350

  12. A Comprehensive Analysis of the X-15 Flight 3-65 Accident

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Orr, Jeb S.; Barshi, Immanuel; Statler, Irving C.

    2014-01-01

    The November 15, 1967, loss of X-15 Flight 3-65-97 (hereafter referred to as Flight 3-65) was a unique incident in that it was the first and only aerospace flight accident involving loss of crew on a vehicle with an adaptive flight control system (AFCS). In addition, Flight 3-65 remains the only instance of a single-pilot departure from controlled flight of a manned entry vehicle in a hypersonic flight regime. To mitigate risk to emerging aerospace systems, the NASA Engineering and Safety Center (NESC) proposed a comprehensive review of this accident. The goal of the assessment was to resolve lingering questions regarding the failure modes of the aircraft systems (including the AFCS) and thoroughly analyze the interactions among the human agents and autonomous systems that contributed to the loss of the pilot and aircraft. This document contains the outcome of the accident review.

  13. Statistical theory and methodology for remote sensing data analysis

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1974-01-01

    A model is developed for the evaluation of acreages (proportions) of different crop-types over a geographical area using a classification approach and methods for estimating the crop acreages are given. In estimating the acreages of a specific croptype such as wheat, it is suggested to treat the problem as a two-crop problem: wheat vs. nonwheat, since this simplifies the estimation problem considerably. The error analysis and the sample size problem is investigated for the two-crop approach. Certain numerical results for sample sizes are given for a JSC-ERTS-1 data example on wheat identification performance in Hill County, Montana and Burke County, North Dakota. Lastly, for a large area crop acreages inventory a sampling scheme is suggested for acquiring sample data and the problem of crop acreage estimation and the error analysis is discussed.

  14. Improved finite element methodology for integrated thermal structural analysis

    NASA Technical Reports Server (NTRS)

    Dechaumphai, P.; Thornton, E. A.

    1982-01-01

    An integrated thermal-structural finite element approach for efficient coupling of thermal and structural analysis is presented. New thermal finite elements which yield exact nodal and element temperatures for one dimensional linear steady state heat transfer problems are developed. A nodeless variable formulation is used to establish improved thermal finite elements for one dimensional nonlinear transient and two dimensional linear transient heat transfer problems. The thermal finite elements provide detailed temperature distributions without using additional element nodes and permit a common discretization with lower order congruent structural finite elements. The accuracy of the integrated approach is evaluated by comparisons with analytical solutions and conventional finite element thermal structural analyses for a number of academic and more realistic problems. Results indicate that the approach provides a significant improvement in the accuracy and efficiency of thermal stress analysis for structures with complex temperature distributions.

  15. Thermoluminescence Dose Response: Experimental Methodology, Data Analysis, Theoretical Interpretation

    NASA Astrophysics Data System (ADS)

    Horowitz, Yigal S.; Datz, Hanan

    2011-05-01

    The parameters, Dth, Dc, Dm, f(D) and f(D)max., describing the characteristics of TL dose response are defined and a short survey of the literature concerning the dose response of the major TL glow peaks in LiF:Mg,Ti (TLD-100), peaks 5, 7 and 8 is presented. The experimental parameters and details of the analysis affecting the dose response are outlined. The importance of theoretical interpretation of the dose response in the determination of the dose response parameters is demonstrated and an in-depth introduction to the Unified Interaction Model is described. The dose response as a function of photon energy is analysed for peaks 5, 7 and 8 and the impact of the method of data analysis on the description of f(D) and especially the determination of Dc is emphasized.

  16. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    SciTech Connect

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical specifications related setpoints and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970's to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. The descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes the discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR.

  17. Experiments on natural circulation during PWR severe accidents and their analysis

    SciTech Connect

    Sehgal, B.R.; Stewart, W.A.; Sha, W.T.

    1988-01-01

    Buoyancy-induced natural circulation flows will occur during the early part of PWR high-pressure accident scenarios. These flows affect several key parameters; in particular, the course of such accidents will most probably change due to local failures occurring in the primary coolant system (PCS) before substantial core degradation. Natural circulation flow patterns were measured in a one-seventh scale PWR PCS facility at the Westinghouse R&D laboratories. The measured flow and temperature distributions are reported in this paper. The experiments were analyzed with the COMMIX code, and good agreement was obtained between data and calculations. 10 refs., 8 figs., 2 tabs.

  18. Survey of systems safety analysis methods and their application to nuclear waste management systems

    Microsoft Academic Search

    P. J. Pelto; W. K. Winegardner; R. H. V. Gallucci

    1981-01-01

    This report reviews system safety analysis methods and examines their application to nuclear waste management systems. The safety analysis methods examined include expert opinion, the maximum credible accident approach, the design basis accident approach, hazard indices, preliminary hazards analysis, failure modes and effects analysis, fault trees, event trees, cause-consequence diagrams, the GO methodology, Markov modeling, and a general category of consequence analysis.

  19. Bicycle accidents.

    PubMed

    Lind, M G; Wollin, S

    1986-01-01

    Information concerning 520 bicycle accidents and their victims was obtained from medical records and the victims' replies to questionnaires. The analyzed aspects included risk of injury, completeness of accident registrations by police and in hospitals, types of injuries and influence of the cyclists' age and sex, alcohol, fatigue, hunger, haste, physical disability, purpose of cycling, wearing of protective helmet and other clothing, type and quality of road surface, site of accident (road junctions, separate cycle paths, etc.) and turning manoeuvres. PMID:3461642

  20. Occupational accidents aboard merchant ships

    PubMed Central

    Hansen, H; Nielsen, D; Frydenberg, M

    2002-01-01

    Objectives: To investigate the frequency, circumstances, and causes of occupational accidents aboard merchant ships in international trade, and to identify risk factors for the occurrence of occupational accidents as well as dangerous working situations where possible preventive measures may be initiated. Methods: The study is a historical follow-up of occupational accidents among crew aboard Danish merchant ships in the period 1993–7. Data were extracted from the Danish Maritime Authority and from insurance data. Exact data on time at risk were available. Results: A total of 1993 accidents were identified during a total of 31 140 years at sea. Among these, 209 accidents resulted in permanent disability of 5% or more, and 27 were fatal. The mean risk of having an occupational accident was 6.4/100 years at sea, and the risk of an accident causing a permanent disability of 5% or more was 0.67/100 years aboard. Relative risks for notified accidents and accidents causing permanent disability of 5% or more were calculated in a multivariate analysis including ship type, occupation, age, time on board, change of ship since the last employment period, and nationality. Foreigners had a considerably lower recorded rate of accidents than Danish citizens. Age was a major risk factor for accidents causing permanent disability. Change of ship and the first period aboard a particular ship were identified as risk factors. Walking from one place to another aboard the ship caused serious accidents. The most serious accidents happened on deck. Conclusions: It was possible to clearly identify work situations and specific risk factors for accidents aboard merchant ships. Most accidents happened while performing daily routine duties. Preventive measures should focus on workplace instructions for all important functions aboard and also on the prevention of accidents caused by walking around aboard the ship. PMID:11850550
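    The incidence rates quoted above are simple events-per-exposure ratios. A minimal sketch, using only the counts reported in the abstract (function and variable names are ours), reproduces them:

    ```python
    def rate_per_100_years(events: int, years_at_risk: float) -> float:
        """Incidence rate expressed per 100 person-years at sea."""
        return 100.0 * events / years_at_risk

    # Counts reported in the abstract: 1993 notified accidents and 209 accidents
    # causing permanent disability of 5% or more, over 31,140 years at sea.
    overall = rate_per_100_years(1993, 31140)    # ~6.4 per 100 years at sea
    disabling = rate_per_100_years(209, 31140)   # ~0.67 per 100 years aboard
    print(round(overall, 1), round(disabling, 2))
    ```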

  1. Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes

    SciTech Connect

    Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

    1993-12-01

    As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked by the Advanced Neutron Source dynamic model (ANSDM) and PRSDYN models. RELAP5 is a one-dimensional, two-phase transient code, developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations were included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant-accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

  2. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.

    PubMed

    Goldhaber, M K; Staub, S L; Tokuhata, G K

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357
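    The life-table (actuarial) method used above estimates cumulative incidence from interval-specific event counts, with withdrawals conventionally counted as at risk for half an interval. A minimal sketch on hypothetical interval data (not the TMI registry data):

    ```python
    def cumulative_incidence(intervals):
        """Actuarial life-table estimate of cumulative incidence.

        Each interval is (n_at_start, events, withdrawals); withdrawals are
        assumed at risk for half the interval (the actuarial adjustment).
        """
        survival = 1.0
        for n_start, events, withdrawals in intervals:
            n_effective = n_start - withdrawals / 2.0
            q = events / n_effective        # conditional probability of the event
            survival *= 1.0 - q
        return 1.0 - survival

    # Hypothetical gestational-week intervals: (pregnancies entering, losses, withdrawn)
    intervals = [(100, 4, 10), (86, 3, 6), (77, 2, 5)]
    print(round(cumulative_incidence(intervals), 3))
    ```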

  3. Analysis of the TMI2 source range monitor during the TMI (Three Mile Island) accident

    Microsoft Academic Search

    Horng-Yu Wu; A. J. Baratta; Ming-Yuan Hsiao; B. R. Bandini

    1987-01-01

    The source range monitor (SRM) data recorded during the first 4 hours of the Three Mile Island Unit No. 2 (TMI-2) accident following reactor shutdown were analyzed. An effort to simulate the actual SRM response was made by performing a series of neutron transport calculations. Primary emphasis was placed on simulating the changes in SRM response to various system events

  4. An Analysis of Incident/Accident Reports from the Texas Secondary School Science Safety Survey, 2001

    ERIC Educational Resources Information Center

    Stephenson, Amanda L.; West, Sandra S.; Westerlund, Julie F.; Nelson, Nancy C.

    2003-01-01

    This study investigated safety in Texas secondary school science laboratory, classroom, and field settings. The Texas Education Agency (TEA) drew a random representative sample consisting of 199 secondary public schools in Texas. Eighty-one teachers completed Incident/Accident Reports. The reports were optional, anonymous, and open-ended; thus,…

  5. Traffic accident in Cuiabá-MT: an analysis through the data mining technology.

    PubMed

    Galvão, Noemi Dreyer; de Fátima Marin, Heimar

    2010-01-01

    Road traffic accidents (ATT) are non-intentional events of considerable magnitude worldwide, mainly in urban centers. This article analyzes data on ATT victims recorded by the Justice Secretariat and Public Security (SEJUSP) and in hospital morbidity and mortality records for the city of Cuiabá-MT during 2006, using data mining technology. An observational, retrospective and exploratory study of the secondary databases was carried out. The three selected databases were linked using the probabilistic method implemented in the free software RecLink, yielding 139 matched pairs of ATT victims. Data mining was then applied to this linked database with the WEKA software, using the Apriori algorithm. The run generated the ten best rules, six of which met the established parameters and provided useful, comprehensible knowledge for characterizing accident victims in Cuiabá. Finally, the association rules revealed peculiarities of the road traffic accident victims in Cuiabá and highlight the need for preventive measures against collision accidents among males. PMID:20841739
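    The Apriori algorithm applied in WEKA above mines frequent itemsets and derives association rules (antecedent → consequent with a confidence score) from them. A minimal pure-Python sketch on hypothetical victim-profile transactions (the SEJUSP records themselves are not public) illustrates the idea:

    ```python
    from itertools import combinations

    def apriori(transactions, min_support):
        """Return frequent itemsets (frozensets) with support >= min_support."""
        n = len(transactions)
        current = {frozenset([i]) for t in transactions for i in t}
        freq = {}
        while current:
            counts = {c: sum(1 for t in transactions if c <= t) for c in current}
            level = {c: k / n for c, k in counts.items() if k / n >= min_support}
            freq.update(level)
            # candidate generation: join frequent k-itemsets into (k+1)-itemsets
            keys = list(level)
            current = {a | b for a, b in combinations(keys, 2) if len(a | b) == len(a) + 1}
        return freq

    def rules(freq, min_confidence):
        """Derive rules A -> B with confidence = support(A u B) / support(A)."""
        out = []
        for itemset, sup in freq.items():
            if len(itemset) < 2:
                continue
            for r in range(1, len(itemset)):
                for antecedent in map(frozenset, combinations(itemset, r)):
                    conf = sup / freq[antecedent]
                    if conf >= min_confidence:
                        out.append((antecedent, itemset - antecedent, conf))
        return out

    # Hypothetical victim-profile transactions (attribute items, illustrative only).
    data = [{"male", "collision", "injured"},
            {"male", "collision", "injured"},
            {"female", "fall", "injured"},
            {"male", "collision"}]
    freq = apriori([frozenset(t) for t in data], min_support=0.5)
    for a, b, c in rules(freq, min_confidence=0.9):
        print(set(a), "->", set(b), round(c, 2))
    ```

    Because support is anti-monotone, every subset of a frequent itemset is itself frequent, which is what lets `rules` look up `freq[antecedent]` directly.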

  6. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis

    SciTech Connect

    Goldhaber, M.K.; Staub, S.L.; Tokuhata, G.K.

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss.

  7. Independent assessment of MELCOR as a severe accident thermal-hydraulic\\/source term analysis tool

    Microsoft Academic Search

    I. K. Madni; F. Eltawila

    1994-01-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called ``MELCOR Verification, Benchmarking, and Applications,`` whose aim is to

  8. A Discrepancy-Based Methodology for Nuclear Training Program Evaluation.

    ERIC Educational Resources Information Center

    Cantor, Jeffrey A.

    1991-01-01

    A three-phase comprehensive process for commercial nuclear power training program evaluation is presented. The discrepancy-based methodology was developed after the Three Mile Island nuclear reactor accident. It facilitates analysis of program components to identify discrepancies among program specifications, actual outcomes, and industry…

  9. Anthropological analysis of taekwondo--new methodological approach.

    PubMed

    Cular, Drazen; Munivrana, Goran; Katić, Ratko

    2013-05-01

    The aim of this research is to determine the order and importance of the impacts of particular anthropological characteristics and technical and tactical competence on success in taekwondo, according to the opinions of top taekwondo instructors (experts). Partial objectives include analysis of the metric characteristics of the measuring instrument and determination of differences between two disciplines (sparring and the technical discipline of patterns) and two competition systems (WTF and ITF). In accordance with these aims, the research was conducted on a sample of 730 taekwondo instructors from 6 continents and 69 countries (from which 242 instructors were selected), at different success levels in both taekwondo competition systems (styles) and both taekwondo disciplines. The respondents were divided into 3 qualitative subsamples (OST-USP-VRH) using the dependent variable of the instructor's accomplished results. In 6 languages, they electronically evaluated the percentage impact (%) of motor and functional skills (MOTFS), morphological characteristics (MORF), the psychological profile of an athlete (PSIH), athletic intelligence (INTE), and technical and tactical competence (TE-TA) on success in taekwondo. The analysis of the metric characteristics of the constructed instrument showed a satisfactory degree of agreement (IHr), which is proportional to respondent quality, i.e. it grows with instructor quality in all analysed disciplines of both systems. Top instructors assigned the highest share of impact on success to the motor and functional skills (MOTFS) variable: WTF-SPB=29.1, ITF-SPB=29.2, WTF-THN=35.0, ITF-THN=32.0. Statistically significant differences in the opinions of instructors of different styles and disciplines were not recorded for any of the analysed variables. The only exception is the psychological profile of an athlete variable, which WTF sparring instructors (AM=23.7%), at a significance level of p<0.01, evaluate as having a statistically significantly higher impact on success in taekwondo than WTF instructors of the technical discipline of patterns (15.4%). PMID:23914483

  10. Assessment of methodologies for airborne BaP analysis.

    PubMed

    Piñeiro-Iglesias, M; Grueiro-Noche, G; López-Mahía, P; Muniategui-Lorenzo, S; Prada-Rodríguez, D

    2004-12-01

    Very sensitive analytical methods will be required to assess airborne contaminants with the implementation of new EC Directives. In this work, Soxhlet, ultrasonic and microwave-assisted extraction (MAE) were applied to two airborne standard reference materials (SRM) 1648 "Urban Particulate Matter" and SRM 1649a "Urban Dust". All three techniques afforded satisfactory results, but MAE was preferred due to its low solvent requirement and speed of analysis. In addition, high performance liquid chromatography (HPLC) with ultraviolet and fluorescence detection was compared to gas chromatography (GC)-flame ionisation detection (FID) with programmed split-splitless injection (PSS) and GC-mass spectrometry (MS) with programmed temperature vaporiser (PTV) injection. The HPLC method proved far more sensitive than the GC techniques (four and three orders of magnitude, respectively). Real atmospheric particulate samples were taken at A Coruña (Spain). Different sampling devices were used to collect PM(10), PM(2.5), PM(1), cascade impactor and TSP/gas phase samples. BaP concentrations quantified in the samples are similar to those reported for other cities. PMID:15504523

  11. Landscape equivalency analysis: methodology for estimating spatially explicit biodiversity credits.

    PubMed

    Bruggeman, Douglas J; Jones, Michael L; Lupi, Frank; Scribner, Kim T

    2005-10-01

    We propose a biodiversity credit system for trading endangered species habitat designed to minimize and reverse the negative effects of habitat loss and fragmentation, the leading cause of species endangerment in the United States. Given the increasing demand for land, approaches that explicitly balance economic goals against conservation goals are required. The Endangered Species Act balances these conflicts based on the cost to replace habitat. Conservation banking is a means to manage this balance, and we argue for its use to mitigate the effects of habitat fragmentation. Mitigating the effects of land development on biodiversity requires decisions that recognize regional ecological effects resulting from local economic decisions. We propose Landscape Equivalency Analysis (LEA), a landscape-scale approach similar to HEA, as an accounting system to calculate conservation banking credits so that habitat trades do not exacerbate regional ecological effects of local decisions. Credits purchased by public agencies or NGOs for purposes other than mitigating a take create a net investment in natural capital leading to habitat defragmentation. Credits calculated by LEA use metapopulation genetic theory to estimate sustainability criteria against which all trades are judged. The approach is rooted in well-accepted ecological, evolutionary, and economic theory, which helps compensate for the degree of uncertainty regarding the effects of habitat loss and fragmentation on endangered species. LEA requires application of greater scientific rigor than typically applied to endangered species management on private lands but provides an objective, conceptually sound basis for achieving the often conflicting goals of economic efficiency and long-term ecological sustainability. PMID:16132443

  12. Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis

    PubMed Central

    Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-01-01

    Background: Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective: The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design: A methodological research design was used, and an EFA was performed. Methods: Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results: Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation: Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions: To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association of these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. PMID:24786942
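    The varimax step used after principal axis factoring rotates the loading matrix to maximize the variance of the squared loadings per factor, which pushes each item toward loading on a single factor. A minimal numpy sketch of Kaiser's iterative varimax algorithm, on illustrative toy loadings rather than the study's data:

    ```python
    import numpy as np

    def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-8):
        """Orthogonal varimax rotation of a (variables x factors) loading matrix."""
        L = np.asarray(loadings, dtype=float)
        p, k = L.shape
        R = np.eye(k)          # accumulated orthogonal rotation
        d = 0.0
        for _ in range(max_iter):
            Lr = L @ R
            # gradient of the varimax criterion (Kaiser's formulation)
            u, s, vt = np.linalg.svd(
                L.T @ (Lr**3 - (gamma / p) * Lr @ np.diag((Lr**2).sum(axis=0))))
            R = u @ vt
            d_new = s.sum()
            if d_new < d * (1 + tol):
                break
            d = d_new
        return L @ R, R

    # Toy two-factor structure: items 1-3 load on factor 1, items 4-6 on factor 2,
    # deliberately mixed by a rotation so the simple structure is hidden.
    simple = np.array([[.8, 0], [.7, 0], [.6, 0], [0, .8], [0, .7], [0, .6]])
    theta = np.pi / 6
    mix = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
    rotated, R = varimax(simple @ mix)
    print(np.round(np.abs(rotated), 2))  # recovers the simple structure (up to sign/order)
    ```

    Because the rotation is orthogonal, the total sum of squared loadings (the common variance) is preserved; only its allocation across factors changes.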

  13. Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

    2014-05-01

    Among the various radioactive nuclides emitted during the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident, Iodine-131 displayed high radioactivity just after the accident. Moreover, if taken into the human body, Iodine-131 concentrates in the thyroid and may cause thyroid cancer. Recognition of the risk posed by the Iodine-131 dose originated from the experience of the Chernobyl accident, based on epidemiological study [1]. It is thus important to investigate the detailed deposition distribution of I-131 in order to evaluate the radiation dose due to I-131 and to monitor effects on human health. However, I-131 decays so rapidly (half-life = 8.02 d) that it cannot be detected several months after the accident. By the time the risk of I-131 was recognized after Chernobyl, several years had passed since the accident. Reconstruction of the I-131 distribution from the Cs-137 distribution was not successful because iodine and cesium behave differently owing to their different chemical properties. The long-lived radioactive isotope I-129 (half-life = 1.57E+7 yr), which is a fission product like I-131, is an ideal proxy for I-131 because the two are chemically identical. Several studies tried to quantify I-129 in the 1990s, but the analytical techniques, especially AMS (Accelerator Mass Spectrometry), were not yet well developed, and few AMS facilities were available. Moreover, because sufficient I-131 data from just after the accident were lacking, the isotopic ratio I-129/I-131 of Chernobyl-derived iodine could not be estimated precisely [2]; calculated estimates of the ratio were scattered. For the FDNPP accident, in contrast, the detailed I-131 distribution is being successfully reconstructed through systematic I-129 measurements by our group. 
    We measured soil samples selected from a collection taken from every 2 km (or 5 km, in more distant areas) meshed region around FDNPP by the Japanese Ministry of Science and Education in June 2011. So far more than 500 samples have been measured, and the I-129 deposition amounts were determined by AMS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. The measurement error from AMS is less than 5%, typically 3%. The overall uncertainty is estimated at less than 30%, including the uncertainties from the nominal value of the standard reference material used, from the I-129/I-131 ratio estimation, and from the "representativeness" of the analyzed sample for its region. The isotopic ratio I-129/I-131 from the reactor was estimated [3] to be 22.3 ± 6.3 as of March 11, 2011, from a series of samples collected by a group from The University of Tokyo on April 20, 2011, for which I-131 had been determined by gamma-ray spectrometry with good precision. Complementarily, we investigated the depth profile of the accident-derived I-129 in soil and its migration speed after deposition, and found that more than 90% of the I-129 was concentrated within the top 5 cm layer and that the downward migration speed was less than 1 cm/yr [4]. From the set of I-129 data, the corresponding I-131 values were calculated, and a distribution map is being constructed. Various fine structures of the distribution have come into view. [1] Y. Nikiforov and D. R. Gnepp, 1994, Cancer, Vol. 47, pp748-766. [2] T. Straume, et al., 1996, Health Physics, Vol. 71, pp733-740. [3] Y. Miyake, H. Matsuzaki et al., 2012, Geochem. J., Vol. 46, pp327-333. [4] M. Honda, H. Matsuzaki et al., under submission.
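    Since I-129 and I-131 leave the reactor in a fixed isotopic ratio and I-131 then decays with its 8.02-day half-life, each measured I-129 amount converts directly to an I-131 value. A minimal sketch of that conversion, assuming the 22.3 figure is an atom ratio (names and the example quantity are ours):

    ```python
    import math

    T_HALF_I131_DAYS = 8.02    # I-131 half-life (d)
    RATIO_129_131 = 22.3       # I-129/I-131 isotopic ratio as of 2011-03-11 [3]

    def i131_atoms_at_shutdown(n129_atoms: float) -> float:
        """I-131 atoms corresponding to a measured I-129 deposition (atoms).

        I-129's 1.57e7-yr half-life makes its own decay negligible here.
        """
        return n129_atoms / RATIO_129_131

    def i131_atoms_after(n129_atoms: float, days_after: float) -> float:
        """I-131 atoms remaining `days_after` March 11, 2011."""
        return i131_atoms_at_shutdown(n129_atoms) * 2.0 ** (-days_after / T_HALF_I131_DAYS)

    n131_0 = i131_atoms_at_shutdown(1.0e12)   # hypothetical measured I-129 amount
    print(n131_0, i131_atoms_after(1.0e12, 30.0))
    ```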

  14. Nuclear accidents

    NSDL National Science Digital Library

    Iowa Public Television. Explore More Project

    2004-01-01

    Accidents at nuclear power plants can be especially devastating to people and the environment. This article, part of a series about the future of energy, introduces students to nuclear accidents at Chernobyl, Three Mile Island, and Tokaimura. Students explore the incidents by examining possible causes, environmental impacts, and effects on life.

  15. Nuclear accidents

    Microsoft Academic Search

    Mobley

    1982-01-01

    A nuclear accident with radioactive contamination can happen anywhere in the world. Because expert nuclear emergency teams may take several hours to arrive at the scene, local authorities must have a plan of action for the hours immediately following an accident. The site should be left untouched except to remove casualties. Treatment of victims includes decontamination and meticulous wound debridement.

  16. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    PubMed Central

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  17. Methodologies and metrics for the testing and analysis of distributed denial of service attacks and defenses

    Microsoft Academic Search

    Stephen Schwab; Brett Wilson; Roshan Thomas

    2005-01-01

    In this paper, we describe our ongoing efforts to develop methodologies and metrics for the testing and analysis of distributed denial of service (DDoS) attacks and defenses as part of the Evaluation Methods for Internet Security Technologies (EMIST) project funded by the Department of Homeland Security (DHS) and the National Science Foundation (NSF). The EMIST project in turn makes use

  18. Human-Automated Judge Learning: A Methodology for Examining Human Interaction With Information Analysis Automation

    Microsoft Academic Search

    Ellen J. Bass; Amy R. Pritchett

    2008-01-01

    Human-automated judge learning (HAJL) is a methodology providing a three-phase process, quantitative measures, and analytical methods to support design of information analysis automation. HAJL's measures capture the human and automation's judgment processes, relevant features of the environment, and the relationships between each. Specific measures include achievement of the human and the automation, conflict between them, compromise and adaptation by the

  19. Knowledge Consolidation Analysis: Toward a Methodology for Studying the Role of Argument in Technology Development

    ERIC Educational Resources Information Center

    Dyehouse, Jeremiah

    2007-01-01

    Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…

  20. Applications of Tutoring Systems in Specialized Subject Areas: An Analysis of Skills, Methodologies, and Results.

    ERIC Educational Resources Information Center

    Heron, Timothy E.; Welsch, Richard G.; Goddard, Yvonne L.

    2003-01-01

    This article reviews how tutoring systems have been applied across specialized subject areas (e.g., music, horticulture, health and safety, social interactions). It summarizes findings, provides an analysis of skills learned within each tutoring system, identifies the respective methodologies, and reports relevant findings, implications, and…

  1. METHODOLOGY FOR LANGUAGE ANALYSIS AND GENERATION

    E-print Network

    Cardeñosa, Jesús

    ACM Classification Keywords: I.2.7 Natural Language Processing. Conference: the paper is selected from Advanced Research in Artificial Intelligence. be described as mechanic or routine. Literary texts, or natural language in general, escape such efforts

  2. Integrated cation–anion\\/volatile fluid inclusion analysis by gas and ion chromatography; methodology and examples

    Microsoft Academic Search

    D. M. DeR Channer; C. J Bray; E. T. C Spooner

    1999-01-01

    Combined gas and ion chromatographic analysis of well-characterized, small (~1 g) fluid inclusion-bearing samples is a powerful, but simple, means for obtaining integrated fluid concentrations of major and trace, volatile and ionic fluid constituents without using microthermometrically determined salinity for normalization. The methodology, which is described and assessed in detail, involves crushing a carefully cleaned sample at ~105°C in

  3. A New Agenda in (Critical) Discourse Analysis: Theory, Methodology and Interdisciplinarity

    Microsoft Academic Search

    Li Songqing

    2005-01-01

    As part of the Discourse Approaches to Politics, Society and Culture series, this volume mainly introduces the interdisciplinary and multi-methodological character of (critical) discourse analysis (CDA). Some of the contributions were further developed through discussion at the workshop on 'New Agenda in CDA' that was convened in 2003 at the University of Vienna. The intention of the editors was to

  4. State-of-the-art sustainability analysis methodologies for efficient decision support in green production operations

    Microsoft Academic Search

    Shaofeng Liu; Mike Leat; Melanie Hudson Smith

    2011-01-01

    Over the last three decades, new concepts, strategies, frameworks and systems have been developed to tackle the sustainable development issue. This paper reviews the challenges, perspectives and recent advances in support of sustainable production operations decision-making. The aim of this review is to provide a holistic understanding of advanced scientific analysis methodologies for the evaluation of sustainability, to provide efficient

  5. ELECTRICAL SIMULATION METHODOLOGY DEDICATED TO EMC DIGITAL CIRCUITS EMISSIONS ANALYSIS ON PCB

    E-print Network

    Paris-Sud XI, Université de

    Printed Circuit Board (PCB) and some applications for EMC-oriented simulations. In a first part, measurements on a test PCB with a single digital component are presented to analyse and conclude about different

  6. Methodological Advances in the Analysis of Individual Growth with Relevance to Education Policy.

    ERIC Educational Resources Information Center

    Kaplan, David

    2002-01-01

    Demonstrates how recent methodological developments in the analysis of individual growth can inform important problems in education policy, focusing on growth mixture modeling and applying growth mixture modeling to data from the Early Childhood Longitudinal Study-Kindergarten class of 1998-99 to investigate the effects of full- and part-day…

  7. Applications of the DACUM Occupational Analysis Methodology to Health Occupations Education.

    ERIC Educational Resources Information Center

    O'Brien, Terrance P.

    1989-01-01

    Addresses the potential value of the Developing a Curriculum (DACUM) occupational analysis methodology in curriculum development for health occupations education programs. Presents strengths and weaknesses of using DACUM in the context of traditional procedures and applications to health-related programs. (JOW)

  8. Methodological Uses of TUS to Inform Design and Analysis of Tobacco Control Surveys

    Cancer.gov

    Cristine Delnevo, PhD, MPH, UMDNJ-School of Public Health. Why is methods research in Tobacco Surveillance important? Measuring individual behavior over time is crucial

  9. A methodology for root cause analysis of poor performance in fixed-wireless data networks

    Microsoft Academic Search

    Dogu Arifler

    2007-01-01

    In fixed-wireless data networks, poor performance experienced by users, such as excessive delays during file transfers, might be due to a heavily utilized base station or due to the location of the users relative to the base station. A principal component analysis based methodology that may be used by content providers for analyzing the root cause of performance problems is

  10. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    ERIC Educational Resources Information Center

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  11. Development of a new methodology for stability analysis in BWR NPP

    SciTech Connect

    Garcia-Fenoll, M.; Abarca, A.; Barrachina, T.; Miro, R.; Verdu, G. [Inst. for Industrial, Radiophysical and Environmental Safety ISIRYM, Universitat Politecnica de Valencia, Cami de Vera s/n, 46021 Valencia (Spain)]

    2012-07-01

    In this work, a new methodology to reproduce power oscillations in BWR NPPs is presented. The methodology comprises modal analysis techniques, signal analysis techniques, and simulation with the coupled code RELAP5/PARCSv2.7. Macroscopic cross sections are obtained with the SIMTAB methodology, which is fed with CASMO-4/SIMULATE-3 data. The input files for the neutronic and thermalhydraulic codes are generated automatically, and the thermalhydraulic-to-neutronic mapping is based on the shapes of the fundamental, first, and second harmonics of the reactor power, calculated with the VALKIN code (developed at UPV). This mapping was chosen so as not to condition the oscillation pattern. To introduce power oscillations in the simulation, a new capability has been implemented in the coupled code to generate density perturbations (for the whole core or for chosen axial levels) according to the power mode shapes. The purpose of the methodology is to reproduce the driving mechanism of the out-of-phase oscillations that appear in BWR-type reactors. In this work, the methodology is applied to the Record 9 point of the NEA benchmark of the Ringhals 1 NPP. A set of different perturbations is induced in the first active axial level, and the resulting LPRM signals are analyzed. (authors)
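
    As background to the stability question this methodology addresses, BWR power oscillations are commonly characterized by a decay ratio (DR < 1 means the oscillation is damped). A minimal sketch on a synthetic signal, not the RELAP5/PARCS or VALKIN tooling:

```python
import math

def decay_ratio(signal):
    """Estimate the decay ratio of an oscillatory signal as the ratio of
    successive local-maximum amplitudes (a common BWR stability figure of
    merit: DR < 1 indicates a damped oscillation)."""
    peaks = [signal[i] for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1]]
    if len(peaks) < 2:
        raise ValueError("need at least two oscillation peaks")
    return peaks[1] / peaks[0]

# Synthetic damped power oscillation (illustrative damping and frequency)
dt, zeta, freq = 0.01, 0.05, 0.5          # s, damping ratio, Hz
w = 2 * math.pi * freq
sig = [math.exp(-zeta * w * t) * math.sin(w * math.sqrt(1 - zeta ** 2) * t)
       for t in (i * dt for i in range(2000))]
dr = decay_ratio(sig)
# For a second-order system, DR = exp(-2*pi*zeta / sqrt(1 - zeta^2)) ~ 0.73 here
```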

  12. [Heliogeophysical factors and aviation accidents].

    PubMed

    Komarov, F I; Oraevskii, V N; Sizov, Iu P; Tsirul'nik, L B; Kanonidi, Kh D; Ushakov, I B; Shalimov, P M; Kimlyk, M V; Glukhov, D V

    1998-01-01

    It was shown by two independent methods that there is a certain correlation between the number of aviation accidents and heliogeophysical factors. The statistical and spectral analyses of time series of heliogeomagnetic factors and the number of aviation accidents in 1989-1995 showed that, of 216 accidents, 58% are related to sudden geomagnetic storms. A similar relation was revealed for aviation catastrophes (64% of 86 accidents) and emergencies (54% of 130 accidents) that coincided in time with heliogeomagnetic storms. General periodicities of the series were revealed by the method of spectral analysis, namely, cycles of 30, 42, 46, 64, 74, 83, 99, 115, 143, 169, and 339 days, which supports a causative relation between the number of aviation accidents and heliogeomagnetic factors. It is assumed that some aviation accidents that coincided in time with geomagnetic storms are due to changes in the professional abilities of pilots who were in the zone of the storms. PMID:9783079
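
    The spectral-analysis step described here, extracting dominant periodicities from a time series, can be sketched with a plain DFT periodogram; the daily counts below are synthetic, not the authors' data:

```python
import cmath
import math

def dominant_period(series):
    """Return the period (in samples) of the strongest non-zero-frequency
    component of a real time series, via a plain DFT periodogram."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]  # remove the DC component first
    best_k, best_p = 1, 0.0
    for k in range(1, n // 2 + 1):
        coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        power = abs(coeff) ** 2
        if power > best_p:
            best_k, best_p = k, power
    return n / best_k

# Synthetic daily accident counts with a hidden ~30-day cycle (illustrative)
days = 360
counts = [5 + 2 * math.sin(2 * math.pi * d / 30) for d in range(days)]
period = dominant_period(counts)
```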

  13. PWR (Pressurized Water Reactor) interfacing system LOCAs (loss-of-coolant accidents): Analysis of risk reduction alternatives

    SciTech Connect

    Bozoki, G.; Kohut, P.; Fitzpatrick, R.

    1988-01-01

    This analysis suggests that the most cost-effective method to reduce the risk due to interfacing system loss-of-coolant accidents (ISLs) would be to establish a minimum testing frequency for pressure isolation valves. The suggested minimum frequency would be to perform leak testing of the pressure isolation valves at each refueling and after specific individual valve maintenance. In addition, it appears that the tests could be performed during descent from power without significantly increasing the risk of an ISL event, while effecting considerable cost savings for the utilities.

  14. Hazards and accident analyses, an integrated approach, for the Plutonium Facility at Los Alamos National Laboratory

    SciTech Connect

    Pan, P.Y.; Goen, L.K.; Letellier, B.C.; Sasser, M.K.

    1995-07-01

    This paper describes an integrated approach to performing hazards and accident analyses for the Plutonium Facility at Los Alamos National Laboratory. A comprehensive hazards analysis methodology was developed that extends the scope of the preliminary/process hazard analysis methods described in the AIChE Guidelines for Hazard Evaluations. Results from the semi-quantitative approach constitute a full spectrum of hazards. Each accident scenario identified is assigned a bin for event likelihood and consequence severity, and each is analyzed for four possible sectors (workers, on-site personnel, the public, and the environment). A screening process was developed to link the hazard analysis to the accident analysis: the 840 accident scenarios were screened down to about 15 for a more thorough deterministic analysis to define the operational safety envelope. The mechanics of the screening process and the selection of final scenarios for each representative accident category (fire, explosion, criticality, and spill) are described.

  15. A substrate noise analysis methodology for large-scale mixed-signal ICs

    Microsoft Academic Search

    Wen Kung Chu; Nishath Verghese; Heayn-Jun Chol; Kenji Shimazaki; Hiroyuki Tsujikawa; Shouzou Hirano; Shirou Doushoh; Makoto Nagata; Atsushi Iwata; Takafumi Ohmoto

    2003-01-01

    A substrate noise analysis methodology is described that simulates substrate noise waveforms at sensitive locations of large-scale mixed-signal ICs. Simulation results for a 7.3 mm×7.3 mm chip with 500 k devices, obtained in a few hours on an engineering server, show good correlation with silicon measurements as testing conditions are varied. An analysis of the substrate and package reveals the

  16. Engaging Students in the Selection and Application of Analysis Methodologies for Scientifically Valid Interpretation of Observations

    NASA Astrophysics Data System (ADS)

    Thiebaux, H.

    2005-12-01

    How can we engage students in scientifically valid interpretation of observations? This presentation describes an approach to teaching/learning that begins by tapping the "curiosity sets" of the students enrolled in a course. Classroom experience should lead to full student involvement. Examples of final seminar presentations by students whose learning processes were engaged by this method will be described. (From a course on "Analysis of Spatially-Indexed Observations: keys to the selection, understanding and utilization of spatial analysis methodologies")

  17. Task Analysis Based Methodology for the Design of Face to Face Computer Supported Collaborative Learning Activities

    Microsoft Academic Search

    Maria Francisca Capponi; Miguel Nussbaum; María Ester Lagos

    2006-01-01

    This paper shows how Task Analysis can be a powerful tool for the design of collaborative applications supported by wirelessly interconnected handhelds. We define a methodology for the design of such activities. It basically consists in performing a Task Analysis on an Interaction Model to obtain the set of all possible interactions between actors. Then a class of activities is

  18. Safety climate in an automobile manufacturing plant : The effects of work environment, job communication and safety attitudes on accidents and unsafe behaviour

    Microsoft Academic Search

    Sharon Clarke

    2006-01-01

    Purpose – The study aims to examine the safety attitudes of workers, supervisors and managers in a UK-based car manufacturing plant, and their relationship with unsafe behaviour and accidents. Design/methodology/approach – A questionnaire methodology is used to measure safety attitudes and perceptions. The data are analysed using factor analysis and hierarchical multiple regression. Findings – The factor structure of the

  19. Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents.

    SciTech Connect

    Majumdar, S.; Diercks, D. R.; Shack, W. J.; Energy Technology

    2002-05-01

    This report summarizes analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes and erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests on tubes with machined flaws. A three-dimensional computational fluid dynamics code was used to calculate the jet velocity impinging on neighboring tubes as a function of tube spacing and crack-opening area. Erosion tests were conducted in a high-temperature, high-velocity erosion rig at the University of Cincinnati, using micrometer-sized nickel particles mixed in with high-temperature gas from a burner. The erosion results, together with analytical models, were used to estimate the erosive effects of superheated steam with entrained aerosols from the core during severe accidents.

  20. Methodology of a combined ground based testing and numerical modelling analysis of supersonic combustion flow paths

    NASA Astrophysics Data System (ADS)

    Hannemann, Klaus; Karl, Sebastian; Martinez Schramm, Jan; Steelant, Johan

    2010-10-01

    In the framework of the European Commission co-funded LAPCAT (Long-Term Advanced Propulsion Concepts and Technologies) project, the methodology of a combined ground-based testing and numerical modelling analysis of supersonic combustion flow paths was established. The approach is based on free jet testing of complete supersonic combustion ramjet (scramjet) configurations consisting of intake, combustor and nozzle in the High Enthalpy Shock Tunnel Göttingen (HEG) of the German Aerospace Center (DLR) and computational fluid dynamics studies utilising the DLR TAU code. The capability of the established methodology is demonstrated by applying it to the flow path of the generic HyShot II scramjet flight experiment configuration.

  1. Third annual Warren K. Sinclair keynote address: retrospective analysis of impacts of the Chernobyl accident.

    PubMed

    Balonov, Mikhail

    2007-11-01

    The accident at the Chernobyl Nuclear Power Plant in 1986 was the most severe in the history of the nuclear industry, causing a huge release of radionuclides over large areas of Europe. The recently completed Chernobyl Forum concluded that after a number of years, along with reduction of radiation levels and accumulation of humanitarian consequences, severe social and economic depression of the affected regions and associated psychological problems of the general public and the workers had become the most significant problem to be addressed by the authorities. The majority of the >600,000 emergency and recovery operation workers and five million residents of the contaminated areas in Belarus, Russia, and Ukraine received relatively minor radiation doses which are comparable with the natural background levels. An exception is a cohort of several hundred emergency workers who received high radiation doses and of whom 28 persons died in 1986 due to acute radiation sickness. Apart from the dramatic increase in thyroid cancer incidence among those exposed to radioiodine at a young age and some increase of leukemia in the most exposed workers, there is no clearly demonstrated increase in the somatic diseases due to radiation. There was, however, an increase in psychological problems among the affected population, compounded by the social disruption that followed the break-up of the Soviet Union. Despite the unprecedented scale of the Chernobyl accident, its consequences on the health of people are far less severe than those of the atomic bombings of the cities of Hiroshima and Nagasaki. Studying the consequences of the Chernobyl accident has made an invaluable scientific contribution to the development of nuclear safety, radioecology, radiation medicine and protection, and also the social sciences. The Chernobyl accident initiated the global nuclear and radiation safety regime. PMID:18049216

  2. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II

    NASA Astrophysics Data System (ADS)

    Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

    2014-03-01

    The present part of the publication (Part II) deals with the long-range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear Test Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131 and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq, emitted during the explosions of units 1, 2 and 3. The estimated total source term is compared with a core inventory of about 8 × 10^18 Bq at the time of reactor shutdown. This result suggests that at least 80% of the core inventory was released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. Neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) could be estimated to be 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10% of the Chernobyl accident releases for I-131 and Cs-137.

  4. Heat Transfer Issues in Finite Element Analysis of Bounding Accidents in PPCS Models

    SciTech Connect

    Pampin, R.; Karditsas, P.J. [Culham Science Centre (United Kingdom)

    2005-05-15

    Modelling of temperature excursions in structures of conceptual power plants during hypothetical worst-case accidents has been performed within the European Power Plant Conceptual Study (PPCS). A new, 3D finite elements (FE) based tool, coupling the different calculations to the same tokamak geometry, has been extensively used to conduct the neutron transport, activation and thermal analyses for all PPCS plant models. During a total loss of cooling, the usual assumption for the bounding accident, passive removal of the decay heat from activated materials depends on conduction and radiation heat exchange between components. This paper presents and discusses results obtained during the PPCS bounding accident thermal analyses, examining the following issues: (a) radiation heat exchange between the inner surfaces of the tokamak, (b) the presence of air within the cryostat volume, and the heat flow arising from the circulation pattern provided by temperature differences between various parts, and (c) the thermal conductivity of pebble beds, and its degradation due to exposure to neutron irradiation, affecting the heat transfer capability and thermal response of a blanket based on these components.

  5. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    SciTech Connect

    D. A. Brownson

    2002-09-26

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meetings (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002).

  6. Analysis of Japanese radionuclide monitoring data of food before and after the Fukushima nuclear accident.

    PubMed

    Merz, Stefan; Shozugawa, Katsumi; Steinhauser, Georg

    2015-03-01

    In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contamination in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, accumulating mushrooms and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima Cs-137 and Sr-90 levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, with meat typically higher in Cs-137 and vegetarian produce usually higher in Sr-90. The correlation of background radiostrontium and radiocesium indicates that the regulatory assumption made after the Fukushima accident, that the maximum activity of Sr-90 is 10% of the respective Cs-137 concentration, may soon be at risk, as the Sr-90/Cs-137 ratio increases with time. This should be taken into account in current Japanese food policy, as the current regulation will soon underestimate the Sr-90 content of Japanese foods. PMID:25621976

  7. SACO-1: a fast-running LMFBR accident-analysis code

    SciTech Connect

    Mueller, C.J.; Cahalan, J.E.; Vaurio, J.K.

    1980-01-01

    SACO is a fast-running computer code that simulates hypothetical accidents in liquid-metal fast breeder reactors to the point of permanent subcriticality or to the initiation of a prompt-critical excursion. In the tradition of the SAS codes, each subassembly is modeled by a representative fuel pin with three distinct axial regions to simulate the blanket and core regions. However, analytic and integral models are used wherever possible to cut down computing time and storage requirements. The physical models and basic equations are described in detail. Comparisons of SACO results with analogous SAS3D results constitute the qualification of SACO and are illustrated and discussed.

  8. Resolve! Version 2.5: Flammable Gas Accident Analysis Tool Acceptance Test Plan and Test Results

    SciTech Connect

    LAVENDER, J.C.

    2000-10-17

    RESOLVE! Version 2.5 is designed to quantify the risk and uncertainty of combustion accidents in double-shell tanks (DSTs) and single-shell tanks (SSTs). The purpose of the acceptance testing is to ensure that all of the options and features of the computer code run; to verify that the calculated results are consistent with each other; and to evaluate the effects of changes to the parameter values on the frequency and consequence trends associated with flammable gas deflagrations or detonations.

  9. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    PubMed Central

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Patiño, David; Collazo, Joaquín

    2010-01-01

    Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition. PMID:20559506
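
    The two propagation-of-error approaches the abstract compares, first-order (Taylor) propagation and Monte Carlo sampling, can be contrasted on a toy quantity; the heating-value relation and the uncertainty values below are hypothetical, not the paper's data:

```python
import math
import random

# Toy quantity of interest: heating value per unit mass of wet fuel,
# q = H * (1 - w), with H = dry-basis heating value and w = moisture
# fraction (hypothetical relation, chosen for illustration).
H, sH = 19.0, 0.5    # MJ/kg, standard uncertainty
w, sw = 0.10, 0.02   # moisture fraction, standard uncertainty

# Approach 1: first-order (Taylor) propagation
# dq/dH = 1 - w,  dq/dw = -H  (inputs assumed independent)
s_taylor = math.sqrt(((1 - w) * sH) ** 2 + (H * sw) ** 2)

# Approach 2: Monte Carlo propagation
random.seed(1)
draws = [random.gauss(H, sH) * (1 - random.gauss(w, sw))
         for _ in range(200_000)]
mean = sum(draws) / len(draws)
s_mc = math.sqrt(sum((d - mean) ** 2 for d in draws) / (len(draws) - 1))
```

    For a nearly linear model like this one, the two estimates agree closely; Monte Carlo becomes the safer choice when the model is strongly nonlinear or the inputs are correlated.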

  10. Applying sequential injection analysis (SIA) and response surface methodology for optimization of Fenton-based processes

    Microsoft Academic Search

    2009-01-01

    This work presents the use of sequential injection analysis (SIA) and response surface methodology as a tool for optimization of Fenton-based processes. Alizarin Red S dye (C.I. 58005) was used as a model compound for the anthraquinone family, whose pigments are widely used in the coatings industry. The following factors were considered: the [H2O2]:[Alizarin] and [H2O2]:[FeSO4] ratios and pH. The

  11. A multi-scale segmentation\\/object relationship modelling methodology for landscape analysis

    Microsoft Academic Search

    C. Burnett; Thomas Blaschke

    2003-01-01

    Natural complexity can best be explored using spatial analysis tools based on concepts of landscape as process continuums that can be partially decomposed into objects or patches. We introduce a five-step methodology based on multi-scale segmentation and object relationship modelling. Hierarchical patch dynamics (HPD) is adopted as the theoretical framework to address issues of heterogeneity, scale, connectivity and quasi-equilibriums in

  12. Efficient Full-Wave Method of Moments Analysis and Design Methodology for Radial Line Planar Antennas

    Microsoft Academic Search

    Oleg Becker; Reuven Shavit

    2011-01-01

    An efficient full-wave MoM analysis and a design methodology for radial line planar antennas (RLPA) is presented. The feeding network of the antenna is solved by using the appropriate Green's function defined for the problem. Filling of the Z-matrices is considerably simplified due to the analytical formulation of the MoM. Also, assumptions made on the minimum size of

  13. Novel data-mining methodologies for adverse drug event discovery and analysis.

    PubMed

    Harpaz, R; DuMouchel, W; Shah, N H; Madigan, D; Ryan, P; Friedman, C

    2012-06-01

    An important goal of the health system is to identify new adverse drug events (ADEs) in the postapproval period. Data-mining methods that can transform data into meaningful knowledge to inform patient safety have proven essential for this purpose. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used to support ADE discovery and analysis. PMID:22549283
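
    As background, one classical disproportionality statistic for ADE signal detection is the reporting odds ratio computed from a 2x2 table of spontaneous reports (the article surveys newer methods; the counts below are hypothetical):

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """Reporting odds ratio (ROR) from a 2x2 table of spontaneous reports:
         a = drug of interest & event of interest
         b = drug of interest & all other events
         c = all other drugs  & event of interest
         d = all other drugs  & all other events
    Returns the ROR and its approximate 95% confidence interval."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, (lo, hi)

# Hypothetical report counts (illustrative only)
ror, ci = reporting_odds_ratio(40, 960, 200, 49_800)
# A lower confidence bound above 1 is commonly read as a potential signal
```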

  14. Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2

    SciTech Connect

    Evans, J.S. [Harvard School of Public Health, Boston, MA (United States); Abrahmson, S. [Wisconsin Univ., Madison, WI (United States); Bender, M.A. [Brookhaven National Lab., Upton, NY (United States); Boecker, B.B.; Scott, B.R. [Inhalation Toxicology Research Inst., Albuquerque, NM (United States); Gilbert, E.S. [Battelle Pacific Northwest Lab., Richland, WA (United States)

    1993-10-01

    This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP, and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, x-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
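
    The recommended model forms can be sketched directly; the parameter values below are illustrative and are not the NUREG/CR-4214 parameters:

```python
import math

def weibull_risk(dose, d50, shape):
    """Weibull dose-response for early (deterministic) effects:
    risk = 1 - exp(-ln2 * (dose/D50)^shape), so risk(D50) = 0.5.
    D50 and shape here are illustrative, not the report's values."""
    if dose <= 0:
        return 0.0
    return 1 - math.exp(-math.log(2) * (dose / d50) ** shape)

def linear_quadratic_excess(dose, alpha, beta):
    """Linear-quadratic excess-risk form for stochastic effects:
    excess = alpha*D + beta*D^2 (alpha, beta illustrative)."""
    return alpha * dose + beta * dose ** 2

r50 = weibull_risk(3.0, d50=3.0, shape=5.0)  # exactly 0.5 at D50
```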

  15. [Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; it has been extended to apply to ordinary differential systems of the type encountered in control, in joint work with the PI and M. Oberguggenberger. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u); and (2) the discrete lattice dynamics of cellular automata (CA). Most of the CA literature involves a discretization of a partial differential equation (PDE) system of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as discretizations of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

  16. [Analysis of traffic accident fatalities in autopsy material collected in the years 2007-2008 at the Department of Forensic Medicine, Medical University of Białystok].

    PubMed

    Szeremeta, Michał; Niemcunowicz-Janica, Anna; Sackiewicz, Adam; Ptaszyńska-Sarosiek, Iwona

    2009-01-01

    The objective of the paper was an analysis of traffic accident fatalities in autopsy material collected at the Department of Forensic Medicine, Medical University of Białystok in the years 2007-2008. The analysis was carried out in 209 traffic accident victims included in the total number of 876 autopsies. Based on autopsy reports, the main criteria included gender, site of death (urban area, rural area, non-built-up), mode of movement (driver, passenger, pedestrian, biker and cyclist), age, location of fatal injuries (head, thorax, abdomen and pelvis, upper and lower limbs, multiorgan injuries) and blood alcohol level. The collected data were analyzed statistically according to the above listed criteria and presented as text and graphs. In the years 2007-2008, 209 individuals died in traffic accidents in the Podlasie Region, accounting for 24% of all autopsies. Male victims accounted for 76% of fatalities, while females constituted 24%. Drivers predominated among traffic accident fatalities in the Podlasie Region, at 38%; the corresponding shares for pedestrians, passengers, and motorbike drivers and bikers were 29%, 26% and 7%, respectively. Regardless of the time period, fatalities predominated in non-built-up areas, at 48.5%. The location of fatal injuries in the respective groups was similar, with a preponderance of multiorgan injuries. The mean age of traffic accident fatalities was 41 years, predominantly due to an increasing number of fatal cases among young individuals. In the years 2007-2008, the percentage of traffic accident victims under the influence of alcohol was 45% among males and 12% among females. The mean blood alcohol level of victims was 2.1 per mille in males and 1.6 per mille in females. PMID:20441075

  17. Finite-element methodology for thermal analysis of convectively cooled structures

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A finite-element method for steady-state thermal analysis of convectively cooled structures is presented. The method is based on representing the coolant passages by finite elements with fluid bulk temperature nodes and fluid/structure interface nodes. Four finite elements are described: two basic elements (mass transport and surface convection) and two integrated elements for application to discrete-tube and plate-fin convectively cooled structural configurations. Comparative finite-element and lumped-parameter thermal analyses of several convectively cooled structures demonstrate the practicality of utilizing finite-element methodology for thermal analysis of realistic convectively cooled structures.
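
    The surface-convection element idea can be illustrated with a minimal 1-D steady-conduction sketch: a rod held at T0 at one end losing heat by convection to ambient at the other. Geometry and material properties below are illustrative, not from the report:

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    A = [row[:] for row in A]
    b = b[:]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        b[c], b[p] = b[p], b[c]
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            for cc in range(c, n):
                A[r][cc] -= f * A[c][cc]
            b[r] -= f * b[c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][cc] * x[cc] for cc in range(r + 1, n))) / A[r][r]
    return x

def solve_rod(k, L, h, T0, T_inf, n_el):
    """Assemble linear conduction elements plus one surface-convection
    'element' at the tip, then solve for nodal temperatures."""
    n = n_el + 1
    le = L / n_el
    K = [[0.0] * n for _ in range(n)]
    F = [0.0] * n
    for e in range(n_el):  # element stiffness (k/le) * [[1, -1], [-1, 1]]
        K[e][e] += k / le
        K[e][e + 1] -= k / le
        K[e + 1][e] -= k / le
        K[e + 1][e + 1] += k / le
    K[n - 1][n - 1] += h            # convection adds h to the tip node
    F[n - 1] += h * T_inf           # ... and h*T_inf to the load vector
    K[0] = [1.0] + [0.0] * (n - 1)  # Dirichlet condition T(0) = T0
    F[0] = T0
    return gauss_solve(K, F)

T = solve_rod(k=15.0, L=0.1, h=50.0, T0=500.0, T_inf=300.0, n_el=4)
# Analytic tip temperature: (k/L*T0 + h*T_inf) / (k/L + h) = 450 K here
```

    With linear elements, pure 1-D conduction is reproduced exactly, so the computed tip temperature matches the analytic value regardless of mesh density.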

  18. Radiological health effects models for nuclear power plant accident consequence analysis.

    PubMed

    Evans, J S; Moeller, D W

    1989-04-01

    Improved health effects models have been developed for assessing the early effects, late somatic effects and genetic effects that might result from low-LET radiation exposures to populations following a major accident in a nuclear power plant. All the models have been developed in such a way that the dynamics of population risks can be analyzed. Estimates of life years lost and the duration of illnesses were generated and a framework recommended for summarizing health impacts. Uncertainty is addressed by providing models for upper, central and lower estimates of most effects. The models are believed to be a significant improvement over the models used in the U.S. Nuclear Regulatory Commission's Reactor Safety Study, and they can easily be modified to reflect advances in scientific understanding of the health effects of ionizing radiation. PMID:2925380

  19. Quantitative analysis of the effect of complex internals on LMFBR containment during energetic accidents

    SciTech Connect

    Zeuch, W.R.; Wang, C.Y.

    1985-01-01

    This paper discusses the effects of complex internals on the containment response of large LMFBRs during energetic accidents. Results of a series of analyses with the ALICE-II code demonstrate quantitative structural and hydrodynamic effects from parametric variation of reactor internal designs. Effects of various upper internal structure treatments, structural stiffness of the upper internal structure and core support structure, and the location and dimensions of internal components are examined. Results indicate that reduction of primary containment loads can be accomplished through such means as confinement of the core region and avoiding over-strengthened, rigid internals. A study of the beneficial and adverse parameters involved in primary containment response should be helpful in optimizing designs for safety purposes.

  20. Source terms analysis of a maximum release accident for an AGN-201M reactor

    SciTech Connect

    Brumburgh, G.P.; Heger, A.S. (Univ of New Mexico, Albuquerque (United States))

    1991-01-01

    The fundamental liability of any nuclear reactor is the possibility of exposing the public and the environment to an excessive level of nuclear radiation. In a previous paper, the authors addressed the risk and potential vulnerability assessment of a maximum hypothetical release accident (MHRA) for the AGN-201M reactor at the University of New Mexico. The MHRA is defined as the total release of all radiological effluents from the reactor facility to the environment. A Level 1 probabilistic risk assessment was performed to assess the risk to the public. The type of effluents, total activity, maximum exposure rate, and related health effects associated with an MHRA were analyzed to identify the source term and its consequences. The source term was characterized for the worst-case scenario only, because the magnitude of the released effluents is deemed ineffectual for any subcategory release.

  1. Analysis of risk reduction methods for interfacing system LOCAs (loss-of-coolant accidents) at PWRs

    SciTech Connect

    Bozoki, G.; Kohut, P.; Fitzpatrick, R.

    1988-01-01

    The Reactor Safety Study (WASH-1400) predicted that Interfacing System Loss-of-Coolant Accidents (ISL) events were significant contributors to risk even though they were calculated to be relatively low frequency events. However, there are substantial uncertainties involved in determining the probability and consequences of the ISL sequences. For example, the assumed valve failure modes, common cause contributions and the location of the break/leak are all uncertain and can significantly influence the predicted risk from ISL events. In order to provide more realistic estimates for the core damage frequencies (CDFs) and a reduction in the magnitude of the uncertainties, a reexamination of ISL scenarios at PWRs has been performed by Brookhaven National Laboratory. The objective of this study was to investigate the vulnerability of pressurized water reactor designs to ISLs and identify any improvements that could significantly reduce the frequency/risk of these events.

  2. TRANSIENT ACCIDENT ANALYSIS OF THE GLOVEBOX SYSTEM IN A LARGE PROCESS ROOM

    SciTech Connect

    Lee, S

    2008-01-11

    Local transient hydrogen concentrations were evaluated inside a large process room for three postulated accident scenarios in which hydrogen gas was released through process tank leakage and fire, leading to a loss of gas confinement. The three cases considered in this work were fire in the room, loss of confinement from a process tank, and loss of confinement coupled with a fire event. Based on these accident scenarios in a large, unventilated process room, modeling calculations of hydrogen migration were performed to estimate local transient hydrogen concentrations due to sudden leakage and release from a glovebox system associated with the process tank. The modeling domain represented the major features of the process room, including the principal release or leakage source of the gas storage system. The model was benchmarked against literature results for key phenomena such as natural convection, turbulent behavior, gas mixing due to jet entrainment, and radiation cooling, because these phenomena are closely related to the gas driving mechanisms within the large air space of the process room. The modeling results showed that at the corner of the process room, the gas concentrations resulting from the Case 2 and Case 3 scenarios reached the set-point value of the high-activity alarm in about 13 seconds, while the Case 1 scenario took about 90 seconds to reach that concentration. The modeling results were used to estimate transient radioactive gas migration in an enclosed process room equipped with a high-activity alarm monitor when the postulated leakage scenarios occur without room ventilation.

  3. A Gap Analysis Methodology for Collecting Crop Genepools: A Case Study with Phaseolus Beans

    PubMed Central

    Ramírez-Villegas, Julián; Khoury, Colin; Jarvis, Andy; Debouck, Daniel Gabriel; Guarino, Luigi

    2010-01-01

    Background The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. Methodology/Principal Findings The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to lack of, or under-representation in, genebanks, 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap “hotspots”, representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to spatial collecting priorities. Conclusions/Significance Results of the gap analysis method mostly align very well with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including in the analysis predictive threat factors, such as climate change or habitat destruction, or by adding additional prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources. PMID:20976009
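    The prioritization logic described above (combine sampling, geographic, and environmental gap scores into a final priority score, then bin taxa into collecting-priority categories) can be sketched as follows. The 0-10 scale, the equal-weight average, and the category thresholds are illustrative assumptions, not necessarily the paper's exact values:

```python
# Hypothetical gap-analysis scoring: each taxon gets a sampling (SRS),
# geographic (GRS) and environmental (ERS) representativeness score on a
# 0-10 scale, where low scores mean poor ex situ representation.
def final_priority_score(srs, grs, ers):
    """Equal-weight average of the three representativeness scores."""
    return (srs + grs + ers) / 3.0

def priority_category(fps):
    """Bin a final priority score into a collecting-priority category."""
    if fps <= 3.0:
        return "high"
    elif fps <= 5.0:
        return "medium"
    elif fps <= 7.5:
        return "low"
    return "adequately represented"

# Two hypothetical taxa (names and scores are invented for illustration):
taxa = {
    "Phaseolus sp. A": (1.0, 2.5, 3.0),   # barely collected anywhere
    "Phaseolus sp. B": (8.0, 9.0, 8.5),   # well covered in genebanks
}
for name, scores in taxa.items():
    print(name, "->", priority_category(final_priority_score(*scores)))
```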

  4. Analysis of loss-of-piping integrity accidents in pool-type LMFBRs using SSC-P

    SciTech Connect

    Khatib-Rahbar, M.; Cazzoli, E.G.; Madni, I.K.

    1981-01-01

    A need exists for a generalized system simulation code that could serve as a readily available tool for the regulatory body in the safety analysis of pool-type LMFBR plants. For this reason, SSC-P, a version in the Super System Code series, has been developed. SSC-P is a generalized computer program capable of simulating a variety of operational, incidental, and accidental transients, with particular emphasis on transients involving the plant protection and plant control systems. In this paper, application of the SSC-P code to analyze loss-of-piping-integrity (LOPI) accidents in pool-type plants is discussed; some parallel comparisons are made to loop-type systems, using the PHENIX and CRBRP systems as reference designs.

  5. Modeling and analysis of core debris recriticality during hypothetical severe accidents in the Advanced Neutron Source Reactor

    SciTech Connect

    Taleyarkhan, R.P.; Kim, S.H.; Slater, C.O.; Moses, D.L.; Simpson, D.B.; Georgevich, V.

    1993-05-01

    This paper discusses salient aspects of severe-accident-related recriticality modeling and analysis in the Advanced Neutron Source (ANS) reactor. The development of an analytical capability using the KENO V.A-SCALE system is described including evaluation of suitable nuclear cross-section sets to account for the effects of system geometry, mixture temperature, material dispersion and other thermal-hydraulic conditions. Benchmarking and validation efforts conducted with KENO V.A-SCALE and other neutronic codes against critical experiment data are described. Potential deviations and biases resulting from use of the 16-group Hansen-Roach library are shown. A comprehensive test matrix of calculations to evaluate the threat of a recriticality event in the ANS is described. Strong dependencies on geometry, material constituents, and thermal-hydraulic conditions are described. The introduction of designed mitigative features is described.

  6. MATADOR: a computer code for the analysis of radionuclide behavior during degraded core accidents in light water reactors

    SciTech Connect

    Baybutt, P.; Raghuram, S.; Avci, H.I.

    1985-04-01

    A new computer code called MATADOR (Methods for the Analysis of Transport And Deposition Of Radionuclides) has been developed to replace the CORRAL computer code which was written for the Reactor Safety Study (WASH-1400). This report contains a detailed description of the models used in MATADOR. MATADOR is intended for use in system risk studies to analyze radionuclide transport and deposition in reactor containments. The principal output of the code is information on the timing and magnitude of radionuclide releases to the environment as a result of severely degraded core accidents. MATADOR considers the transport of radionuclides through the containment and their removal by natural deposition and the operation of engineered safety systems such as sprays. The code requires input data on the source term from the primary system, the geometry of the containment, and the thermal-hydraulic conditions in the containment.

  7. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    PubMed

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories result as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition). PMID:24760596
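    The source-quantification step described above can be pictured as a simple mass balance: each source's load is its extent (length, area, or activity level) times a unit emission rate, and source shares follow by normalization. The pollutant (Pb), surfaces, and emission rates below are invented for illustration and are not the study's data:

```python
# Hypothetical Pb source apportionment for an urban catchment:
# source -> (extent, unit emission rate); units are arbitrary but consistent.
sources_pb = {
    "roof accessories": (1200.0, 40.0),   # gutters, joints, valleys (m)
    "roof surfaces":    (50000.0, 0.2),   # roofing sheet area (m2)
    "traffic":          (228.0, 5.0),     # catchment area with traffic (ha)
}

loads = {s: extent * rate for s, (extent, rate) in sources_pb.items()}
total = sum(loads.values())
shares = {s: load / total for s, load in loads.items()}

for source, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{source}: {100 * share:.1f}%")
```

    With these made-up numbers the accessories dominate the Pb load even though their total extent is small, which mirrors the qualitative finding reported in the abstract.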

  8. Fully coupled neutronic and hydrodynamic analysis of space nuclear reactor reentry accidents

    NASA Astrophysics Data System (ADS)

    Buksa, J. J.; Elson, J. S.; Kothe, D. B.; McGhee, J. M.; Morel, J. E.; Perry, R. T.; Rider, W. J.

    A method is described for determining the fully coupled neutronic/hydrodynamic response of a space nuclear reactor upon reentry and impact. The code system NIKE-R/PAGOSA is used to accomplish this. This is in contrast to the typical methodology that computes these responses in a decoupled manner. The use of NIKE-R/PAGOSA will allow the seamless description of the phenomena without direct human intervention. This methodology is enabled by the use of modern supercomputers (CM-200). Two examples of this capability are presented: an impaction event modeled with hydrodynamics only (PAGOSA) with the neutron multiplication constant calculated at selected times, and a fully coupled simulation of a reactor immersed in water with its physical disassembly due to its supercritical configuration.

  9. Personal nuclear accident dosimetry at Sandia National Laboratories

    Microsoft Academic Search

    D. C. Ward; A. H. Mohagheghi; R. Burrows

    1996-01-01

    DOE installations possessing sufficient quantities of fissile material to potentially constitute a critical mass, such that excessive exposure of personnel to radiation from a nuclear accident is possible, are required to provide nuclear accident dosimetry services. This document describes the personal nuclear accident dosimeter (PNAD) used by SNL and prescribes methodologies for initial screening and for processing PNAD results.

  10. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)]; and others

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best available technology for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
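    The validation step described above, sampling uncertain inputs and propagating them through the Gaussian plume model, can be sketched as a small Monte Carlo loop. The plume formula is the standard ground-reflecting form; the lognormal stand-in distributions and release parameters are invented for illustration:

```python
# Monte Carlo propagation of uncertain dispersion parameters through a
# ground-reflecting Gaussian plume model (standard textbook form).
import math
import random

def gaussian_plume(Q, u, sigma_y, sigma_z, y, z, H):
    """Concentration (g/m^3) for source strength Q (g/s), wind speed u (m/s),
    dispersion coefficients sigma_y/sigma_z (m), and release height H (m)."""
    return (Q / (2.0 * math.pi * u * sigma_y * sigma_z)
            * math.exp(-y ** 2 / (2.0 * sigma_y ** 2))
            * (math.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2))
               + math.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2))))

random.seed(0)
Q, u, H = 1.0, 5.0, 50.0            # assumed release rate, wind, stack height
samples = []
for _ in range(10_000):
    # stand-ins for elicited uncertainty distributions on sigma_y, sigma_z
    sigma_y = random.lognormvariate(math.log(200.0), 0.4)
    sigma_z = random.lognormvariate(math.log(80.0), 0.4)
    samples.append(gaussian_plume(Q, u, sigma_y, sigma_z, y=0.0, z=0.0, H=H))

samples.sort()
median, p95 = samples[len(samples) // 2], samples[int(0.95 * len(samples))]
```

    Comparing the sampled concentration quantiles against the elicited distributions is the kind of consistency check the project performed for wet deposition.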

  11. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    Microsoft Academic Search

    J. S. Evans; D. W. Moeller; D. W. Cooper

    1985-01-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling.

  12. Transient analysis of LMFBR oxide fuel elements during accidents. 2. Technical report. Final report, 1 July 1973-30 June 1974

    Microsoft Academic Search

    Kastenberg

    1974-01-01

    A major portion of the research and development program for Liquid Metal Fast Breeder Reactors (LMFBR's) is devoted to the behavior of oxide fuel elements. Of particular interest to safety analysis and licensing is the study of transient behavior of these fuel elements under accident conditions. The work reported here centers on (1) understanding the possible location, time, and mode

  13. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    NASA Technical Reports Server (NTRS)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  14. Shipping container response to severe highway and railway accident conditions: Appendices

    SciTech Connect

    Fischer, L.E.; Chou, C.K.; Gerhard, M.A.; Kimura, C.Y.; Martin, R.W.; Mensing, R.W.; Mount, M.E.; Witte, M.C.

    1987-02-01

    Volume 2 contains the following appendices: Severe accident data; truck accident data; railroad accident data; highway survey data and bridge column properties; structural analysis; thermal analysis; probability estimation techniques; and benchmarking for computer codes used in impact analysis. (LN)

  15. Participatory analysis of accidents and incidents as a tool for increasing safety behaviour in fishermen. A pilot intervention study

    Microsoft Academic Search

    Mats Eklöf; Marianne Törner

    2005-01-01

    Although occupational accidents are common in fishery, safety work is often not given priority by the fishermen. The aims of this study were to test a group-based intervention for increased activity in safety work through group discussion of accident/incident experience; to study occurred incidents/accidents and how such events were managed; and to study intervention effects on activity in safety work,

  16. Nuclear accident

    Microsoft Academic Search

    T. Mathews; S. Agrest; G. Borger; M. Lord; W. D. Marbach; W. J. Cook; M. Sheils

    1979-01-01

    A malfunctioning valve at the Three Mile Island power plant in Pennsylvania was the prelude to the worst nuclear accident in U.S. history. Despite assurances that radiation leaked from the plant posed no immediate threat, the population around the plant dwindled as unforced weekend evacuations grew common. Radiation at the power plant site reached 30 mrem/hr on March 30. While

  17. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    SciTech Connect

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  18. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation

    SciTech Connect

    Evans, J.S. (Harvard Univ., Boston, MA (USA). School of Public Health)

    1990-01-01

    This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects, the hematopoietic, pulmonary, and gastrointestinal syndromes, are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects, including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults: leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other." The category "other" cancers is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus, and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III, with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
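    The two functional forms named in the abstract, Weibull for early effects and linear or linear-quadratic for cancers, are easy to write down. The D50, shape, and risk coefficients below are placeholders chosen only to show the shapes, not the report's fitted values:

```python
import math

def weibull_risk(dose, d50, shape):
    """Weibull dose-response: P(effect) = 1 - exp(-ln2 * (dose/D50)^shape).
    By construction the risk is exactly 0.5 at dose = D50."""
    if dose <= 0.0:
        return 0.0
    return 1.0 - math.exp(-math.log(2.0) * (dose / d50) ** shape)

def linear_quadratic_excess_risk(dose, alpha, beta):
    """Excess cancer risk = alpha*D + beta*D^2 (low-LET exposure)."""
    return alpha * dose + beta * dose ** 2

# Placeholder parameters (illustrative, not from the report):
d50, shape = 3.8, 5.0            # hypothetical early-lethality curve, Gy
for dose in (1.0, 3.8, 6.0):
    print(dose, round(weibull_risk(dose, d50, shape), 3))
```

    The steep sigmoid of the Weibull form is what lets it represent threshold-like early effects, while the linear-quadratic form keeps cancer risk nonzero at low doses.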

  19. Calculation notes that support accident scenario and consequence development for the subsurface leak remaining subsurface accident

    SciTech Connect

    Ryan, G.W., Westinghouse Hanford

    1996-07-12

    This document supports the development and presentation of the following accident scenario in the TWRS Final Safety Analysis Report: Subsurface Leak Remaining Subsurface. The calculations needed to quantify the risk associated with this accident scenario are included within.

  20. Calculation notes that support accident scenario and consequence development for the subsurface leak remaining subsurface accident

    SciTech Connect

    Ryan, G.W., Westinghouse Hanford

    1996-09-19

    This document supports the development and presentation of the following accident scenario in the TWRS Final Safety Analysis Report: Subsurface Leak Remaining Subsurface. The calculations needed to quantify the risk associated with this accident scenario are included within.

  1. Supplemental analysis of accident sequences and source terms for waste treatment and storage operations and related facilities for the US Department of Energy waste management programmatic environmental impact statement

    SciTech Connect

    Folga, S.; Mueller, C.; Nabelssi, B.; Kohout, E.; Mishima, J.

    1996-12-01

    This report presents supplemental information for the document Analysis of Accident Sequences and Source Terms at Waste Treatment, Storage, and Disposal Facilities for Waste Generated by US Department of Energy Waste Management Operations. Additional technical support information is supplied concerning treatment of transuranic waste by incineration and the Alternative Organic Treatment option for low-level mixed waste. The latest respirable airborne release fraction values published by the US Department of Energy for use in accident analysis have been used and are included as Appendix D, where the respirable airborne release fraction is defined as the fraction of material exposed to accident stresses that could become airborne as a result of the accident. A set of dominant waste treatment processes and accident scenarios was selected for a screening-process analysis. A subset of results (release source terms) from this analysis is presented.

  2. Prevention of industrial accidents using Six Sigma approach

    Microsoft Academic Search

    Sanjit Ray; Prasun Das; Bidyut Kr. Bhattacharya

    2011-01-01

    Purpose – The purpose of this paper is to utilize the power of Six Sigma, a disciplined approach to improve quality of product, process or service quality, for accident prevention in the manufacturing industry. Design/methodology/approach – This paper presents the basic features of DMAIC methodology of Six Sigma and its application for the purpose of accident prevention; illustrates the set

  3. Development of reload safety analysis methodology and code package uncertainty analysis: amplification of statistical bases. Final report. [PWR

    SciTech Connect

    Goldstein, R.

    1982-12-01

    NP-2577 presented the development of a statistical methodology proposed for use with the Electric Power Research Institute (EPRI) Reactor Analysis Support Package (RASP). A subset of RASP, consisting of neutronics (ARMP), systems analysis (RETRAN), and thermal-hydraulics (VIPRE) codes, was considered in Pressurized Water Reactor (PWR) applications. This report supplements NP-2577 by amplifying the discussion of the statistical techniques suggested for use with RASP. In addition, further details of the classification of the uncertainty components are presented. Recommendations are made for future prototypical computations using RASP, involving an expanded effort that includes monitoring and protection system setpoint analyses.

  4. Novel methodologies in analysis of small molecule biomarkers and living cells.

    PubMed

    Chen, Yinan; Zhu, Zhenggang; Yu, Yingyan

    2014-10-01

    Enzyme-linked immuno-sorbent assay (ELISA) is widely used for biomarker detection. A good biomarker can distinguish patients from healthy or benign diseases. However, the ELISA method is not suitable for small molecule or trace substance detection. Along with the development of new technologies, an increasing level of biomaterials, especially small molecules, will be identified as novel biomarkers. Quantitative immuno-PCR, chromatography-mass spectrometry, and nucleic acid aptamer are emerging methodologies for detection of small molecule biomarkers, even in living cells. In this review, we focus on these novel technologies and their potential for small molecule biomarkers and living cell analysis. PMID:25119591

  5. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    SciTech Connect

    Evans, J.S.; Moeller, D.W.; Cooper, D.W.

    1985-07-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.

  6. Transient Accident Analysis of a Supercritical Carbon Dioxide Brayton Cycle Energy Converter Coupled to an Autonomous Lead-Cooled Fast Reactor

    SciTech Connect

    Moisseytsev, Anton; Sienicki, James J. [Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, IL 60439 (United States)

    2006-07-01

    The Supercritical Carbon Dioxide (S-CO{sub 2}) Brayton Cycle is a promising advanced alternative to the Rankine saturated steam cycle and recuperated gas Brayton cycle for the energy converters of specific reactor concepts belonging to the U.S. Department of Energy Generation IV Nuclear Energy Systems Initiative. A new plant dynamics analysis computer code has been developed for simulation of the S-CO{sub 2} Brayton cycle coupled to an autonomous, natural circulation Lead-Cooled Fast Reactor (LFR). The plant dynamics code was used to simulate the whole-plant response to accident conditions. The specific design features of the reactor concept influencing passive safety are discussed and accident scenarios are identified for analysis. Results of calculations of the whole-plant response to loss-of-heat sink, loss-of-load, and pipe break accidents are demonstrated. The passive safety performance of the reactor concept is confirmed by the results of the plant dynamics code calculations for the selected accident scenarios. (authors)

  7. Transient accident analysis of a supercritical carbon dioxide Brayton cycle energy converter coupled to an autonomous lead-cooled fast reactor.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J. J.; Nuclear Engineering Division

    2008-08-01

    The supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle is a promising advanced alternative to the Rankine steam cycle and recuperated gas Brayton cycle for the energy converters of specific reactor concepts belonging to the U.S. Department of Energy Generation IV Nuclear Energy Systems Initiative. A new plant dynamics analysis computer code has been developed for simulation of the S-CO{sub 2} Brayton cycle coupled to an autonomous, natural circulation lead-cooled fast reactor (LFR). The plant dynamics code was used to simulate the whole-plant response to accident conditions. The specific design features of the reactor concept influencing passive safety are discussed and accident scenarios are identified for analysis. Results of calculations of the whole-plant response to loss-of-heat sink, loss-of-load, and pipe break accidents are demonstrated. The passive safety performance of the reactor concept is confirmed by the results of the plant dynamics code calculations for the selected accident scenarios.

  8. Final Report, NERI Project: ''An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model''

    SciTech Connect

    Dmitriy Y. Anistratov; Marvin L. Adams; Todd S. Palmer; Kord S. Smith; Kevin Clarno; Hikaru Hiruta; Razvan Nes

    2003-08-04

    The present generation of reactor analysis methods uses few-group nodal diffusion approximations to calculate full-core eigenvalues and power distributions. The cross sections, diffusion coefficients, and discontinuity factors (collectively called ''group constants'') in the nodal diffusion equations are parameterized as functions of many variables, ranging from the obvious (temperature, boron concentration, etc.) to the more obscure (spectral index, moderator temperature history, etc.). These group constants, and their variations as functions of the many variables, are calculated by assembly-level transport codes. The current methodology has two main weaknesses that this project addressed. The first weakness is the diffusion approximation in the full-core calculation; this can be significantly inaccurate at interfaces between different assemblies. This project used the nodal diffusion framework to implement nodal quasidiffusion equations, which can capture transport effects to an arbitrary degree of accuracy. The second weakness is in the parameterization of the group constants; current models do not always perform well, especially at interfaces between unlike assemblies. The project developed a theoretical foundation for parameterization and homogenization models and used that theory to devise improved models. The new models were extended to tabulate information that the nodal quasidiffusion equations can use to capture transport effects in full-core calculations.
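For readers unfamiliar with the full-core eigenvalue problem this project builds on, the following is a minimal sketch (not the project's quasidiffusion method): a one-group, one-dimensional finite-difference diffusion calculation of k-effective by power iteration, with invented group constants. Real nodal codes work with few-group constants, discontinuity factors, and nodal expansions as described above.

```python
import numpy as np

# One-group, 1-D finite-difference diffusion eigenvalue (k-effective)
# solved by power iteration, zero-flux boundary conditions. The group
# constants below are invented for illustration.
N, h = 50, 1.0                         # spatial cells, cell width (cm)
D, sigma_a, nu_sigf = 1.2, 0.05, 0.06  # assumed one-group constants

# Loss operator A: leakage + absorption (tridiagonal)
A = np.zeros((N, N))
for i in range(N):
    A[i, i] = 2.0 * D / h**2 + sigma_a
    if i > 0:
        A[i, i - 1] = -D / h**2
    if i < N - 1:
        A[i, i + 1] = -D / h**2

# Power iteration on A*phi = (1/k) * nu_sigf * phi
phi, k = np.ones(N), 1.0
for _ in range(500):
    phi_new = np.linalg.solve(A, nu_sigf * phi) / k
    k = k * phi_new.sum() / phi.sum()
    phi = phi_new
```

For an infinite medium these constants would give k = nu_sigf/sigma_a = 1.2; leakage through the zero-flux boundaries pulls the computed k-effective below that.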

  9. [Snowboarding accidents].

    PubMed

    Müller, R; Brügger, O; Mathys, R; Stüssi, E

    2000-12-01

    The present review summarises the literature of the last ten years with respect to snowboarding accidents. Sport accident statistics show high and increasing numbers of snowboarding injuries; snowboarding already ranks third among all sport accidents in Switzerland. According to the literature, the injury risk in snowboarding is twice that of skiing, although the injuries are less serious. About 50% of injured snowboarders are beginners, who have a higher risk of injury than more advanced snowboarders; a relatively large proportion of them are injured on the first day of their snowboarding career. The pattern of injury has changed over recent years: injuries to the lower extremities, formerly more than 50% of the total, now account for about 25%. Nowadays the wrist, knee, ankle, and shoulder are the most frequently injured body parts. In the authors' opinion, up-to-date protectors, release bindings, and training in falling techniques could reduce the number and seriousness of injuries, in particular injuries to the wrist and ankle. PMID:11199401

  10. Nonlinear Structural Analysis Methodology and Dynamics Scaling of Inflatable Parabolic Reflector Antenna Concepts

    NASA Technical Reports Server (NTRS)

    Sreekantamurthy, Tham; Gaspar, James L.; Mann, Troy; Behun, Vaughn; Pearson, James C., Jr.; Scarborough, Stephen

    2007-01-01

    Ultra-lightweight, ultra-thin membrane inflatable antenna concepts are fast evolving to become the state-of-the-art antenna concepts for deep-space applications. NASA Langley Research Center has been involved in structural dynamics research on antenna structures. One goal of the research is to develop structural analysis methodology for predicting the static and dynamic response characteristics of inflatable antenna concepts. This research focuses on computational studies using nonlinear large-deformation finite element analysis to characterize the ultra-thin membrane responses of the antennas. Recently, structural analyses have been performed on a few parabolic reflector antennas of varying size and shape, referred to in this paper as the 0.3 meter subscale, 2 meter half-scale, and 4 meter full-scale antennas. The aspects studied included nonlinear analysis methodology and solution techniques, ways to speed convergence in iterative methods, the sensitivities of responses to structural loads such as inflation pressure, gravity, and pretension loads in ground and in-space conditions, and the ultra-thin membrane wrinkling characteristics. These intrinsic aspects provided valuable insight into the evaluation of the structural characteristics of such antennas. While analyzing these structural characteristics, a quick study was also made to assess the applicability of dynamics scaling of the half-scale antenna. This paper presents the details of the nonlinear structural analysis results and discusses the insight gained from the studies of the various intrinsic aspects of the analysis methodology. The predicted reflector surface characteristics of the three inflatable ultra-thin membrane parabolic reflector antenna concepts are presented as easily observable displacement fringe patterns with associated maximum values, and as normal mode shapes with associated frequencies. Wrinkling patterns are presented to show how surface wrinkles progress with increasing tension loads. Antenna reflector surface accuracies were found to depend strongly on the type and size of the antenna, the reflector surface curvature, the reflector membrane supports in terms of catenary spacing, and the amount of applied load.

  11. Methodological Framework for Analysis of Buildings-Related Programs with BEAMS, 2008

    SciTech Connect

    Elliott, Douglas B.; Dirks, James A.; Hostick, Donna J.

    2008-09-30

    The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official “benefits estimates” for each of its major programs using its Planning, Analysis, and Evaluation (PAE) Team. PAE conducts an annual integrated modeling and analysis effort to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. These estimates are part of EERE’s budget request and are also used in the formulation of EERE’s performance measures. Two of EERE’s major programs are the Building Technologies Program (BT) and the Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports PAE by developing the program characterizations and other market information necessary to provide input to the EERE integrated modeling analysis as part of PAE’s Portfolio Decision Support (PDS) effort. PNNL also supports BT by providing line-item estimates for the Program’s internal use. PNNL uses three modeling approaches to perform these analyses. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits using one of those methods: the Building Energy Analysis and Modeling System (BEAMS). BEAMS is a PC-based accounting model, built in Visual Basic by PNNL specifically for estimating the benefits of buildings-related projects. It allows various types of projects to be characterized, including whole-building, envelope, lighting, and equipment projects. This document contains an overview section that describes the estimation process and the models used to estimate energy savings. The body of the document describes the algorithms used within the BEAMS software. This document serves both as stand-alone documentation for BEAMS and as a supplemental update of a previous document, Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort (Elliott et al. 2004b). The areas most changed since the publication of that previous document are those discussing the calculation of lighting and HVAC interactive effects (for both lighting and envelope/whole-building projects). This report does not attempt to convey inputs to BEAMS or the methodology of their derivation.
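BEAMS itself is an accounting model whose algorithms are documented in the report. As a hedged illustration of the accounting style of benefits estimation (not BEAMS's actual algorithms), the sketch below computes annual energy savings for a hypothetical lighting project, including a simple multiplier standing in for the lighting-HVAC interactive effects the report discusses. All numbers are invented.

```python
# Accounting-style benefits estimate for a hypothetical lighting project.
# Every input below is an invented illustration, not an EERE/BEAMS value.
baseline_kwh_per_unit = 1200.0   # assumed baseline lighting electricity per building
savings_fraction = 0.30          # assumed fractional savings from the measure
units_adopting = 50_000          # assumed participating buildings in the forecast year
hvac_interaction = 1.1           # assumed net lighting-HVAC interactive-effects multiplier

lighting_savings_kwh = baseline_kwh_per_unit * savings_fraction * units_adopting
total_savings_kwh = lighting_savings_kwh * hvac_interaction
print(f"annual savings: {total_savings_kwh / 1e6:.1f} GWh")  # prints: annual savings: 19.8 GWh
```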

  12. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    PubMed Central

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

  13. Systems Approaches to Animal Disease Surveillance and Resource Allocation: Methodological Frameworks for Behavioral Analysis

    PubMed Central

    Rich, Karl M.; Denwood, Matthew J.; Stott, Alistair W.; Mellor, Dominic J.; Reid, Stuart W. J.; Gunn, George J.

    2013-01-01

    While demands for animal disease surveillance systems are growing, there has been little applied research that has examined the interactions between resource allocation, cost-effectiveness, and behavioral considerations of actors throughout the livestock supply chain in a surveillance system context. These interactions are important as feedbacks between surveillance decisions and disease evolution may be modulated by their contextual drivers, influencing the cost-effectiveness of a given surveillance system. This paper identifies a number of key behavioral aspects involved in animal health surveillance systems and reviews some novel methodologies for their analysis. A generic framework for analysis is discussed, with exemplar results provided to demonstrate the utility of such an approach in guiding better disease control and surveillance decisions. PMID:24348922

  14. [Evidence-based practices published in Brazil: identification and analysis of their types and methodological approaches].

    PubMed

    Lacerda, Rúbia Aparecida; Nunes, Bruna Kosar; Batista, Arlete de Oliveira; Egry, Emiko Yoshikawa; Graziano, Kazuko Uchikawa; Angelo, Margareth; Merighi, Miriam Aparecida Barbosa; Lopes, Nadir Aparecida; Fonseca, Rosa Maria Godoy Serpa da; Castilho, Valéria

    2011-06-01

    This is an integrative review of Brazilian studies on evidence-based practices (EBP) in health, published in ISI/JCR journals in the last 10 years. The aim was to identify the specialty areas that most often conducted these studies, their foci and their methodological approaches. Based on the inclusion criteria, 144 studies were selected. The results indicate that most EBP studies addressed childhood and adolescence, infectious diseases, psychiatry/mental health and surgery. The predominant foci were prevention, treatment/rehabilitation, diagnosis and assessment. The most used methods were systematic review with or without meta-analysis, protocol review or synthesis of available evidence, and integrative review. A strong multiprofessional expansion of EBP is found in Brazil, contributing to the search for more selective practices through the collection, recognition and critical analysis of the produced knowledge. The study also contributes to the analysis of ways of doing research and to new research possibilities. PMID:21710089

  15. The definitive analysis of the Bendandi's methodology performed with a specific software

    NASA Astrophysics Data System (ADS)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the "Bendandi method", supposed in the past to be capable of forecasting earthquakes, which the geophysicist from Faenza never explicitly explained to posterity. The geoethics implications of Bendandi's forecasts, and of speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, arose in previous years from social alarms over predicted earthquakes that never happened but were widely spread by the media following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed with specially developed software, called the "Bendandiano Dashboard", that can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify definitively the basis of Bendandi's calculations, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  16. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans.

    PubMed

    Bernaldo de Quirós, Yara; González-Díaz, Oscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

  17. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    NASA Astrophysics Data System (ADS)

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; di Guardo, Giovanni; Fernández, Antonio

    2011-12-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

  18. Methodology for cost analysis of film-based and filmless portable chest systems

    NASA Astrophysics Data System (ADS)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from the three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
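As a concrete illustration of the cost components named above (labor, equipment, materials, storage), the sketch below computes a direct cost per exam using straight-line equipment depreciation. The function and all input values are hypothetical; they are not the study's data, nor its full FTE-based methodology.

```python
# Direct cost per exam from the components the study tracks: labor,
# equipment (straight-line depreciation + maintenance), materials, storage.
# All values below are hypothetical, not the study's data.

def cost_per_exam(annual_volume, labor_min_per_exam, labor_rate_per_hr,
                  equipment_price, equipment_life_yr, annual_maintenance,
                  materials_per_exam, storage_per_exam):
    labor = labor_min_per_exam / 60.0 * labor_rate_per_hr
    equipment = (equipment_price / equipment_life_yr + annual_maintenance) / annual_volume
    return labor + equipment + materials_per_exam + storage_per_exam

# Hypothetical film-based vs. filmless portable chest service, 10,000 exams/yr
film = cost_per_exam(10_000, 12, 30.0, 80_000, 7, 4_000, 3.50, 0.75)
filmless = cost_per_exam(10_000, 8, 30.0, 250_000, 7, 20_000, 0.25, 0.40)
```

The structure makes the trade-off visible: filmless solutions shift cost from per-exam materials and labor into annualized equipment and archiving, so the comparison hinges on exam volume.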

  19. Assessment of the nursing skill mix in Mozambique using a task analysis methodology

    PubMed Central

    2014-01-01

    Background The density of the nursing and maternal child health nursing workforce in Mozambique (0.32/1000) is well below the WHO minimum standard of 1 nurse per 1000. Two levels of education were being offered for both nurses and maternal child health nurses, in programmes ranging from 18 to 30 months in length. The health care workforce in Mozambique also includes Medical Technicians and Medical Agents, who are likewise educated at either basic or mid-level. The Ministry of Health determined the need to document the tasks that each of the six cadres was performing within various health facilities, to identify gaps and duplications and thereby identify strategies for streamlining workforce production while retaining the highest educational and competency standards. The methodology of task analysis (TA) was used to achieve this objective. This article provides information about the TA methodology and selected outcomes of the very broad study. Methods A cross-sectional descriptive task analysis survey was conducted over a 15-month period (2008–2009). A stratified sample of 1295 individuals was recruited from every type of health facility in all of Mozambique’s 10 provinces and in Maputo City. Respondents indicated how frequently they performed any of 233 patient care tasks. Data analysis focused on identifying areas where identical tasks were performed by the various cadres. Analyses addressed frequency of performance, grouped by level of educational preparation, within various types of health facilities. Results Task sharing ranged from 74% to 88% between basic and general nurse cadres and from 54% to 88% between maternal and child health nurse cadres, within various health facility types. Conversely, there was a clear distinction between the scopes of practice of the nursing and maternal/child health nursing cadres. Conclusion The educational pathways to general nursing and maternal/child health nursing careers were consolidated into one 24-month programme for each career. The scopes of practice were affirmed based on the task analysis survey data. PMID:24460789
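The task-sharing percentages reported above can be understood as set overlap between the tasks two cadres perform. A minimal sketch with invented task sets follows (the study itself used 233 patient care tasks and frequency-of-performance data):

```python
# Task-sharing as set overlap. The cadre names mirror the abstract, but the
# task sets are invented for illustration.

def task_sharing(cadre_a_tasks, cadre_b_tasks):
    """Percent of cadre A's performed tasks also performed by cadre B."""
    shared = cadre_a_tasks & cadre_b_tasks
    return 100.0 * len(shared) / len(cadre_a_tasks)

basic_nurse = {"vitals", "wound_care", "immunization", "triage"}
general_nurse = {"vitals", "wound_care", "immunization", "triage", "iv_therapy"}
```

Note that the measure is direction-dependent: every task of the hypothetical basic nurse is shared with the general nurse, but not vice versa.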

  20. Assessment of intestinal absorption: A methodology based on stable isotope administration and proton activation analysis

    SciTech Connect

    Cantone, M.C. [Univ. of Milano (Italy)]

    1994-12-31

    The interest in biokinetic studies is driven by problems related to the physiopathology of oligoelements, chemical elemental pollution, and radioactive releases in the case of nuclear accidents. The application of stable isotopes as tracers in studies of trace elements in nutritional and food science is particularly attractive, especially for investigations on the most radiosensitive age groups of the population and for repeated studies on healthy people to assess the bioavailability of different compounds. A tracer method based on stable isotope administration, which combines the simultaneous use of two tracers with proton activation analysis, is presented. A study aimed at obtaining molybdenum biokinetic data in humans was performed. One tracer ({sup 96}Mo) was orally administered and another ({sup 95}Mo) was intravenously injected into two fasting volunteer subjects. Venous blood samples were withdrawn at different postinjection times. The plasma concentration of both isotopes was determined by measuring the intensities of the gamma lines from the technetium radioisotopes produced via (p,n) reactions. Under the adopted experimental conditions, a minimum detectable concentration of 2 ng isotope/ml plasma was attained. The parameters describing molybdenum kinetics were obtained for the two individuals. Moreover, the investigation was repeated with different tracer amounts for one of the two subjects, in both fasting and non-fasting conditions.
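With dual-tracer data of this kind, fractional intestinal absorption can be estimated by comparing the dose-normalized plasma exposure (area under the concentration-time curve) of the oral tracer with that of the intravenous tracer. The sketch below uses invented concentrations and doses; the study's actual kinetic modeling was more detailed.

```python
import numpy as np

# Fractional absorption from dual-tracer plasma data: dose-normalized oral
# AUC divided by dose-normalized intravenous AUC. All data are invented.

def auc_trapezoid(t, c):
    """Area under the concentration-time curve by the trapezoid rule."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    return float(np.sum((c[1:] + c[:-1]) / 2.0 * np.diff(t)))

t = [0.5, 1, 2, 4, 8, 24]          # hours post-administration (assumed)
c_iv = [40, 30, 20, 12, 6, 2]      # ng/ml plasma, iv tracer 95Mo (assumed)
c_oral = [5, 9, 12, 10, 6, 2]      # ng/ml plasma, oral tracer 96Mo (assumed)
dose_iv, dose_oral = 50.0, 250.0   # administered amounts, micrograms (assumed)

F = (auc_trapezoid(t, c_oral) / dose_oral) / (auc_trapezoid(t, c_iv) / dose_iv)
```

Because the two stable isotopes are measured in the same samples, the intravenous tracer serves as the 100%-available reference against which the oral tracer's absorption is scaled.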

  1. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    NASA Technical Reports Server (NTRS)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.

  2. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Building on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and are widely used for new reactor designs and for power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized as needing improvement: (1) subjective judgment in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence, and use of the same numerical grids for scaled experiments and for real plant applications; and (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, these issues remain. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the differential of the physical solution with respect to any constant parameter). When parameter sensitivities are available in a new advanced system analysis code, CSAU can be significantly improved: (1) quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgment and improve efficiency: treating numerical errors as special sensitivities alongside other physical uncertainties, and considering only parameters whose uncertainties strongly affect the design criteria; (3) greatly reducing the computational cost of uncertainty quantification by (a) choosing optimized time steps and spatial sizes and (b) using gradient information (sensitivity results) to reduce the number of samples; and (4) allowing grid independence for scaled integral effect test (IET) simulations and real plant applications, which can (a) eliminate numerical uncertainty in scaling, (b) reduce experimental cost by allowing smaller scaled IETs, and (c) eliminate user effects. This paper reviews the issues with the current CSAU, introduces FSA, discusses a potential Q-PIRT process, and shows simple examples of performing FSA. Finally, the general research directions and requirements for using FSA in a system analysis code are discussed.
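The core idea of FSA can be shown on a scalar ODE. For dy/dt = f(y,p) = -p*y, the sensitivity s = dy/dp obeys the forward sensitivity equation ds/dt = (df/dy)*s + df/dp = -p*s - y, integrated alongside the state. The sketch below uses an explicit Euler scheme and invented numbers; a real system analysis code would solve the analogous equations implicitly for PDEs.

```python
import math

# Forward sensitivity analysis on dy/dt = -p*y, y(0) = y0.
# The sensitivity s = dy/dp satisfies ds/dt = -p*s - y, s(0) = 0.
p, y0 = 0.5, 1.0
dt, T = 1e-4, 2.0

y, s = y0, 0.0
for _ in range(int(round(T / dt))):
    dy = -p * y
    ds = -p * s - y          # forward sensitivity equation
    y += dt * dy
    s += dt * ds

# Analytic solution for comparison: y = y0*exp(-p*t), dy/dp = -y0*t*exp(-p*t)
exact_y = y0 * math.exp(-p * T)
exact_s = -y0 * T * math.exp(-p * T)
```

The gradient s comes out of the same sweep as the solution itself, which is what enables the sampling reductions and the Q-PIRT ranking described above.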

  3. A Methodology for the Analysis and Selection of Alternatives for the Disposition of Surplus Plutonium

    SciTech Connect

    NONE

    1999-08-31

    The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports [DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively]. At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling results of detailed technical, economic, schedule, environmental, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences [1]. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers.
The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity analysis. These steps are described below.
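In its simplest additive form, step (4) of the MAU approach reduces to weighting normalized single-attribute scores and ranking the alternatives. The sketch below is illustrative only: the alternative names echo the broad disposition options (immobilization, MOX fuel), but the scores and weights are invented, not the ANRCP team's values.

```python
# Additive multiattribute utility: u(a) = sum_k w_k * v_k(a).
# Weights and 0..1 value scores below are invented for illustration.
weights = {"cost": 0.3, "schedule": 0.2, "nonproliferation": 0.5}
alternatives = {
    "immobilization": {"cost": 0.7, "schedule": 0.6, "nonproliferation": 0.8},
    "MOX_fuel":       {"cost": 0.5, "schedule": 0.4, "nonproliferation": 0.9},
}

def utility(scores):
    return sum(weights[k] * scores[k] for k in weights)

ranked = sorted(alternatives, key=lambda a: utility(alternatives[a]), reverse=True)
```

Sensitivity analysis in this framework amounts to varying the weights and value functions and observing whether the ranking changes.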

  4. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of ''soft'' information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset ''attractiveness'' to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.
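LAVA's real framework is a hierarchy of fuzzy event trees. As a drastically simplified stand-in, the sketch below scores risk per threat as the product of threat likelihood, safeguard vulnerability, and consequence severity, with invented numbers, just to make the three embedded analyses (threat, vulnerability, consequence) concrete.

```python
# Toy risk aggregation in the spirit of (but far simpler than) LAVA:
# risk = threat likelihood x safeguard vulnerability x consequence severity.
# Threat categories and all scores are invented for illustration.
threats = {"insider": 0.6, "outsider": 0.3}         # relative likelihood
vulnerability = {"insider": 0.4, "outsider": 0.2}   # safeguard weakness, 0..1
consequence = {"insider": 0.9, "outsider": 0.7}     # normalized severity, 0..1

risk = {t: threats[t] * vulnerability[t] * consequence[t] for t in threats}
worst = max(risk, key=risk.get)
```

LAVA replaces each of these crisp scores with fuzzy-set results propagated through event trees, which is what lets it handle the "soft" information the abstract emphasizes.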

  5. Efficient methodology for the transient and periodic steady-State analysis of the synchronous Machine using a phase coordinates model

    Microsoft Academic Search

    Osvaldo Rodríguez; Aurelio Medina

    2004-01-01

    This paper introduces an efficient time-domain methodology for the transient and periodic steady-state analysis of the synchronous machine. A state space model of the synchronous machine in phase coordinates is used to accurately represent its transient and steady-state behavior. In the proposed methodology, the transient behavior of the machine is reproduced in detail and once the fault is removed, a

  6. Domestic accidents: their cause and prevention

    PubMed Central

    Mackessack-Leitch, K.

    1978-01-01

    The study of domestic accidents, which includes accidents in and around the home and in institutions, is of increasing importance. The mortality statistics are shown in Table 1. In 1974, 18,335 people died from accidents in the UK (RoSPA, 1974), equivalent to the population of a reasonably sized town. Accidents form one of the four main causes of death in this country and have become relatively more common in recent years. Analysis of the causes of home accidents makes it possible to plan ways of preventing them. General practitioners and their colleagues in the primary health care team have the principal responsibility. PMID:553168

  7. MELPROG-PWR/MOD1: A two-dimensional, mechanistic code for analysis of reactor core melt progression and vessel attack under severe accident conditions

    SciTech Connect

    Dosanjh, S.S. (ed.)

    1989-05-01

    The US Nuclear Regulatory Commission has made the development of mechanistic models for severe accident progression a major priority. The purpose of these models is to provide detailed, best-estimate, coupled analyses of all the major phenomena involved in the reactor vessel and coolant system in the course of the accident. To meet this objective, the MELPROG computer code is being developed. This report describes the two-dimensional, pressurized water reactor (PWR) version of the MELPROG computer code, MELPROG-PWR/MOD1. Preliminary BWR work is also described in this report. MELPROG is coupled to the TRAC-PF1 RCS thermal-hydraulics code to provide an integrated analysis of the behavior of the core, vessel, and reactor coolant system during severe accidents. MELPROG treats core degradation and loss of geometry, debris formation, core melting, attack on supporting structures, slumping, melt/water interactions, and vessel failure. The key element in MELPROG is the use of detailed modeling for the entire damage progression and failure sequence. Emphasis is also placed on the rates of hydrogen, steam, and fission product formation and transport to containment during the accident.

  8. The U-tube sampling methodology and real-time analysis of geofluids

    SciTech Connect

    Freifeld, Barry; Perkins, Ernie; Underschultz, James; Boreham, Chris

    2009-03-01

    The U-tube geochemical sampling methodology, an extension of the porous cup technique proposed by Wood [1973], provides minimally contaminated aliquots of multiphase fluids from deep reservoirs and allows for accurate determination of dissolved gas composition. The initial deployment of the U-tube during the Frio Brine Pilot CO{sub 2} storage experiment, Liberty County, Texas, obtained representative samples of brine and supercritical CO{sub 2} from a depth of 1.5 km. A quadrupole mass spectrometer provided real-time analysis of dissolved gas composition. Since the initial demonstration, the U-tube has been deployed for (1) sampling of fluids down gradient of the proposed Yucca Mountain High-Level Waste Repository, Armagosa Valley, Nevada (2) acquiring fluid samples beneath permafrost in Nunuvut Territory, Canada, and (3) at a CO{sub 2} storage demonstration project within a depleted gas reservoir, Otway Basin, Victoria, Australia. The addition of in-line high-pressure pH and EC sensors allows for continuous monitoring of fluid during sample collection. Difficulties have arisen during U-tube sampling, such as blockage of sample lines from naturally occurring waxes or from freezing conditions; however, workarounds such as solvent flushing or heating have been used to address these problems. The U-tube methodology has proven to be robust, and with careful consideration of the constraints and limitations, can provide high quality geochemical samples.

  9. Genetic screening for reproductive planning: methodological and conceptual issues in policy analysis.

    PubMed Central

    Asch, D A; Hershey, J C; Pauly, M V; Patton, J P; Jedrziewski, M K; Mennuti, M T

    1996-01-01

    OBJECTIVES: This paper explores several critical assumptions and methodological issues arising in cost-effectiveness analyses of genetic screening strategies in the reproductive setting. METHODS: Seven issues that arose in the development of a decision analysis of alternative strategies for cystic fibrosis carrier screening are discussed. Each of these issues required a choice in technique. RESULTS: The presentations of these analyses frequently mask underlying assumptions and methodological choices. Often there is no best choice. In the case of genetic screening in the reproductive setting, these underlying issues often touch on deeply felt human values. CONCLUSIONS: Space limitations for published papers often preclude explaining such choices in detail; yet these decisions determine the way the results should be interpreted. Those who develop these analyses need to make sure that the implications of important assumptions are understood by the clinicians who will use them. At the same time, clinicians need to enhance their understanding of what these models truly mean and how they address underlying clinical, ethical, and economic issues. PMID:8629720

  10. TRAC-BF1/MOD1: An advanced best-estimate computer program for BWR accident analysis: User's guide. Volume 2

    SciTech Connect

    Rettig, W.H.; Wade, N.L. [eds.] [EG and G Idaho, Inc., Idaho Falls, ID (United States)]

    1992-06-01

    The TRAC-BWR code development program at the Idaho National Engineering Laboratory has developed versions of the Transient Reactor Analysis Code (TRAC) for the US Nuclear Regulatory Commission and the public. The TRAC-BF1/MOD1 version of the computer code provides a best-estimate analysis capability for analyzing the full range of postulated accidents in boiling water reactor (BWR) systems and related facilities. This version provides a consistent and unified analysis capability for analyzing all areas of a large- or small-break loss-of-coolant accident (LOCA), beginning with the blowdown phase and continuing through heatup, reflood with quenching, and, finally, the refill phase of the accident. Also provided is a basic capability for the analysis of operational transients up to and including anticipated transients without scram (ATWS). The TRAC-BF1/MOD1 version produces results consistent with previous versions. Assessment calculations using the two TRAC-BF1 versions show overall improvements in agreement with data and in computation times as compared to earlier versions of the TRAC-BWR series of computer codes.

  11. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    PubMed Central

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, combining C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
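
    The filtering step this abstract describes can be sketched as below. The data are synthetic, the fixed threshold stands in for the ROC-derived one, and the "all tests agree" rule is an assumed simplification of the paper's pipeline.

```python
import numpy as np
from scipy import stats

# Hedged sketch of a dose-response filter: split doses by outcome, pick a
# threshold, build a 2x2 contingency table of (above/below threshold) vs
# (toxicity yes/no), then apply Fisher's exact test, a Welch t-test, and a
# Kolmogorov-Smirnov test. All numbers here are synthetic illustrations.
rng = np.random.default_rng(0)
dose_no_event = rng.normal(20, 5, 200)  # doses for patients without toxicity
dose_event = rng.normal(30, 5, 60)      # doses for patients with toxicity

threshold = 25.0  # in practice chosen from an ROC curve; fixed here

table = [
    [np.sum(dose_event > threshold), np.sum(dose_event <= threshold)],
    [np.sum(dose_no_event > threshold), np.sum(dose_no_event <= threshold)],
]
_, p_fisher = stats.fisher_exact(table)
_, p_welch = stats.ttest_ind(dose_event, dose_no_event, equal_var=False)
_, p_ks = stats.ks_2samp(dose_event, dose_no_event)

# Flag a variable as showing dose-response only if all tests agree.
dose_response = all(p < 0.05 for p in (p_fisher, p_welch, p_ks))
print(dose_response)
```

    In the actual tool this filter would run over many candidate dose-volume variables, with the surviving ones checked further (e.g. via Kullback-Leibler divergence).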

  12. Analysis of Sodium Fire in the Containment Building of Prototype Fast Breeder Reactor Under the Scenario of Core Disruptive Accident

    SciTech Connect

    Rao, P.M.; Kasinathan, N. [Indira Gandhi Centre for Atomic Research, Kalpakkam 603 102 (India)]; Kannan, S.E. [Atomic Energy Regulatory Board, Niyamak Bhavan, Anushaktinagar, Mumbai 400 094 (India)]

    2006-07-01

    The potential for sodium release from the reactor assembly to the reactor containment building during a Core Disruptive Accident (CDA) in Fast Breeder Reactors (FBR) is an important safety issue with reference to the structural integrity of the Reactor Containment Building (RCB). For the Prototype Fast Breeder Reactor (PFBR), the estimated sodium release under a CDA of 100 MJ energy release is 350 kg. The ejected sodium reacts readily with air in the RCB and causes a temperature and pressure rise in the RCB. For estimating the severe thermal consequences in the RCB, different modes of sodium fires, pool and spray fires, were analyzed using the SOFIRE-II and NACOM sodium fire computer codes. The effects of important parameters such as the amount of sodium, area of the pool, containment air volume, and oxygen concentration have been investigated. A peak pressure rise of 7.32 kPa is predicted by the SOFIRE-II code for a 350 kg sodium pool fire in the 86,000 m{sup 3} RCB volume. For sodium released as a spray followed by a pool fire of the unburnt sodium, the estimated pressure rise in the RCB is 5.85 kPa. For instantaneous combustion of the sodium, the estimated peak pressure rise is 13 kPa. (authors)
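
    The instantaneous-combustion bound can be cross-checked with a back-of-envelope constant-volume energy balance. This is my own sketch, not the SOFIRE-II or NACOM models; the heat of combustion and the ratio of specific heats are assumed round values.

```python
# Back-of-envelope bound: dump the full heat of combustion of the released
# sodium into the containment air and convert it to a constant-volume
# pressure rise, dP = (gamma - 1) * Q / V. Ignores heat losses to
# structures and aerosols, so it should overestimate the code results.
Q_COMB = 10.9e6      # J/kg, approximate heat of combustion of sodium (assumed)
GAMMA = 1.4          # ratio of specific heats for air
m_sodium = 350.0     # kg released (from the abstract)
v_rcb = 86_000.0     # m^3 containment free volume (from the abstract)

dp_pa = (GAMMA - 1.0) * m_sodium * Q_COMB / v_rcb
print(round(dp_pa / 1000.0, 1), "kPa")
```

    The result lands in the high teens of kPa, above the 13 kPa the abstract reports for instantaneous combustion, which is plausible given that this bound neglects all heat sinks.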

  13. TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions

    SciTech Connect

    Lombardo, N.J.; Marseille, T.J.; White, M.D.; Lowery, P.S.

    1990-06-01

    TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300{degree}F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam by the oxidation reactions and the effect of hydrogen on the coolant properties are modeled for independent coolant flow channels. Fission product release from exposed uranium metal in Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ("bake-out") release mechanisms. Release of the volatile species iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.
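
    The parabolic-with-Arrhenius form mentioned above can be written out as a short sketch. The pre-exponential factor and activation energy below are placeholders, not the TRUMP-BD correlations.

```python
import math

# Parabolic oxidation kinetics with Arrhenius temperature dependence:
# w(t)^2 = K(T) * t, where K(T) = A * exp(-Q / (R*T)).
# A and Q are illustrative placeholders; units are nominal.

def parabolic_rate_constant(T_kelvin, A=3.3e2, Q=1.9e5):
    """Arrhenius rate constant K(T) = A * exp(-Q / (R*T))."""
    R = 8.314  # J/(mol K)
    return A * math.exp(-Q / (R * T_kelvin))

def oxide_gain(T_kelvin, t_seconds):
    """Oxide mass gain per area after isothermal exposure of length t."""
    return math.sqrt(parabolic_rate_constant(T_kelvin) * t_seconds)

g_low, g_high = oxide_gain(1500.0, 60.0), oxide_gain(1800.0, 60.0)
print(g_high > g_low)  # hotter cladding oxidizes faster
```

    The parabolic form captures oxide-layer growth slowing as the layer thickens; the Arrhenius factor captures the strong temperature sensitivity that drives runaway oxidation in severe accidents.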

  14. Probabilistic Climate Forecasting: Methodological issues arising from analysis in climateprediction.net

    NASA Astrophysics Data System (ADS)

    Rowlands, D. J.; Frame, D. J.; Meinshausen, N.; Aina, T.; Jewson, S.; Allen, M. R.

    2009-12-01

    One of the chief goals of climate research is to produce meaningful probabilistic forecasts that can be used in the formation of future policy and adaptation strategies. The current range of methodologies presented in the scientific literature shows that this is not an easy task, especially given the various philosophical interpretations of how to combine the information contained in Perturbed-Physics and Multi-Model Ensembles (PPE and MME). The focus of this research is to present some of the methodological issues that have arisen in the statistical analysis of the latest climateprediction.net experiment, a large PPE of transient simulations using HadCM3L. First, we consider model evaluation and propose a method for calculating the likelihood of each ensemble member based on a transient constraint involving regional temperature changes. We argue that this approach is more meaningful for future climate change projections than climatology-based constraints. A further question we consider is which observations to include in our likelihood function: should we care how well a model simulates the climate of Europe if we are producing a forecast for South Africa? The second issue deals with how to combine multiple models from such an ensemble into a probabilistic forecast. Much has been said about the Bayesian methodology, given the sensitivity of forecasts to prior assumptions. For simple models of the climate, with inputs such as climate sensitivity, there may be strong prior information, but for complex climate models whose parameters correspond to non-observable quantities this is not so straightforward, and so we may have no reason to believe that a parameter has a uniform distribution or an inverse uniform distribution. We therefore propose two competing methodologies for dealing with this problem, namely likelihood profiling and the Jeffreys prior, an approach typically known as objective Bayesian statistics, where "objective" simply implies that the prior is generated by a rule rather than from expert opinion. We present novel results using a simple climate model as an illustrative example, with a view to applying these techniques to the full climateprediction.net ensemble.
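
    The likelihood-weighting step can be sketched in a few lines. This is an assumed Gaussian-likelihood form with invented numbers, not the climateprediction.net analysis code.

```python
import numpy as np

# Minimal sketch: weight perturbed-physics ensemble members by a Gaussian
# likelihood of an observed transient constraint (a regional temperature
# trend), then form a likelihood-weighted forecast. All values synthetic.
rng = np.random.default_rng(1)
simulated_trend = rng.normal(0.2, 0.1, 50)   # each member's simulated trend
forecast_2050 = 2.0 + 5.0 * simulated_trend  # hypothetical linked projection

obs_trend, obs_sigma = 0.19, 0.03            # illustrative observation

# Gaussian log-likelihood of each member, normalized into weights.
log_like = -0.5 * ((simulated_trend - obs_trend) / obs_sigma) ** 2
weights = np.exp(log_like - log_like.max())
weights /= weights.sum()

weighted_mean = float(np.sum(weights * forecast_2050))
print(round(weighted_mean, 2))
```

    Members whose simulated trend is far from the observation get negligible weight, which is exactly why the choice of which observations enter the likelihood matters so much for the resulting forecast.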

  15. Experiments and analysis, by the method of characteristics, on loss of coolant accidents

    Microsoft Academic Search

    S. Banerjee; R. B. Jeffries; H. Goulding; T. Jaganathan

    1973-01-01

    An outline of a technique for analysis of LOCAs by the method of characteristics and some comparisons with experiments are presented. Prediction of coolant density in the heated section, pressures, temperatures, and flow rates as functions of time following a pipe rupture are discussed. (GE) The Spanish Atomic Forum dedicated the 1972 lectures to the problem

  16. Analysis Methodology for Large Organizations' Investments in Energy Retrofit of Buildings

    E-print Network

    Heo, Y.; Augenbroe, G.

    2011-01-01

    This paper presents a formal methodology that supports large organizations' investments in energy retrofit of buildings. The methodology is a scalable modeling approach based on normative models and Bayesian calibration. Normative models are a light...

  17. New methodology developed for the differential scanning calorimetry analysis of polymeric matrixes incorporating phase change materials

    NASA Astrophysics Data System (ADS)

    Barreneche, Camila; Solé, Aran; Miró, Laia; Martorell, Ingrid; Inés Fernández, A.; Cabeza, Luisa F.

    2012-08-01

    Nowadays, thermal comfort needs in buildings have led to an increase in the energy consumption of the residential and service sectors. For this reason, thermal energy storage is presented as an alternative to achieve a reduction of this high consumption. Phase change materials (PCM) have been studied to store energy due to their high storage capacity. A polymeric material capable of macroencapsulating PCM was developed by the authors of this paper. However, difficulties were found while measuring the thermal properties of these materials by differential scanning calorimetry (DSC): the polymeric matrix interferes with the detection of the PCM properties. To remove this interfering effect, a new methodology was developed in which the conventional empty crucible used as a reference in the DSC analysis is replaced by crucibles composed of the polymeric matrix. A clear signal from the PCM is thus obtained by subtracting the new full-crucible signal from the sample signal.
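
    The reference-substitution idea reduces to a signal subtraction, sketched below with synthetic traces (the real method performs the subtraction instrumentally, via the reference crucible).

```python
import numpy as np

# Sketch: subtract a matrix-only reference trace from the composite
# sample trace to isolate the PCM melting peak. All signals synthetic.
temp = np.linspace(10.0, 40.0, 301)                   # degrees C
matrix_signal = 0.02 * temp                           # sloped matrix baseline
pcm_peak = 1.5 * np.exp(-((temp - 26.0) / 1.5) ** 2)  # PCM melting endotherm
sample_signal = matrix_signal + pcm_peak              # composite DSC trace

corrected = sample_signal - matrix_signal             # matrix-filled reference
peak_temp = float(temp[np.argmax(corrected)])
print(peak_temp)
```

    With an empty crucible as reference, the matrix contribution would remain in the trace and mask the peak position and enthalpy; subtracting a matrix-matched reference leaves only the PCM signal.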

  18. How does corruption influence perceptions of the risk of nuclear accidents?: cross-country analysis after the 2011 Fukushima disaster in Japan

    Microsoft Academic Search

    Eiji Yamamura

    2011-01-01

    Japan’s 2011 natural disasters were accompanied by a devastating nuclear disaster in Fukushima. This paper used cross-country data obtained immediately after the Japanese disaster to explore how, and the extent to which, corruption affects the perception of citizens regarding the risk of nuclear accidents. Endogeneity bias was controlled for using instrumental variables. The cross-country analysis showed that citizens in less

  19. Radiation accidents.

    PubMed

    Saenger, E L

    1986-09-01

    It is essential that emergency physicians understand ways to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome. Prophylactic antibiotics are desirable. For severely exposed patients treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries. The contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome. More than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity. PMID:3526994

  20. Radiation accidents

    SciTech Connect

    Saenger, E.L.

    1986-09-01

    It is essential that emergency physicians understand ways to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome. Prophylactic antibiotics are desirable. For severely exposed patients treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries. The contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome. More than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity.

  1. Analysis of fission product revaporization in a BWR reactor cooling system during a station blackout accident

    SciTech Connect

    Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

    1988-01-01

    This report presents a preliminary analysis of fission product revaporization in the Reactor Cooling System (RCS) after vessel failure. The station blackout transient for a BWR Mark I power plant is considered. The TRAPMELT3 models of vaporization, chemisorption, and the decay heating of RCS structures and gases are adopted in the analysis. RCS flow models based on the density difference between the RCS and the containment pedestal region are developed to estimate the RCS outflow, which carries the revaporized fission products to the containment. A computer code called REVAP is developed for the analysis. REVAP is incorporated with the MARCH, TRAPMELT3, and NAUA codes of the Source Term Code Package (STCP). The NAUA code is used to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors determining the magnitude of revaporization and the subsequent release of the volatile fission products. 8 figs., 1 tab.

  2. Reliability Analysis of Hydraulic System for Type Crane Based on Go Methodology

    Microsoft Academic Search

    Heqing Li; Qing Tan

    2009-01-01

    Traditional methodologies have some difficulty in analyzing the reliability of complicated hydraulic system. In the paper, based on GO methodology, a GO model for hydraulic system of 25T type crane is established. Then, according to the model, margin of safety of hydraulic system is obtained using quantitative calculation of GO methodology, and minimal cut sets of hydraulic system failures is

  3. System level signal and power integrity analysis methodology for system-in-package applications

    Microsoft Academic Search

    Rohan Mandrekar; Krishna Bharath; Krishna Srinivasan; Ege Engin; Madhavan Swaminathan

    2006-01-01

    This paper describes a methodology for performing system level signal and power integrity analyses of SiP-based systems. The paper briefly outlines some new modeling and simulation techniques that have been developed to enable the proposed methodology. Some results based on the application of this methodology on test systems are also presented.

  4. Assessment of ISLOCA risk: Methodology and application to a Westinghouse four-loop ice condenser plant

    SciTech Connect

    Kelly, D.L.; Auflick, J.L.; Haney, L.N. [EG and G Idaho, Inc., Idaho Falls, ID (United States)]

    1992-04-01

    Inter-system loss-of-coolant accidents (ISLOCAs) have been identified as important contributors to offsite risk for some nuclear power plants. A methodology has been developed for identifying and evaluating plant-specific hardware designs, human factors issues, and accident consequence factors relevant to the estimation of ISLOCA core damage frequency and risk. This report presents a detailed description of the application of this analysis methodology to a Westinghouse four-loop ice condenser plant. This document also includes appendices A through I which provide: System descriptions; ISLOCA event trees; human reliability analysis; thermal hydraulic analysis; core uncovery timing calculations; calculation of system rupture probability; ISLOCA consequences analysis; uncertainty analysis; and component failure analysis.

  5. A methodology for the semi-automatic digital image analysis of fragmental impactites

    NASA Astrophysics Data System (ADS)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institute of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
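
    The areal-fraction step of such an analysis can be sketched with a NumPy stand-in (the paper itself uses ImageJ). The image, threshold, and component labels below are synthetic.

```python
import numpy as np

# Sketch: threshold a grayscale scan into clast vs matrix pixels and
# report the clast areal fraction, from which modal abundances follow.
img = np.zeros((100, 100))
img[20:40, 20:60] = 1.0  # one bright "clast" of 20 x 40 pixels
img[70:90, 10:30] = 1.0  # another of 20 x 20 pixels

clast_mask = img > 0.5   # segmentation by a global threshold
areal_fraction = clast_mask.mean()
print(areal_fraction)    # (800 + 400) / 10000 pixels
```

    Real samples need the semi-automatic part: contrast equalization, manual threshold tuning, and removal of touching-grain artifacts before measures like clast size or nearest-neighbor distances are meaningful.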

  6. Analysis Methodology of Adaptive Resource Allocation Schemes for Handoff Calls: Combining Equilibrium and Transient Analysis

    E-print Network

    Bahk, Saewoong

    An important QoS issue in wireless networks is how to control handoff drops. There have been several adaptive admission control schemes that dynamically adjust the admission ... makes the analysis intractable. We assume a basic adaptive admission control model and propose a novel

  7. Uncertainty in LOCA (loss-of-coolant accident) analysis historical discussion

    SciTech Connect

    Sullivan, L.H.

    1988-01-01

    The Nuclear Regulatory Commission (NRC) Commissioners approved the proposed change to 10 CFR Part 50 Appendix K in July 1988. The change allows reactor vendors to use either the previous Appendix K requirements or a best-estimate analysis with defined uncertainties. This change to Appendix K has led the NRC to investigate, and the nuclear industry to develop, an acceptable method to evaluate the uncertainty in design-base thermal-hydraulic codes. Uncertainty methods suitable for the design-base codes have been investigated by both the NRC and the nuclear industry since the late 1970s. This early work provides a good base for the current work and will allow the nuclear industry to use the advantages that the recently approved change to Appendix K offers. 11 refs., 2 figs., 6 tabs.

  8. Factors Associated with Fatal Occupational Accidents among Mexican Workers: A National Analysis

    PubMed Central

    Gonzalez-Delgado, Mery; Gómez-Dantés, Héctor; Fernández-Niño, Julián Alfredo; Robles, Eduardo; Borja, Víctor H.; Aguilar, Miriam

    2015-01-01

    Objective To identify the factors associated with fatal occupational injuries in Mexico in 2012 among workers affiliated with the Mexican Social Security Institute. Methods Analysis of secondary data using information from the National Occupational Risk Information System, with the consequence of the occupational injury (fatal versus non-fatal) as the response variable. The analysis included 406,222 non-fatal and 1,140 fatal injuries from 2012. The factors associated with the lethality of the injury were identified using a logistic regression model with the Firth approach. Results Being male (OR=5.86; CI95%: 4.22-8.14), age (OR=1.04; CI95%: 1.03-1.06), being employed in the position for 1 to 10 years (versus less than 1 year) (OR=1.37; CI95%: 1.15-1.63), working as a facilities or machine operator or assembler (OR=3.28; CI95%: 2.12-5.07), and being a worker without qualifications (OR=1.96; CI95%: 1.18-3.24) (versus an office worker) were associated with fatality in the event of an injury. Additionally, companies classified as maximum risk (OR=1.90; CI95%: 1.38-2.62), workplace conditions (OR=7.15; CI95%: 3.63-14.10), and factors related to the work environment (OR=9.18; CI95%: 4.36-19.33) were identified as risk factors for fatality in the event of an occupational injury. Conclusions Fatality in the event of an occupational injury is associated with factors related to sociodemographics (age, sex, and occupation), the work environment, and workplace conditions. Worker protection policies should be created for groups with a higher risk of fatal occupational injuries in Mexico. PMID:25790063
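
    The basic quantity reported above, an odds ratio with a 95% CI, can be computed from a 2x2 table as sketched below. The counts are synthetic and the Wald interval is a simplification; the paper itself fits a Firth-penalized logistic regression.

```python
import math

# Sketch: odds ratio and Wald 95% CI from a 2x2 exposure-by-fatality
# table. Counts are invented for illustration.
a, b = 40, 1100   # exposed:   fatal, non-fatal
c, d = 10, 1600   # unexposed: fatal, non-fatal

or_hat = (a * d) / (b * c)
se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(or_hat) - 1.96 * se_log)
hi = math.exp(math.log(or_hat) + 1.96 * se_log)
print(round(or_hat, 2), (round(lo, 2), round(hi, 2)))
```

    Firth's penalization matters when fatal events are rare in some strata, since ordinary maximum likelihood then gives biased or infinite odds-ratio estimates.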

  9. Quantifying reactor safety margins: Application of CSAU (Code Scalability, Applicability and Uncertainty) methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    SciTech Connect

    Wulff, W.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Levy, S.; Rohatgi, U.S.; Wilson, G.E.; Zuber, N.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters have been used to determine the uncertainty ranges of code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as is described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scalability, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs.
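
    The statistical step this abstract describes, propagating ranged inputs into a distribution for peak clad temperature (PCT), can be sketched with a Monte Carlo loop. The surrogate response function and parameter ranges below are invented; CSAU uses the actual code plus measured uncertainty ranges.

```python
import numpy as np

# Sketch: sample uncertain input/modeling parameters over their ranges,
# push each sample through a (toy) PCT response, and read off a
# high-percentile PCT to compare against the licensing limit.
rng = np.random.default_rng(42)
n = 10_000
gap_conductance = rng.uniform(0.8, 1.2, n)  # multiplier on nominal (assumed)
peak_power = rng.uniform(0.95, 1.10, n)     # multiplier on nominal (assumed)

# Toy surrogate: PCT in K rises with power, falls with gap conductance.
pct = 1200.0 + 300.0 * (peak_power - 1.0) - 150.0 * (gap_conductance - 1.0)

pct95 = float(np.percentile(pct, 95))       # 95th-percentile PCT
print(pct95 < 1477.0)  # compare against the 2200 F (about 1477 K) limit
```

    In the real methodology each "sample" is a full TRAC calculation (or a response surface fitted to a handful of them), which is why ranking the dominant parameters first is essential.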

  10. A Methodology For Performing Global Uncertainty And Sensitivity Analysis In Systems Biology

    PubMed Central

    Marino, Simeone; Hogue, Ian B.; Ray, Christian J.; Kirschner, Denise E.

    2008-01-01

    Accuracy of results from mathematical and computer models of biological systems is often complicated by the presence of uncertainties in the experimental data that are used to estimate parameter values. Current mathematical modeling approaches typically use either single-parameter or local sensitivity analyses. However, these methods do not accurately assess uncertainty and sensitivity in the system because, by default, they hold all other parameters fixed at baseline values. Using the techniques described within, we demonstrate how a multi-dimensional parameter space can be studied globally so that all uncertainties can be identified. Further, uncertainty and sensitivity analysis techniques can help to identify and ultimately control uncertainties. In this work we develop methods for applying existing analytical tools to perform analyses on a variety of mathematical and computer models. We compare two specific types of global sensitivity analysis indexes that have proven to be among the most robust and efficient. Through familiar and new examples of mathematical and computer models, we provide a complete methodology for performing these analyses, both in deterministic and stochastic settings, and propose novel techniques to handle problems encountered during this type of analysis. PMID:18572196
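
    One widely used global technique in this vein, Latin hypercube sampling followed by partial rank correlation coefficients (PRCC), can be sketched as follows. The two-parameter model, ranges, and thresholds are invented for illustration.

```python
import numpy as np

# Sketch: LHS-sample two parameters globally, run a toy model, then
# compute PRCC of each parameter with the output while controlling
# for the other parameter via rank residualization.
rng = np.random.default_rng(7)
n = 1000

def lhs(n, rng):
    """One Latin-hypercube column: one stratified sample per bin."""
    return (rng.permutation(n) + rng.random(n)) / n

k_growth = 0.5 + lhs(n, rng)       # parameter 1, range [0.5, 1.5]
k_decay = 0.1 + 0.2 * lhs(n, rng)  # parameter 2, range [0.1, 0.3]
output = k_growth / k_decay        # toy model output

def prcc(x, y, z):
    """Partial rank correlation of x and y, controlling for z."""
    rx, ry, rz = (np.argsort(np.argsort(v)) for v in (x, y, z))
    # Residualize ranks against the controlled variable, then correlate.
    res = lambda a, b: a - np.polyval(np.polyfit(b, a, 1), b)
    return float(np.corrcoef(res(rx, rz), res(ry, rz))[0, 1])

p1 = prcc(k_growth, output, k_decay)  # strong positive influence expected
p2 = prcc(k_decay, output, k_growth)  # strong negative influence expected
print(p1 > 0.7, p2 < -0.7)
```

    Because the correlation is computed on ranks with the other parameter partialled out, PRCC remains informative for monotone but nonlinear models, which is exactly where local one-at-a-time sensitivity fails.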

  11. Combined Molecular Algorithms for the Generation, Equilibration and Topological Analysis of Entangled Polymers: Methodology and Performance

    PubMed Central

    Karayiannis, Nikos Ch.; Kröger, Martin

    2009-01-01

    We review the methodology, algorithmic implementation and performance characteristics of a hierarchical modeling scheme for the generation, equilibration and topological analysis of polymer systems at various levels of molecular description: from atomistic polyethylene samples to random packings of freely-jointed chains of tangent hard spheres of uniform size. Our analysis focuses on hitherto less discussed algorithmic details of the implementation of both the Monte Carlo (MC) procedure for system generation and equilibration, and a postprocessing step in which we identify the underlying topological structure of the simulated systems in the form of primitive paths. To demonstrate our arguments, we study how molecular length and packing density (volume fraction) affect the performance of the MC scheme built around chain-connectivity-altering moves. In parallel, we quantify the effects of finite system size, of polydispersity, and of the definition of the number of entanglements (and the related entanglement molecular weight) on the results for the primitive path network. Along these lines we confirm main concepts which had been previously proposed in the literature. PMID:20087477
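
    For hard-sphere systems like the tangent-sphere packings discussed above, the Metropolis acceptance test is purely geometric: a trial move is accepted only if it creates no overlap. A minimal single-bead sketch (the paper's chain-connectivity-altering moves are far more elaborate; `sigma` and `delta` here are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)

def try_bead_move(coords, i, sigma=1.0, delta=0.1, rng=rng):
    """Displace bead i by a small random vector; accept only if no
    hard-sphere overlap results. With hard potentials the Metropolis
    test needs no energy evaluation, only an overlap check."""
    trial = coords.copy()
    trial[i] += rng.uniform(-delta, delta, 3)
    others = np.delete(trial, i, axis=0)
    if np.all(np.linalg.norm(others - trial[i], axis=1) >= sigma):
        return trial, True    # accepted
    return coords, False      # rejected: keep the old configuration
```

In a real chain simulation the overlap check would be restricted to neighbors found via a cell list, and tangency constraints along the backbone would also be enforced.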

  12. The Reactor Analysis Support Package (RASP): Volume 6, BWR (boiling water reactor) set-point methodology: Final report

    SciTech Connect

    Engel, R.E.

    1987-03-01

    The experience of recent years has demonstrated an ever-increasing need for utilities operating nuclear power plants to have an in-depth understanding of the safety analyses that form the bases for plant operations. This report provides an overview of the current BWR methodology for developing inputs to the safety analysis process. The primary focus is on the development of operating envelope limits, instrument setpoints, and model inputs that satisfy safety analysis requirements and constrain plant operation. Specific emphasis is placed on the treatment of uncertainties in the development of safety analysis inputs. Discussion is also provided on the specific event analyses which combine to form the safety analysis and on the identification of the event acceptance limits which are the figures of merit for the event analyses. Examples of the application of the safety analysis and setpoint methodology are also provided.

  13. Estimate of radionuclide release characteristics into containment under severe accident conditions. Final report

    SciTech Connect

    Nourbakhsh, H.P. [Brookhaven National Lab., Upton, NY (United States)

    1993-11-01

    A detailed review of the available light water reactor source term information is presented as a technical basis for the development of updated source terms into the containment under severe accident conditions. Simplified estimates of radionuclide release and transport characteristics are specified for each unique combination of reactor coolant and containment systems. A quantitative uncertainty analysis of the release to the containment using the NUREG-1150 methodology is also presented.

  14. Processing of the GALILEO™ fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    NASA Astrophysics Data System (ADS)

    Mailhe, P.; Barbier, B.; Garnier, Ch.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, Ph.

    2014-06-01

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO™ along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo-type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the benchmarking of the GALILEO™ code against its extended experimental database and on the assessment of the GALILEO™ model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO™ model uncertainties is of the utmost importance for accurate evaluation of fuel design margins, as illustrated by some application examples. With the submittal of the Topical Report for GALILEO™ to the U.S. NRC in 2013, GALILEO™ and its methodology are on their way to industrial use in a wide range of irradiation conditions.
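
    A Monte Carlo-type random sampling of input variables, as described above, can be sketched with a toy surrogate model. The model, distributions and numbers below are invented for illustration and have nothing to do with the actual GALILEO code:

```python
import numpy as np

rng = np.random.default_rng(42)

def clad_temperature(power, conductivity, gap_width):
    """Toy surrogate for a fuel thermal response; illustrative only."""
    return 600.0 + 40.0 * power / conductivity + 900.0 * gap_width

# Sample each uncertain input from its assumed distribution.
N = 10_000
power = rng.normal(1.0, 0.03, N)          # relative power
conductivity = rng.normal(3.0, 0.15, N)   # W/m/K
gap = rng.uniform(0.05, 0.15, N)          # mm

# Propagate the samples through the model and read off a realistic bound.
temps = clad_temperature(power, conductivity, gap)
nominal = clad_temperature(1.0, 3.0, 0.10)
bound_95 = np.percentile(temps, 95)       # realistic upper estimate
```

The design margin is then evaluated against the statistical bound (for example the 95th percentile) rather than against a stack-up of worst-case input values, which is what makes the "realistic" methodology less conservative than deterministic bounding.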

  15. Analysis Methodology for Optimal Selection of Ground Station Site in Space Missions

    NASA Astrophysics Data System (ADS)

    Nieves-Chinchilla, J.; Farjas, M.; Martínez, R.

    2013-12-01

    Optimization of ground station sites is especially important in complex missions that include several small satellites (clusters or constellations), such as the QB50 project, where one ground station must be able to track several space vehicles, even simultaneously. The design of the communication system therefore has to take careful account of the ground station site and the relevant signal phenomena, which depend on the frequency band. These aspects become even more important for establishing a trusted communication link when the ground segment is sited in an urban area and/or low orbits are selected for the space segment. In addition, updated cartography with high-resolution data of the location and its surroundings helps to develop siting recommendations for space vehicle tracking and hence to improve effectiveness. The objectives of this analysis methodology are: completion of the cartographic information; modelling of the obstacles that hinder communication between the ground and space segments; and representation, in the generated 3D scene, of the degree of signal/noise impairment caused by the phenomena that interfere with communication. The integration of new geographic data capture technologies, such as 3D laser scanning, allows the antenna elevation mask to be further optimized along its AOS and LOS azimuths across the visible horizon, maximizing visibility time with space vehicles. Furthermore, from the captured three-dimensional point cloud, specific information is selected and, using 3D modeling techniques, a 3D scene of the antenna site and its surroundings is generated. The resulting 3D model reveals nearby obstacles related to the cartographic conditions, such as mountain formations and buildings, as well as any additional obstacles that interfere with the operational quality of the antenna (other antennas and electronic devices that emit or receive in the same bandwidth). To test the proposed ground station site, this analysis methodology uses space mission simulation software to analyze and quantify how geographic accuracy in the position of the space vehicles along the horizon visible from the antenna increases communication time with the ground station. Experimental results obtained from a ground station located at ETSIT-UPM in Spain (QBito nanosatellite, the UPM spacecraft mission within the QB50 project) show that selection of the optimal site increases the field of view from the antenna and hence helps to meet mission requirements.
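
    The elevation-mask idea above can be illustrated with a toy azimuth-dependent horizon mask and a simulated pass. All the geometry below is invented for illustration; a real mask would come from the 3D laser scan of the site:

```python
import numpy as np

# Hypothetical terrain-derived elevation mask: minimum trackable
# elevation (deg) per 10-degree azimuth bin, with a "hill" to the east.
azimuths = np.arange(0, 360, 10)
mask = np.where((azimuths > 80) & (azimuths < 140), 25.0, 5.0)

def visible(az_deg, el_deg):
    """A pass sample is trackable when it clears the local horizon mask."""
    idx = int(az_deg // 10) % len(mask)
    return el_deg >= mask[idx]

# A simple simulated pass: azimuth sweeps east to south while the
# elevation rises to a peak and falls back to the horizon.
pass_az = np.linspace(60, 200, 141)
pass_el = 40.0 * np.sin(np.linspace(0, np.pi, 141))
samples_visible = sum(visible(a, e) for a, e in zip(pass_az, pass_el))
```

Comparing `samples_visible` between candidate sites (i.e. between candidate masks) is the kind of visibility-time metric the methodology feeds into the mission simulation software.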

  16. CHEMICAL TRANSFORMATIONS IN ACID RAIN. VOLUME 1. NEW METHODOLOGIES FOR SAMPLING AND ANALYSIS OF GAS-PHASE PEROXIDE

    EPA Science Inventory

    New methodologies for the sampling and analysis of gas-phase peroxides (H2O2 and organic peroxides) using (a) diffusion denuder tubes and (b) gas-to-liquid transfer with prior removal of ozone have been investigated. The purpose was to develop an interference-free method for dete...

  17. Optimisation of the fatigue resistance of 2024-T351 aluminium alloys by controlled shot peening—methodology, results and analysis

    Microsoft Academic Search

    C. A. Rodopoulos; S. A. Curtis; E. R. de los Rios; J. SolisRomero

    2004-01-01

    A methodology dedicated to the optimisation of the fatigue properties of aluminium alloys by controlled shot peening is presented. Selection of the peening conditions is made using Design of Experiments and the Effects Neutralisation Model. Both techniques allowed optimisation in terms of both fatigue life and crack growth rates. Experimental determination and further analysis of

  18. Methodology for using prompt gamma activation analysis to measure the binary diffusion coefficient of a gas in a porous medium

    E-print Network

    Deinert, Mark

    Methodology for using prompt gamma activation analysis to measure the binary diffusion coefficient of a gas in a porous medium. The method is used to determine the binary diffusion coefficients of a gas in a porous system; argon diffusion experiments were used for measuring the binary diffusion coefficient of a gas in a geological medium using prompt gamma activation analysis.

  19. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  20. 49 CFR Appendix B to Part 219 - Designation of Laboratory for Post-Accident Toxicological Testing

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...Post-Accident Toxicological Testing B Appendix B to Part 219...TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Pt. 219, App. B ...Post-Accident Toxicological Testing The following laboratory...post-accident toxicological analysis under subpart C of this...

  1. 49 CFR Appendix B to Part 219 - Designation of Laboratory for Post-Accident Toxicological Testing

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...Post-Accident Toxicological Testing B Appendix B to Part 219...TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Pt. 219, App. B ...Post-Accident Toxicological Testing The following laboratory...post-accident toxicological analysis under subpart C of...

  2. 49 CFR Appendix B to Part 219 - Designation of Laboratory for Post-Accident Toxicological Testing

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...Post-Accident Toxicological Testing B Appendix B to Part 219...TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Pt. 219, App. B ...Post-Accident Toxicological Testing The following laboratory...post-accident toxicological analysis under subpart C of...

  3. 49 CFR Appendix B to Part 219 - Designation of Laboratory for Post-Accident Toxicological Testing

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...Post-Accident Toxicological Testing B Appendix B to Part 219...TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Pt. 219, App. B ...Post-Accident Toxicological Testing The following laboratory...post-accident toxicological analysis under subpart C of this...

  4. 49 CFR Appendix B to Part 219 - Designation of Laboratory for Post-Accident Toxicological Testing

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...Post-Accident Toxicological Testing B Appendix B to Part 219...TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Pt. 219, App. B ...Post-Accident Toxicological Testing The following laboratory...post-accident toxicological analysis under subpart C of...

  5. Accident cost saving and highway attributes

    Microsoft Academic Search

    David J. Forkenbrock; Norman S. J. Foster

    1997-01-01

    Two semi-logarithmic regression models are developed to estimate accident rates and accident costs, respectively, for rural non-interstate highways in the state of Iowa. Data on 21,224 accidents occurring between 1989 and 1991 on 17,767 road segments are used in the analysis. Seven road attributes of these road segments are included as predictor variables. Applying the resulting regression models to a
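
    A semi-logarithmic accident-rate model of the kind described above regresses the logarithm of the accident rate on road attributes. A minimal sketch with synthetic data (the two attributes and their coefficients are invented, not values from the Iowa study):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic road segments with two illustrative attributes:
# lane width (m) and traffic volume (AADT, thousands).
n = 500
lane_width = rng.uniform(2.7, 3.7, n)
aadt = rng.uniform(0.5, 8.0, n)

# Semi-log model: ln(rate) = b0 + b1*lane_width + b2*aadt + noise.
true_beta = np.array([1.2, -0.8, 0.15])
log_rate = (true_beta[0] + true_beta[1] * lane_width
            + true_beta[2] * aadt + rng.normal(0, 0.05, n))

# Ordinary least squares on the log-transformed rate.
X = np.column_stack([np.ones(n), lane_width, aadt])
beta_hat, *_ = np.linalg.lstsq(X, log_rate, rcond=None)

# Back-transform to predicted accident rates per segment.
predicted_rate = np.exp(X @ beta_hat)
```

The exponential back-transform is what lets each attribute act multiplicatively on the accident rate, which is the usual rationale for the semi-log form.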

  6. A methodology for the structural and functional analysis of signaling and regulatory networks

    PubMed Central

    Klamt, Steffen; Saez-Rodriguez, Julio; Lindquist, Jonathan A; Simeoni, Luca; Gilles, Ernst D

    2006-01-01

    Background Structural analysis of cellular interaction networks contributes to a deeper understanding of network-wide interdependencies, causal relationships, and basic functional capabilities. While the structural analysis of metabolic networks is a well-established field, similar methodologies have been scarcely developed and applied to signaling and regulatory networks. Results We propose formalisms and methods, relying on adapted and partially newly introduced approaches, which facilitate a structural analysis of signaling and regulatory networks with a focus on functional aspects. We use two different formalisms to represent and analyze interaction networks: interaction graphs and (logical) interaction hypergraphs. We show that, in interaction graphs, the determination of feedback cycles and of all the signaling paths between any pair of species is equivalent to the computation of elementary modes known from metabolic networks. Knowledge of the set of signaling paths and feedback loops facilitates the computation of intervention strategies and the classification of compounds into activators, inhibitors, ambivalent factors, and non-affecting factors with respect to a certain species. In some cases, qualitative effects induced by perturbations can be unambiguously predicted from the network scheme. Interaction graphs, however, are not able to capture AND relationships, which frequently occur in interaction networks. The consequent logical concatenation of all the arcs pointing into a species leads to Boolean networks. For a Boolean representation of cellular interaction networks we propose a formalism based on logical (or signed) interaction hypergraphs, which facilitates in particular a logical steady state analysis (LSSA). LSSA enables studies on the logical processing of signals and the identification of optimal intervention points (targets) in cellular networks. 
LSSA also reveals network regions whose parametrization and initial states are crucial for the dynamic behavior. We have implemented these methods in our software tool CellNetAnalyzer (successor of FluxAnalyzer) and illustrate their applicability using a logical model of T-Cell receptor signaling providing non-intuitive results regarding feedback loops, essential elements, and (logical) signal processing upon different stimuli. Conclusion The methods and formalisms we propose herein are another step towards the comprehensive functional analysis of cellular interaction networks. Their potential, shown on a realistic T-cell signaling model, makes them a promising tool. PMID:16464248
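
    A logical steady state computation of the kind named above can be illustrated on a toy Boolean network: iterate the synchronous update rule until the state stops changing. The three-species network below is invented for illustration and is not the paper's T-cell model:

```python
def logical_steady_state(inputs, max_iter=50):
    """Iterate synchronous Boolean updates until a fixed point is reached.

    Toy network (hypothetical):
      K = R AND (NOT I)   # kinase needs the receptor, blocked by inhibitor
      O = K OR F          # output fires via the kinase or via feedback
      F = O               # positive feedback loop on the output
    """
    state = {"K": False, "O": False, "F": False}
    for _ in range(max_iter):
        new = {
            "K": inputs["R"] and not inputs["I"],
            "O": state["K"] or state["F"],
            "F": state["O"],
        }
        if new == state:
            return new          # logical steady state found
        state = new
    raise RuntimeError("no logical steady state reached (oscillation)")

on = logical_steady_state({"R": True, "I": False})   # signal propagates
off = logical_steady_state({"R": True, "I": True})   # inhibitor blocks it
```

Comparing the steady states under different input patterns is the simplest form of the intervention-point analysis the abstract describes; tools like CellNetAnalyzer do this on networks with hundreds of species.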

  7. Predictive tolerance and sensitivity analysis based on parametric response surface methodology

    Microsoft Academic Search

    M. Quarantelli; L. Daldoss; P. Gubian; C. Guardiani

    1998-01-01

    An innovative methodology aimed at deriving predictive response surface models for the variability of VLSI mixed-signal basic building block performances is explored in this paper. The purpose of this methodology is two-fold. First, we plan to be able to predict the sensitivity to process variations of basic ingredients of VLSI design, such as embedded memories, full custom and analog components.

  8. A Classification System for 2-Year Postsecondary Institutions. Methodology Report. Postsecondary Education Descriptive Analysis Reports.

    ERIC Educational Resources Information Center

    Phipps, Ronald A.; Shedd, Jessica M.; Merisotis, Jamie P.

    This methodology report by the National Center for Education Statistics (NCES) outlines the need and rationale for a two-year postsecondary classification system and the methodology used to produce this classification system. The system was created based on information from the Integrated Postsecondary Education Data System (IPEDS) database that…

  9. Cost Analysis for Educational Planning and Evaluation: Methodology and Application to Instructional Technology.

    ERIC Educational Resources Information Center

    Jamison, Dean T.; And Others

    A methodology is presented which assists government decision makers in making cost analyses of ongoing and future educational projects. Part one develops the methodology in general terms, and part two illustrates its application by examining the cost structure of instructional radio and television projects in developing countries. Part three…

  10. APET methodology for Defense Waste Processing Facility: Mode C operation

    SciTech Connect

    Taylor, R.P. Jr.; Massey, W.M.

    1995-04-01

    Safe operation of SRS facilities continues to be the highest priority of the Savannah River Site (SRS). One of these facilities, the Defense Waste Processing Facility or DWPF, is currently undergoing cold chemical runs to verify the design and construction preparatory to hot startup in 1995. The DWPF is a facility designed to convert the waste currently stored in tanks at the 200-Area tank farm into a form that is suitable for long-term storage in engineered surface facilities and, ultimately, geologic isolation. As part of the program to ensure safe operation of the DWPF, a Probabilistic Safety Assessment of the DWPF has been completed. The results of this analysis are incorporated into the Safety Analysis Report (SAR) for the DWPF. The usual practice in preparation of Safety Analysis Reports is to include only a conservative analysis of certain design basis accidents. A major part of a Probabilistic Safety Assessment is the development and quantification of an Accident Progression Event Tree or APET. The APET provides a probabilistic representation of potential sequences along which an accident may progress. The methodology used to determine the risk of operation of the DWPF borrows heavily from methods applied to the Probabilistic Safety Assessment of SRS reactors and to some commercial reactors. This report describes the Accident Progression Event Tree developed for the Probabilistic Safety Assessment of the DWPF.

  11. Scaling analysis of ocean surface turbulent heterogeneities from satellite remote sensing: a methodological study.

    NASA Astrophysics Data System (ADS)

    Pannimpullath Remanan, Renosh; Schmitt, Francois; Loisel, Hubert

    2015-04-01

    Satellite remote sensing observations allow the ocean surface to be sampled synoptically over large spatio-temporal scales. The images provided by ocean colour satellites are widely used in physical, biological and ecological oceanography. The present work proposes a method for understanding the multi-scaling properties of satellite ocean colour products such as Chlorophyll-a (Chl-a) and Sea Surface Temperature (SST), which have rarely been studied in this way. The specific objectives of this study are to show how the small-scale heterogeneities of satellite images can be characterized using tools borrowed from the field of turbulence, and how these patterns are related to environmental conditions. For that purpose, we show how the structure function, which is classical for scaling analysis of time series, can also be used in 2D. The main advantage of this method is that it can be applied to images with missing data. We show, using a simulation and two real images taken as examples, that coarse-graining (CG) of a gradient-modulus transform of the original image does not provide correct scaling exponents. We show, using a 2D fractional Brownian simulation, that the structure function (SF) can be used with randomly sampled pairs of points, and verify that one million point pairs provide enough statistics. We illustrate this methodology using two satellite images chosen as examples.
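
    The randomly-sampled-pairs structure function described above is straightforward to sketch: draw random point pairs at each separation, skip pairs that hit missing pixels, and read the scaling exponent from a log-log fit. On a smooth ramp test field (used here only as a sanity check) the second-order exponent should come out as 2:

```python
import numpy as np

rng = np.random.default_rng(3)

def structure_function(field, separations, n_pairs=20000, rng=rng):
    """Second-order structure function S2(r) from random point pairs.
    Pairs touching missing (NaN) pixels are simply skipped, which is
    what makes the estimator usable on gappy satellite images."""
    ny, nx = field.shape
    s2 = []
    for r in separations:
        y = rng.integers(0, ny, n_pairs)
        x = rng.integers(0, nx - r, n_pairs)   # horizontal separations only
        d = field[y, x + r] - field[y, x]
        s2.append(np.nanmean(d ** 2))          # NaN pairs are ignored
    return np.array(s2)

# Smooth ramp with 20% missing data: S2(r) = r^2, so the log-log slope is 2.
ramp = np.tile(np.arange(256, dtype=float), (256, 1))
ramp[rng.random(ramp.shape) < 0.2] = np.nan
seps = np.array([1, 2, 4, 8, 16])
s2 = structure_function(ramp, seps)
slope = np.polyfit(np.log(seps), np.log(s2), 1)[0]
```

A real analysis would sample separations in all directions and use far more pairs (the paper verifies that one million suffice), but the missing-data handling is already the essential ingredient.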

  12. RADON AND PROGENY ALPHA-PARTICLE ENERGY ANALYSIS USING NUCLEAR TRACK METHODOLOGY

    SciTech Connect

    Espinosa Garcia, Guillermo [ORNL; Golzarri y Moreno, Dr. Jose Ignacio [Instituto de Fisica, Mexico; Bogard, James S [ORNL

    2008-01-01

    A preliminary procedure for alpha energy analysis of radon and progeny using Nuclear Track Methodology (NTM) is described in this paper. The method is based on the relationship between alpha-particle energies deposited in polycarbonate material (CR-39) and the track size developed after a well-established chemical etching process. Track geometry, defined by parameters such as major or minor diameters, track area and overall track length, is shown to correlate with alpha-particle energy over the range 6.00 MeV (218Po) to 7.69 MeV (214Po). Track features are measured and the data analyzed automatically using a digital imaging system and commercial PC software. Examination of particle track diameters in CR-39 exposed to environmental radon reveals a multi-modal distribution. Locations of the maxima in this distribution are highly correlated with alpha particle energies of radon daughters, and the distributions are sufficiently resolved to identify the radioisotopes. This method can be useful for estimating the radiation dose from indoor exposure to radon and its progeny.
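
    Identifying radioisotopes from the multi-modal diameter distribution described above amounts to assigning each measured track to the nearest calibration peak. A minimal sketch (the peak diameters below are invented placeholders, not measured CR-39 values):

```python
# Hypothetical calibration peaks: track diameter (um) associated with each
# alpha emitter after the etching protocol. Illustrative numbers only.
peaks = {"218Po (6.00 MeV)": 10.5, "214Po (7.69 MeV)": 8.9}

def identify(diameter_um):
    """Assign a measured track diameter to the nearest calibration peak."""
    return min(peaks, key=lambda label: abs(peaks[label] - diameter_um))
```

In practice the peak locations would be fitted from the measured diameter histogram, and tracks falling far from every peak would be left unclassified.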

  13. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    NASA Technical Reports Server (NTRS)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant; an additional 11 graduate students were impacted and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future) and 67 papers or reports, mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA-inspired or -supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, industry and other state and federal institutions to continue efforts in the direction of multidisciplinary and, recently, multi-objective design and analysis. The specific problem addressed is component packing, which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the methodology developed is presently under investigation. Teaming issues research and classes resulted in the publication of a web site (http://design.eng.clemson.edu/psych4991), which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.
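
    The component packing formulation mentioned above is multi-objective, so candidate packings are typically compared by Pareto dominance. A minimal sketch of that comparison (the two objectives, volume and imbalance, are illustrative stand-ins):

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b: no worse in every objective
    (here, both minimized) and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated (volume, imbalance) pairs."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Toy candidate packings: (enclosing volume, mass imbalance), both minimized.
candidates = [(10.0, 3.0), (12.0, 1.0), (11.0, 4.0), (9.0, 5.0)]
front = pareto_front(candidates)
```

A genetic algorithm of the kind the report mentions would evolve a population toward this front rather than enumerate candidates.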

  14. Applying sequential injection analysis (SIA) and response surface methodology for optimization of Fenton-based processes.

    PubMed

    dos Santos, Allan C V; Masini, Jorge C

    2009-01-15

    This work presents the use of sequential injection analysis (SIA) and response surface methodology as a tool for optimization of Fenton-based processes. Alizarin Red S dye (C.I. 58005) was used as a model compound for the anthraquinone family, whose pigments are widely used in the coatings industry. The following factors were considered: the [H2O2]:[Alizarin] and [H2O2]:[FeSO4] ratios and pH. The SIA system was designed to add reagents to the reactor and to perform on-line sampling of the reaction medium, sending the samples to a flow-through spectrophotometer for monitoring the color reduction of the dye. The proposed system fed the statistical program with degradation data for fast construction of response surface plots. After optimization, 99.7% of the dye was degraded and the TOC content was reduced to 35% of the original value. Low reagent consumption and high sampling throughput were the remarkable features of the SIA system. PMID:19064095
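
    Response surface optimization of the kind described above fits a quadratic model to designed degradation experiments and solves for the stationary point. A minimal sketch with an invented two-factor response standing in for the reagent-ratio and pH factors:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical degradation response with an interior optimum in (x1, x2);
# the factors are coded onto [0, 1] as in a standard designed experiment.
def degradation(x1, x2):
    return 99.0 - 4.0 * (x1 - 0.3) ** 2 - 6.0 * (x2 - 0.6) ** 2

# Full-factorial design points plus a little measurement noise.
pts = np.array([[a, b] for a in (0.0, 0.5, 1.0) for b in (0.0, 0.5, 1.0)])
y = degradation(pts[:, 0], pts[:, 1]) + rng.normal(0, 0.01, len(pts))

# Quadratic response surface: 1, x1, x2, x1^2, x2^2, x1*x2.
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] ** 2, pts[:, 1] ** 2, pts[:, 0] * pts[:, 1]])
c, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point of the fitted surface: solve grad = 0.
H = np.array([[2 * c[3], c[5]], [c[5], 2 * c[4]]])
opt = np.linalg.solve(H, -np.array([c[1], c[2]]))
```

The fitted optimum recovers the (0.3, 0.6) built into the toy response; in the paper's setup, the SIA system supplies the `y` values on-line for each design point.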

  15. Cointegration methodology for psychological researchers: An introduction to the analysis of dynamic process systems.

    PubMed

    Stroe-Kunold, Esther; Gruber, Antje; Stadnytska, Tetiana; Werner, Joachim; Brosig, Burkhard

    2012-11-01

    Longitudinal data analysis focused on internal characteristics of a single time series has attracted increasing interest among psychologists. The systemic psychological perspective suggests, however, that many long-term phenomena are mutually interconnected, forming a dynamic system. Hence, only multivariate methods can handle such human dynamics appropriately. Unlike the majority of time series methodologies, the cointegration approach allows interdependencies of integrated (i.e., extremely unstable) processes to be modelled. This advantage results from the fact that cointegrated series are connected by stationary long-run equilibrium relationships. Vector error-correction models are frequently used representations of cointegrated systems. They capture both this equilibrium and compensation mechanisms in the case of short-term deviations due to developmental changes. Thus, the past disequilibrium serves as explanatory variable in the dynamic behaviour of current variables. Employing empirical data from cognitive psychology, psychosomatics, and marital interaction research, this paper describes how to apply cointegration methods to dynamic process systems and how to interpret the parameters under investigation from a psychological perspective. PMID:22070760
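
    The cointegration logic described above can be sketched with the two-step Engle-Granger procedure: estimate the long-run equilibrium by ordinary least squares, then check that past disequilibrium pulls the series back (a negative error-correction loading). Minimal sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulate a cointegrated pair: x is a random walk (integrated process),
# y tracks 2*x with stationary deviations -- a toy stand-in for two
# interconnected psychological time series.
T = 2000
x = np.cumsum(rng.normal(0, 1, T))
y = 2.0 * x + rng.normal(0, 1, T)

# Step 1 (Engle-Granger): estimate the long-run equilibrium y = a + b*x.
A = np.column_stack([np.ones(T), x])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
ect = y - (a + b * x)            # error-correction term (disequilibrium)

# Step 2: yesterday's disequilibrium should predict today's change in y
# with a negative coefficient (the compensation mechanism).
dy = np.diff(y)
loading, *_ = np.linalg.lstsq(ect[:-1].reshape(-1, 1), dy, rcond=None)
loading = loading[0]
```

A full vector error-correction model would include lagged differences of both series and a unit-root test on `ect`; the negative loading is the part that captures the equilibrium-restoring dynamics the abstract emphasizes.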

  16. Introduction on performance analysis and profiling methodologies for KVM on ARM virtualization

    NASA Astrophysics Data System (ADS)

    Motakis, Antonios; Spyridakis, Alexander; Raho, Daniel

    2013-05-01

    The introduction of hardware virtualization extensions on ARM Cortex-A15 processors has enabled the implementation of full virtualization solutions for this architecture, such as KVM on ARM. This trend motivates the need to quantify and understand the performance impact that arises from the application of this technology. In this work we start looking into some interesting performance metrics on KVM for ARM processors, which can provide useful insight and may lead to potential improvements in the future. This includes measurements such as interrupt latency and guest exit cost, performed on ARM Versatile Express and Samsung Exynos 5250 hardware platforms. Furthermore, we discuss additional methodologies that can in the future provide a deeper understanding of the performance footprint of KVM. We identify some of the most interesting approaches in this field and perform a tentative analysis of how these may be implemented in the KVM on ARM port. These approaches take into consideration hardware- and software-based counters for profiling, as well as issues related to the limitations of the simulators that are often used, such as the ARM Fast Models platform.
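
    Latency measurements of the sort mentioned above boil down to timing an operation many times and reporting median and tail statistics. A crude user-space sketch only (real interrupt-latency and guest-exit measurements need cycle counters inside the guest and hypervisor, not Python timers):

```python
import time

def measure_latency(op, n=1000):
    """Time an operation n times and report median and 99th-percentile
    latency in nanoseconds. Illustrative harness, not a KVM benchmark."""
    samples = []
    for _ in range(n):
        t0 = time.perf_counter_ns()
        op()
        samples.append(time.perf_counter_ns() - t0)
    samples.sort()
    return {"p50": samples[n // 2], "p99": samples[int(n * 0.99)]}

stats = measure_latency(lambda: None)   # timing overhead of a no-op
```

Reporting the tail (p99) alongside the median matters for virtualization metrics, since occasional slow guest exits dominate worst-case interrupt latency.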

  17. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    SciTech Connect

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions:
    o What is the percentage of the chemicals in the CMM Rev 27 database associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set?
    o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture?
    o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach?
    o What is the Target Organ System Effect approach, and how can it be used to improve upon the current HCN-based approach? How do the benefits users would derive from the Target Organ System approach compare to those available from the current HCN-based approach?
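
    A Hazard Index of the kind discussed above sums concentration-to-limit ratios within each health-effect group and flags the group that drives the mixture's hazard. A minimal sketch with invented chemicals, concentrations and limits (not CMM Rev 27 data):

```python
# Each entry: (chemical, concentration mg/m3, exposure limit mg/m3,
# associated health-effect codes). All values are made up for illustration.
mixture = [
    ("chem_A", 2.0, 10.0, {"eye_irritant", "neurotoxin"}),
    ("chem_B", 1.0, 2.0, {"neurotoxin"}),
    ("chem_C", 0.5, 25.0, {"eye_irritant"}),
]

def hazard_indices(mixture):
    """Sum concentration/limit ratios within each health-effect group."""
    hi = {}
    for _, conc, limit, codes in mixture:
        for code in codes:
            hi[code] = hi.get(code, 0.0) + conc / limit
    return hi

hi = hazard_indices(mixture)
worst = max(hi, key=hi.get)   # the group that drives the mixture's hazard
```

Grouping the ratios by health effect, rather than summing them across the whole mixture, is what keeps chemicals with unrelated effects from being double-counted against a single limit.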

  18. Radiation dose analysis of a PWR 1 accident for the projected reactor site at Cementon, New York

    Microsoft Academic Search

    Huncharek

    1976-01-01

    This study is an evaluation of a pressurized-water reactor (PWR) accident as defined by WASH-1400 for the proposed nuclear reactor site at Cementon, N.Y. Using an extension of the Environmental Protection Agency's AIREM computer code, the following were analyzed for up to 50 miles in 16 compass directions: (1) whole-body doses due to cloud submersion, inhalation, and ground

  19. Analysis of the source range monitor during the first four hours of the Three Mile Island Unit 2 accident

    Microsoft Academic Search

    H. Y. Wu; B. R. Bandini; M. Y. Hsiao; A. J. Baratta; E. L. Tolman

    1989-01-01

    The source range monitor (SRM) data recorded during the first 4 h of the Three Mile Island Unit 2 (TMI-2) accident following reactor shutdown were analyzed. An effort to simulate the actual SRM response was made by performing a series of neutron transport calculations. Primary emphasis was placed on simulating the changes in SRM response to various system events during

  20. Analysis of Sodium Fire in the Containment Building of Prototype Fast Breeder Reactor Under the Scenario of Core Disruptive Accident

    Microsoft Academic Search

    P. M. Rao; N. Kasinathan; S. E. Kannan

    2006-01-01

    The potential for sodium release to reactor containment building from reactor assembly during Core Disruptive Accident (CDA) in Fast Breeder Reactors (FBR) is an important safety issue with reference to the structural integrity of Reactor Containment Building (RCB). For Prototype Fast Breeder Reactor (PFBR), the estimated sodium release under a CDA of 100 MJ energy release is 350 kg. The