Science.gov

Sample records for accident analysis computer

  1. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010, at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  2. SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident

    NASA Astrophysics Data System (ADS)

    Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

    2014-06-01

    On March 11th, 2011, a high-magnitude earthquake and the consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in duration and extent. After the reactors scrammed at all power stations affected by the earthquake, diesel generators began operating as designed until tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to station blackout conditions in units 1 to 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules to account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools, and we demonstrate how this work can be beneficial to nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study of the alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.

  3. WASTE-ACC: A computer model for analysis of waste management accidents

    SciTech Connect

    Nabelssi, B.K.; Folga, S.; Kohout, E.J.; Mueller, C.J.; Roglans-Ribas, J.

    1996-12-01

    In support of the U.S. Department of Energy's (DOE's) Waste Management Programmatic Environmental Impact Statement, Argonne National Laboratory has developed WASTE-ACC, a computational framework and integrated PC-based database system, to assess atmospheric releases from facility accidents. WASTE-ACC facilitates the many calculations for the accident analyses necessitated by the numerous combinations of waste types, waste management process technologies, facility locations, and site consolidation strategies in the waste management alternatives across the DOE complex. WASTE-ACC is a comprehensive tool that can effectively test future DOE waste management alternatives and assumptions. The computational framework can access several relational databases to calculate atmospheric releases. The databases contain throughput volumes, waste profiles, treatment process parameters, and accident data such as frequencies of initiators, conditional probabilities of subsequent events, and source term release parameters of the various waste forms under accident stresses. This report describes the computational framework and supporting databases used to conduct accident analyses and to develop source terms to assess potential health impacts that may affect on-site workers and off-site members of the public under various DOE waste management alternatives.
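
    The record above describes combining initiator frequencies, conditional probabilities of subsequent events, and release parameters into accident source terms. The sketch below illustrates that general structure using the widely used DOE five-factor airborne-release form (source term = MAR x DR x ARF x RF x LPF); it is a hedged illustration only, not the actual WASTE-ACC implementation, and all function names and numbers are hypothetical.

    ```python
    # Illustrative sketch (not the WASTE-ACC code): a source-term estimate in the
    # style of the DOE five-factor formula, combined with an initiator frequency
    # and conditional probabilities of subsequent events. Values are hypothetical.

    def source_term(mar_kg, damage_ratio, arf, rf, lpf):
        """Respirable airborne release (kg) from one accident scenario."""
        return mar_kg * damage_ratio * arf * rf * lpf

    def sequence_frequency(initiator_per_yr, conditional_probs):
        """Annual frequency of the full accident sequence."""
        freq = initiator_per_yr
        for p in conditional_probs:
            freq *= p
        return freq

    # Hypothetical drum-fire scenario in a storage facility
    release_kg = source_term(mar_kg=50.0, damage_ratio=0.1, arf=5e-4, rf=1.0, lpf=0.1)
    freq_per_yr = sequence_frequency(1e-2, [0.3, 0.5])
    print(f"release = {release_kg:.2e} kg at {freq_per_yr:.1e} per year")
    ```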

  4. Nuclear fuel cycle facility accident analysis handbook

    SciTech Connect

    Ayer, J E; Clark, A T; Loysen, P; Ballinger, M Y; Mishima, J; Owczarski, P C; Gregory, W S; Nichols, B D

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH.

  5. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  6. Severe accident analysis using dynamic accident progression event trees

    NASA Astrophysics Data System (ADS)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce, and potentially phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The dynamic APETs developed in this way predict the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculate the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. Also, the work is focused on treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a
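
    As a rough illustration of the dynamic event tree bookkeeping described above (branch generation driven by the accident simulation, probability tracking, and truncation), a minimal sketch follows. This is not the ADAPT implementation: the driver that would advance a severe-accident code such as MELCOR is stubbed out, and the branching outcomes, probabilities, and truncation threshold are hypothetical.

    ```python
    # Minimal dynamic-event-tree driver sketch: expand branches as the (stubbed)
    # accident simulation reaches user-specified branching conditions, track each
    # scenario's cumulative probability, and prune scenarios below a threshold.
    from collections import deque

    TRUNCATION = 1e-6   # prune scenarios whose cumulative probability falls below this

    def advance_to_next_branch(state):
        """Stub for running the severe-accident code until a branching criterion
        is met; would return (new_state, [(outcome, probability), ...]) or None
        when the sequence has ended."""
        ...

    def build_dynamic_apet(initial_state):
        completed, queue = [], deque([(initial_state, 1.0)])
        while queue:
            state, prob = queue.popleft()
            result = advance_to_next_branch(state)
            if result is None:                  # accident sequence has terminated
                completed.append((state, prob))
                continue
            new_state, outcomes = result
            for outcome, p in outcomes:         # one child scenario per branch outcome
                child_prob = prob * p
                if child_prob >= TRUNCATION:    # truncation rule keeps the tree manageable
                    queue.append(((new_state, outcome), child_prob))
        return completed

    print(build_dynamic_apet({"time_s": 0.0}))
    ```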

  7. TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions

    SciTech Connect

    Lombardo, N.J.; Marseille, T.J.; White, M.D.; Lowery, P.S.

    1990-06-01

    TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300 °F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam from the oxidation reactions and the effect of hydrogen on the coolant properties are modeled for independent coolant flow channels. Fission product release from exposed uranium metal, Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ("bake-out") release mechanisms. Release of the volatile species of iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.
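
    The parabolic rate law with Arrhenius temperature dependence mentioned above can be written in the following generic form (an illustrative statement of such a kinetics model, not the specific TRUMP-BD correlation; W is the mass of metal reacted per unit surface area, and K_0 and Q are empirical constants):

    ```latex
    % Generic parabolic oxidation kinetics with Arrhenius temperature dependence
    % (illustrative form only; the constants are not the TRUMP-BD values)
    \[
      \frac{d\left(W^{2}\right)}{dt} \;=\; K(T) \;=\; K_{0}\exp\!\left(-\frac{Q}{RT}\right),
      \qquad
      W(t) \;=\; \sqrt{K_{0}\,t\,e^{-Q/RT}} \quad \text{(isothermal case)}.
    \]
    ```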

  8. Accident Tolerant Fuel Analysis

    SciTech Connect

    Curtis Smith; Heather Chichester; Jesse Johns; Melissa Teague; Michael Tonks; Robert Youngblood

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and

  10. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee, therefore, was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  11. Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis

    SciTech Connect

    Mueller, C.; Roglans-Ribas, J.; Folga, S.; Huttenga, A.; Jackson, R.; TenBrook, W.; Russell, J.

    1994-02-01

    A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the DOE complex, (2) development and frequency estimation of the risk-dominant sequences of accidents, and (3) determination of the evolution of and final compositions of radiological or chemically hazardous source terms predicted to be released as a function of the storage inventory or treatment process throughput. The computational framework automates these elements to provide source term input for the second part of the analysis which includes (1) development or integration of existing site-specific demographics and meteorological data and calculation of attendant unit-risk factors and (2) assessment of the radiological or toxicological consequences of accident releases to the general public and to the occupational work force.

  12. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.

    1983-01-01

    The aircraft accident data recorded by the National Transportation Safety Board (NTSB) for 1964-1979 were analyzed to determine what problems exist in the general aviation (GA) single pilot instrument flight rules (SPIFR) environment. A previous study conducted in 1978 for the years 1964-1975 provided a basis for comparison. This effort was generally limited to SPIFR pilot-error landing-phase accidents but includes some SPIFR takeoff and en route accident analysis as well as some dual pilot IFR accident analysis for comparison. Analysis was performed for 554 accidents, of which 39% (216) occurred during the years 1976-1979.

  13. Aircraft Loss-of-Control Accident Analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Foster, John V.

    2010-01-01

    Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

  14. Analysis of tritium mission FMEF/FAA fuel handling accidents

    SciTech Connect

    Van Keuren, J.C.

    1997-11-18

    The Fuels Material Examination Facility/Fuel Assembly Area is proposed to be used for fabrication of mixed oxide fuel to support the Fast Flux Test Facility (FFTF) tritium/medical isotope mission. The plutonium isotope mix for the new mission is different from that analyzed in the FMEF safety analysis report. A reanalysis of three representative accidents was performed for the revised plutonium mix to determine the impact on the safety analysis. Current versions of the computer codes and meteorology data files were used for the analysis. The revised accidents were a criticality, an explosion in a glovebox, and a tornado. The analysis concluded that risk guidelines were met with the revised plutonium mix.

  15. An analysis of aircraft accidents involving fires

    NASA Technical Reports Server (NTRS)

    Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

    1975-01-01

    All U. S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

  16. Safety analysis of surface haulage accidents

    SciTech Connect

    Randolph, R.F.; Boldt, C.M.K.

    1996-12-31

    Research on improving haulage truck safety, started by the U.S. Bureau of Mines, is being continued by its successors. This paper reports the orientation of the renewed research efforts, beginning with an update on accident data analysis, the role of multiple causes in these accidents, and the search for practical methods for addressing the most important causes. Fatal haulage accidents most often involve loss of control or collisions caused by a variety of factors. Lost-time injuries most often involve sprains or strains to the back or multiple body areas, which can often be attributed to rough roads and the shocks of loading and unloading. Research to reduce these accidents includes improved warning systems, shock isolation for drivers, encouraging seatbelt usage, and general improvements to system and task design.

  17. Anthropotechnological analysis of industrial accidents in Brazil.

    PubMed Central

    Binder, M. C.; de Almeida, I. M.; Monteau, M.

    1999-01-01

    The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worth while if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

  18. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.; Morrisete, J. A.

    1982-01-01

    The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in trends and cause-effect relationships reported in the earlier study. Accident counts were tied to measures of flying activity to produce accident rates, which in turn were analyzed for changes over time. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of involved pilots. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions and their unique attributes delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

  19. Accident analysis for US fast burst reactors

    SciTech Connect

    Paternoster, R.; Flanders, M.; Kazi, H.

    1994-09-01

    In the US fast burst reactor (FBR) community there has been increasing emphasis and scrutiny on safety analysis and understanding of possible accident scenarios. This paper summarizes recent work in these areas under way at the different US FBR sites. At this time, all of the FBR facilities have updated, or are in the process of updating and refining, their accident analyses. This effort is driven by two objectives: to obtain a more realistic scenario for emergency response procedures and contingency plans, and to determine compliance with changing regulatory standards.

  20. Time-dependent accident sequence analysis

    SciTech Connect

    Chu, T.L.

    1983-01-01

    One problem of the current event tree methodology is that the transitions between accident sequences are not modeled. Transitions are mostly caused by operator actions during an accident. A model for such transitions is presented. A generalized algorithm is used for quantification. In the more realistic accident analysis, the progression of the physical processes, which determines the time available for proper operator response, is modeled. Furthermore, the uncertainty associated with the physical modeling is considered. As an example, the approach is applied to analyze TMI-type accidents. Statistical evidence is collected and used in assessing the frequency of stuck-open pressure-operated relief valves at B&W plants as well as the frequency of misdiagnosis. Statistical data are also used in modeling the timing of operator actions during the accident. A thermal code (CUT) is developed to determine the time at which core uncovery occurs. A response surface is used to propagate the uncertainty associated with the thermal code.
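
    A minimal sketch of the response-surface idea mentioned above: fit an inexpensive surrogate to a handful of thermal-code runs and propagate input uncertainty through the surrogate rather than through the code itself. The run data, the quadratic model form, and the input distribution below are hypothetical placeholders, not values from the CUT analysis.

    ```python
    # Fit a quadratic response surface to a few (hypothetical) thermal-code runs
    # and propagate input uncertainty through the surrogate by Monte Carlo sampling.
    import numpy as np

    # Hypothetical runs: decay-heat fraction (x) -> time to core uncovery (minutes)
    x_runs = np.array([0.010, 0.012, 0.014, 0.016, 0.018])
    t_runs = np.array([140.0, 118.0, 102.0, 90.0, 80.0])

    surface = np.poly1d(np.polyfit(x_runs, t_runs, deg=2))   # quadratic surrogate

    rng = np.random.default_rng(0)
    x_samples = rng.normal(loc=0.014, scale=0.0015, size=10_000)  # uncertain input
    t_samples = surface(x_samples)

    lo, hi = np.percentile(t_samples, [5, 95])
    print(f"core uncovery time: mean {t_samples.mean():.0f} min, "
          f"5th-95th percentile {lo:.0f}-{hi:.0f} min")
    ```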

  1. Computer simulation of the accident with nine victims.

    PubMed

    Balazic, Joze; Prebil, Ivan; Certanc, Niko

    2006-01-27

    In this paper we wish to emphasise the significance of vehicle driving dynamics analysis in the collision phase and of occupant load analysis using a software environment. We also present the results of the simulation of the course of a traffic accident with nine victims that arose from a collision between an Audi A6 passenger car and a VW Caravelle van. In treating the traffic accident, the forensic expert faced questions about what caused the injuries to the front passenger in the Audi A6, how the two vehicles had collided, their collision velocities, how the two vehicles were handled, and what caused the traffic accident. The critical situation on the road was a consequence of the van driver's tiredness, inadequate use of the passive safety systems, and overloading of the van. PMID:16410168

  2. Waste Evaporator Accident Simulation Using RELAP5 Computer Code

    SciTech Connect

    POLIZZI, L.M.

    2004-04-28

    An evaporator is used on liquid waste from processing facilities to reduce the volume of the waste by heating the waste and allowing some of the water to be separated from it through boiling. This separation process allows for more efficient processing and storage of liquid waste. Commonly, the liquid waste consists of an aqueous solution of chemicals that over time could induce corrosion and in turn weaken the tubes in the steam tube bundle of the waste evaporator that are used to heat the waste. This chemically induced corrosion could escalate into tube leakage and/or the severance of one or more tubes in the tube bundle. In this paper, analyses of a waste evaporator system for the processing of liquid waste containing corrosive chemicals are presented to assess the system response to this accident scenario. This accident scenario is evaluated since its consequences can propagate to a release of hazardous material to the outside environment. It is therefore important to ensure that the evaporator system component structural integrity is not compromised, i.e., that the design pressure and temperature of the system are not exceeded during the accident transient. The computer code used for the accident simulation is RELAP5-MOD3. The accident scenario analyzed includes a double-ended guillotine break of a tube in the tube bundle of the evaporator. A mitigated scenario is presented to evaluate the excursion of the peak pressure and temperature in the various components of the evaporator system to assess whether the protective actions and controls available are adequate to ensure that the structural integrity of the evaporator system is maintained and that no atmospheric release occurs.

  3. Summary of the SRS Severe Accident Analysis Program, 1987--1992

    SciTech Connect

    Long, T.A.; Hyder, M.L.; Britt, T.E.; Allison, D.K.; Chow, S.; Graves, R.D.; DeWald, A.B. Jr.; Monson, P.R. Jr.; Wooten, L.A.

    1992-11-01

    The Severe Accident Analysis Program (SAAP) is a program of experimental and analytical studies aimed at characterizing severe accidents that might occur in the Savannah River Site Production Reactors. The goals of the Severe Accident Analysis Program are: To develop an understanding of severe accidents in SRS reactors that is adequate to support safety documentation for these reactors, including the Safety Analysis Report (SAR), the Probabilistic Risk Assessment (PRA), and other studies evaluating the safety of reactor operation; To provide tools and bases for the evaluation of existing or proposed safety related equipment in the SRS reactors; To provide bases for the development of accident management procedures for the SRS reactors; To develop and maintain on the site a sufficient body of knowledge, including documents, computer codes, and cognizant engineers and scientists, that can be used to authoritatively resolve questions or issues related to reactor accidents. The Severe Accident Analysis Program was instituted in 1987 and has already produced a substantial amount of information, and specialized calculational tools. Products of the Severe Accident Analysis Program (listed in Section 9 of this report) have been used in the development of the Safety Analysis Report (SAR) and the Probabilistic Risk Assessment (PRA), and in the development of technical specifications for the SRS reactors. A staff of about seven people is currently involved directly in the program and in providing input on severe accidents to other SRS activities.

  4. Cross-analysis of hazmat road accidents using multiple databases.

    PubMed

    Trépanier, Martin; Leroux, Marie-Hélène; de Marcellis-Warin, Nathalie

    2009-11-01

    Road selection for hazardous materials transportation relies heavily on risk analysis. Since risk is generally expressed as the product of the probability of occurrence and the expected consequence, risk analysis is data intensive. However, various authors have noted the lack of statistical reliability of hazmat accident databases due to the systematic underreporting of such events. Also, official accident databases alone do not always provide all the information required (economic impact, road conditions, etc.). In this paper, we attempt to integrate many data sources to analyze hazmat accidents in the province of Quebec, Canada. Databases on dangerous goods accidents, road accidents and work accidents were cross-analyzed. Results show that accidents can hardly be matched and that these databases suffer from underreporting. Police records seem to have better coverage than official records maintained by hazmat authorities. Serious accidents are missing from the government's official databases (some involving deaths or major spills) even though their declaration is mandatory. PMID:19819367
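
    The kind of cross-matching described above can be illustrated with a small sketch that pairs records from two accident databases on approximate date and municipality. The field names, the two-day tolerance, and the matching rule are assumptions for illustration, not the procedure actually used in the study.

    ```python
    # Toy record-linkage sketch: match hazmat-authority records to police records
    # on municipality and a +/- 2 day date window; report unmatched records as
    # candidates for under-reporting. All records shown are fabricated examples.
    from datetime import date

    hazmat_db = [{"id": "H1", "date": date(2005, 6, 3), "city": "Montreal"}]
    police_db = [{"id": "P9", "date": date(2005, 6, 4), "city": "montreal"},
                 {"id": "P10", "date": date(2005, 8, 1), "city": "Quebec"}]

    def match(a, b, max_days=2):
        same_place = a["city"].strip().lower() == b["city"].strip().lower()
        close_date = abs((a["date"] - b["date"]).days) <= max_days
        return same_place and close_date

    pairs = [(a["id"], b["id"]) for a in hazmat_db for b in police_db if match(a, b)]
    police_only = [b["id"] for b in police_db
                   if not any(match(a, b) for a in hazmat_db)]
    print("matched pairs:", pairs, "| police-only records:", police_only)
    ```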

  5. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, "Canister Storage Building Final Safety Analysis Report." All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  6. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.

    1999-09-09

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, "Canister Storage Building Final Safety Analysis Report." All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  7. Canister storage building design basis accident analysis documentation

    SciTech Connect

    KOPELIC, S.D.

    1999-02-25

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, "Canister Storage Building Final Safety Analysis Report." All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  8. Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation

    SciTech Connect

    PIEPHO, M.G.

    1999-10-20

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, "Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR)." All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

  9. Decontamination analysis of the NUWAX-83 accident site using DECON

    SciTech Connect

    Tawil, J.J.

    1983-11-01

    This report presents an analysis of the site restoration options for the NUWAX-83 site, at which an exercise was conducted involving a simulated nuclear weapons accident. This analysis was performed using a computer program developed by Pacific Northwest Laboratory. The computer program, called DECON, was designed to assist personnel engaged in the planning of decontamination activities. The many features of DECON that are used in this report demonstrate its potential usefulness as a site restoration planning tool. Strategies that are analyzed with DECON include: (1) employing a Quick-Vac option, under which selected surfaces are vacuumed before they can be rained on; (2) protecting surfaces against precipitation; (3) prohibiting specific operations on selected surfaces; (4) requiring specific methods to be used on selected surfaces; (5) evaluating the trade-off between cleanup standards and decontamination costs; and (6) varying the cleanup standards according to expected exposure to the surface.

  10. Reactor Safety Gap Evaluation of Accident Tolerant Components and Severe Accident Analysis

    SciTech Connect

    Farmer, Mitchell T.; Bunt, R.; Corradini, M.; Ellison, Paul B.; Francis, M.; Gabor, John D.; Gauntt, R.; Henry, C.; Linthicum, R.; Luangdilok, W.; Lutz, R.; Paik, C.; Plys, M.; Rabiti, Cristian; Rempe, J.; Robb, K.; Wachowiak, R.

    2015-01-31

    The overall objective of this study was to conduct a technology gap evaluation on accident tolerant components and severe accident analysis methodologies with the goal of identifying any data and/or knowledge gaps that may exist, given the current state of light water reactor (LWR) severe accident research, and additionally augmented by insights obtained from the Fukushima accident. The ultimate benefit of this activity is that the results can be used to refine the Department of Energy’s (DOE) Reactor Safety Technology (RST) research and development (R&D) program plan to address key knowledge gaps in severe accident phenomena and analyses that affect reactor safety and that are not currently being addressed by the industry or the Nuclear Regulatory Commission (NRC).

  11. [An analysis of industrial accidents in the working field with a particular emphasis on repeated accidents].

    PubMed

    Wakisaka, I; Yanagihashi, T; Tomari, T; Sato, M

    1990-03-01

    The present study is based on an analysis of routinely submitted reports of occupational accidents experienced by the workers of industrial enterprises under the jurisdiction of the Kagoshima Labor Standard Office during the 5-year period 1983 to 1987. Officially notified injuries serious enough to keep employees away from their job for at least 4 days were utilized in this study. Data were classified so as to give an observed frequency distribution for workers having any specified number of accidents. Also, the accident rate, an indicator of the risk of accident, was compared among different occupations, between age groups, and between the sexes. Results obtained are as follows: 1) For the combined total of 6,324 accident cases for 8 types of occupation (Construction, Transportation, Mining & Quarrying, Forestry, Food manufacture, Lumber & Woodcraft, Manufacturing industry and Other business), the number of those who had at least one accident was 6,098, of which 5,837 were injured only once, 208 twice, 21 three times and 2 four times. When occupation type was fixed, however, the numbers of workers having one, two, three, and four accidents were 5,895, 182, 19, and 2, respectively. This suggests that some workers are likely to have experienced repeated accidents in more than one type of occupation. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:2131982
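
    A small worked example using the counts quoted above for the fixed-occupation case (5,895 workers injured once, 182 twice, 19 three times, and 2 four times) shows how the share of accidents attributable to repeatedly injured workers can be derived; it reproduces the quoted totals of 6,098 injured workers and 6,324 accident cases.

    ```python
    # Worked arithmetic on the frequency distribution quoted in the abstract
    # (fixed-occupation counts of workers by number of accidents).
    counts = {1: 5895, 2: 182, 3: 19, 4: 2}

    workers = sum(counts.values())                         # 6,098 injured workers
    accidents = sum(k * n for k, n in counts.items())      # 6,324 accident cases
    repeaters = sum(n for k, n in counts.items() if k >= 2)
    repeat_accidents = sum(k * n for k, n in counts.items() if k >= 2)

    print(f"{workers} injured workers, {accidents} accidents")
    print(f"repeatedly injured: {repeaters} workers "
          f"({100 * repeaters / workers:.1f}% of the injured, accounting for "
          f"{100 * repeat_accidents / accidents:.1f}% of all accidents)")
    ```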

  12. Accident progression event tree analysis for postulated severe accidents at N Reactor

    SciTech Connect

    Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M.; Medford, G.T.

    1990-06-01

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
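
    A brief sketch of Latin Hypercube sampling of uncertain accident-progression parameters, in the spirit of the uncertainty treatment described above; the parameter names and ranges are hypothetical placeholders, and SciPy's quasi-Monte Carlo interface stands in for the original NUREG-1150-era sampling tools.

    ```python
    # Latin Hypercube sample of three (hypothetical) uncertain inputs to an
    # accident progression model, scaled from the unit cube to physical ranges.
    from scipy.stats import qmc

    params = {                                  # lower/upper bounds per parameter
        "hydrogen_burn_pressure_MPa": (0.3, 0.8),
        "confinement_leak_area_m2":   (1e-4, 1e-2),
        "cesium_release_fraction":    (0.01, 0.4),
    }
    lower, upper = zip(*params.values())

    sampler = qmc.LatinHypercube(d=len(params), seed=42)
    unit_sample = sampler.random(n=200)                 # 200 stratified observations
    scaled = qmc.scale(unit_sample, lower, upper)       # map to physical ranges

    for row in scaled[:3]:                              # show the first few samples
        print({name: round(val, 4) for name, val in zip(params, row)})
    ```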

  13. TMI-2 accident: core heat-up analysis

    SciTech Connect

    Ardron, K.H.; Cain, D.G.

    1981-01-01

    This report summarizes the NSAC study of reactor core thermal conditions during the accident at Three Mile Island, Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high-pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events, plant data, post-accident measurements, and interpretation or indirect use of instrument responses to accident conditions.

  14. Development of Database for Accident Analysis in Indian Mines

    NASA Astrophysics Data System (ADS)

    Tripathy, Debi Prasad; Guru Raghavendra Reddy, K.

    2015-08-01

    Mining is a hazardous industry, and the high accident rates associated with underground mining are a cause of deep concern. Technological developments notwithstanding, the rates of fatal accidents and reportable incidents have not shown corresponding levels of decline. This paper argues that adoption of appropriate safety standards by both mine management and the government may result in an appreciable reduction in accident frequency. This can be achieved by using technology to improve working conditions and by sensitising workers and managers to the causes and prevention of accidents. Inputs required for a detailed analysis of an accident include information on location, time, type, cost of accident, victim, nature of injury, personal and environmental factors, etc. Such information can be generated from data available in the standard coded accident report form. This paper presents a web-based application for accident analysis in Indian mines during 2001-2013. An accident database prototype (SafeStat), developed by the authors as an intranet application based on the TCP/IP protocol, is also discussed.

  15. GASFLOW analysis of a tritium leak accident

    SciTech Connect

    Farman, R.F.; Fujita, R.K.; Travis, J.R.

    1994-09-01

    The consequences of an earthquake-induced fire involving a tritium leak were analyzed using the GASFLOW computer code. Modeling features required by the analysis include ventilation boundary conditions, flow of a gas mixture in an enclosure containing obstacles, thermally induced buoyancy, and combustion phenomena.

  16. NASA's Accident Precursor Analysis Process and the International Space Station

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Lutomski, Michael

    2010-01-01

    This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

  17. IAEA Activities in the Area of Safety Analysis and Accident Management

    SciTech Connect

    Lee, S.; El-Shanawany, M.

    2006-07-01

    Safety analysis is a means of demonstrating how critical safety functions, the integrity of barriers against the release of radioactive materials, and various other safety requirements are fulfilled for a broad range of operating conditions and initiating events. Accordingly, performing safety analysis for a nuclear power plant is one of the most important safety principles. Thermal-hydraulic computer codes are extensively used worldwide for safety analysis by utilities, regulatory authorities, power plant designers and vendors, nuclear fuel companies, research organizations, and technical support organizations. Safety analysis methodology and computer codes have seen a significant development over the last two decades. This fact is also reflected in the work of the International Atomic Energy Agency (IAEA) that aims at increasing the quality and international harmonization of the approaches used in safety analysis. The paper provides an overview of activities and of examples of results obtained recently or planned in the near future in the IAEA's Division of Nuclear Installation Safety in the field of safety analysis for both design basis accidents and beyond design basis accidents as well as accident management. In this paper, specific technical guidance on the safety assessments in the IAEA Safety Standards such as safety analysis methodologies, probabilistic safety assessment, and development of accident management programmes are described. Future trends and related activities in safety analysis and accident management are also introduced. (authors)

  18. Thermohydraulic and Safety Analysis for CARR Under Station Blackout Accident

    SciTech Connect

    Wenxi Tian; Suizheng Qiu; Guanghui Su; Dounan Jia; Xingmin Liu - China Institute of Atomic Energy

    2006-07-01

    A thermohydraulic and safety analysis code (TSACC) has been developed in Fortran 90 to evaluate the transient thermohydraulic behaviors and safety characteristics of the China Advanced Research Reactor (CARR) under a Station Blackout Accident (SBA). For the development of TSACC, a series of corresponding mathematical and physical models were considered. A point reactor neutron kinetics model was adopted for solving the reactor power. All possible flow and heat transfer conditions under a station blackout accident were considered, and the corresponding optional models were supplied. The usual Finite Difference Method (FDM) was abandoned, and a new model was adopted to evaluate the temperature field of the plate-type fuel elements in the core. A new simple and convenient equation was proposed for the resolution of the transient behaviors of the main pump instead of the complicated four-quadrant model. The Gear method and the Adams method were adopted alternately for a better solution to the stiff differential equations describing the dynamic behaviors of the CARR. The computational results of TSACC showed that the CARR has sufficient safety margin under an SBA. For verification and validation (V&V), the simulated results of TSACC were compared with those of RELAP5/MOD3. The V&V results indicated good agreement between the two codes. Because of the adoption of modular programming techniques, this analysis code is expected to be applicable to other reactors by simply modifying the corresponding function modules. (authors)
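
    A minimal sketch of a point reactor kinetics model of the kind mentioned above, integrated with a stiff BDF ("Gear-type") solver; the six-group kinetics parameters and the reactivity history are generic illustrative values, not CARR data, and the model is far simpler than TSACC.

    ```python
    # Six-delayed-group point kinetics integrated with a stiff (BDF) solver.
    # All parameters below are generic illustrative values.
    import numpy as np
    from scipy.integrate import solve_ivp

    beta_i = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])  # delayed fractions
    lam_i = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])         # decay constants (1/s)
    beta, LAMBDA = beta_i.sum(), 5.0e-5                                  # prompt generation time (s)

    def reactivity(t):
        return -0.002 if t > 1.0 else 0.0        # scram-like negative reactivity step at t = 1 s

    def rhs(t, y):
        n, c = y[0], y[1:]
        dn = ((reactivity(t) - beta) / LAMBDA) * n + np.dot(lam_i, c)
        dc = beta_i / LAMBDA * n - lam_i * c
        return np.concatenate(([dn], dc))

    y0 = np.concatenate(([1.0], beta_i / (lam_i * LAMBDA)))   # steady-state initial condition
    sol = solve_ivp(rhs, (0.0, 60.0), y0, method="BDF", rtol=1e-8, atol=1e-10)
    print(f"relative power at t = 60 s: {sol.y[0, -1]:.3e}")
    ```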

  19. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach was applied to the investigation records of pilot error-related U.S. air carrier jet aircraft accidents, successfully reclaiming hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable; cluster analysis, which was an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  20. An analysis of pileup accidents in highway systems

    NASA Astrophysics Data System (ADS)

    Chang, Jau-Yang; Lai, Wun-Cing

    2016-02-01

    A pileup accident is a multi-vehicle collision that occurs in a traffic lane and is produced by successive following vehicles. It is a distinctive type of highway collision. The probability of occurrence of a pileup accident is lower than that of other accidents in highway systems; however, the injuries and damage from pileup accidents are often serious. In this paper, we analyze the occurrence of pileup accidents by considering three types of dangerous collisions in highway systems: rear-end collisions, lane-changing collisions, and double lane-changing collisions. We simulate four road driving strategies to investigate the relationships between the different vehicle collisions and pileup accidents. The simulation and analysis show that double lane-changing collisions increase the occurrence of pileup accidents. Additionally, we find that the probability of occurrence of pileup accidents can be reduced when vehicle speeds are suitably constrained in highway systems.

  1. Analysis of Credible Accidents for Argonaut Reactors

    SciTech Connect

    Hawley, S. C.; Kathern, R. L.; Robkin, M. A.

    1981-04-01

    Five areas of potential accidents have been evaluated for the Argonaut-UTR reactors: insertion of excess reactivity, catastrophic rearrangement of the core, explosive chemical reaction, graphite fire, and fuel-handling accident. A nuclear excursion resulting from the rapid insertion of the maximum available excess reactivity would produce only 12 MWs, which is insufficient to cause fuel melting even with conservative assumptions. Although precise structural rearrangement of the core would create a potential hazard, it is simply not credible to assume that such a rearrangement would result from the forces of an earthquake or other catastrophic event. Even damage to the fuel from falling debris or other objects is unlikely given the normal reactor structure. An explosion from a metal-water reaction could not occur because there is no credible source of sufficient energy to initiate the reaction. A graphite fire could conceivably create some damage to the reactor but not enough to melt any fuel or initiate a metal-water reaction. The only credible accident involving offsite doses was determined to be a fuel-handling accident which, given highly conservative assumptions, would produce a whole-body dose equivalent of 2 rem from noble gas immersion and a lifetime dose equivalent commitment to the thyroid of 43 rem from radioiodines.

  2. Systemic accident analysis: examining the gap between research and practice.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2013-06-01

    The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. PMID:23542136

  3. OFFSITE RADIOLOGICAL CONSEQUENCE ANALYSIS FOR THE BOUNDING FLAMMABLE GAS ACCIDENT

    SciTech Connect

    KRIPPS, L.J.

    2005-02-18

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank (SST). The calculation applies reasonably conservative input parameters in accordance with guidance in DOE-STD-3009, Appendix A. The purpose of this analysis is to calculate the offsite radiological consequence of the bounding flammable gas accident. DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses," requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine if they challenge the DOE-STD-3009-94, Appendix A, "Evaluation Guideline," of 25 rem total effective dose equivalent in order to identify and evaluate safety-class structures, systems, and components. A detonation rather than a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. A detonation in an SST rather than in a double-shell tank (DST) was selected as the bounding accident because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.
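
    A generic sketch of the kind of offsite inhalation-dose estimate that is compared against the 25 rem Evaluation Guideline: respirable release multiplied by an atmospheric dispersion factor (chi/Q), a breathing rate, and a dose factor per unit mass inhaled. All numerical values below are hypothetical placeholders, not values from the Hanford analysis.

    ```python
    # Simple offsite inhalation dose estimate; every number is a placeholder.
    respirable_release_g = 120.0      # respirable source term from the detonation
    chi_over_q = 2.0e-5               # s/m^3, site-boundary atmospheric dispersion
    breathing_rate = 3.3e-4           # m^3/s, receptor breathing rate
    dose_per_gram_inhaled = 4.0e3     # rem/g, waste-specific unit dose factor

    inhaled_g = respirable_release_g * chi_over_q * breathing_rate
    dose_rem = inhaled_g * dose_per_gram_inhaled
    verdict = "challenges" if dose_rem > 25.0 else "is below"
    print(f"offsite dose = {dose_rem:.2e} rem ({verdict} the 25 rem guideline)")
    ```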

  4. Advanced accident sequence precursor analysis level 1 models

    SciTech Connect

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O.

    1996-03-01

    INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant-accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. The Revision 2 models incorporate more detailed plant information concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.

  5. Corporate cost of occupational accidents: an activity-based analysis.

    PubMed

    Rikhardsson, Pall M; Impgaard, Martin

    2004-03-01

    The systematic accident cost analysis (SACA) project was carried out during 2001 by The Aarhus School of Business and PricewaterhouseCoopers Denmark with financial support from The Danish National Working Environment Authority. It focused on developing and testing a method for evaluating the costs of occupational accidents to companies, for use by occupational health and safety professionals. The method was tested in nine Danish companies within three different industry sectors, and the costs of 27 selected occupational accidents in these companies were calculated. One of the main conclusions is that the SACA method could be used in all of the companies without revisions. The evaluation of accident costs showed that 2/3 of the costs of occupational accidents are visible in the Danish corporate accounting systems reviewed while 1/3 is hidden from management view. The highest cost of occupational accidents, for a company with 3,600 employees, was estimated at approximately US$682,000. The paper includes an introduction regarding accident cost analysis in companies, a presentation of the SACA project methodology and the SACA method itself, a short overview of some of the results of the SACA project and a conclusion. Further information about the project is available at http://www.asb.dk/saca. PMID:14642872

  6. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences on the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 and the undermining of 29 pipelines by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA record. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and
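
    The data-mining step described above is essentially a keyword screen over incident narratives followed by manual review. The sketch below illustrates that kind of screen with pandas; the column names, keyword lists, and toy records are assumptions for illustration, not the PHMSA schema or the study's actual classifier.

      # Hedged sketch: keyword screening of pipeline incident narratives to flag
      # candidate natural-hazard (Natech) events for later manual review.
      import pandas as pd

      NATURAL_HAZARD_KEYWORDS = {
          "earthquake": ["earthquake", "seismic"],
          "flood": ["flood", "floodwater", "heavy rain", "washout"],
          "landslide": ["landslide", "mudslide", "slope failure"],
          "lightning": ["lightning"],
      }

      def flag_natech(df, text_col="narrative"):
          """Add one boolean column per hazard class plus an overall candidate flag."""
          text = df[text_col].fillna("").str.lower()
          for hazard, words in NATURAL_HAZARD_KEYWORDS.items():
              df[f"hazard_{hazard}"] = text.str.contains("|".join(words), regex=True)
          hazard_cols = [f"hazard_{h}" for h in NATURAL_HAZARD_KEYWORDS]
          df["natech_candidate"] = df[hazard_cols].any(axis=1)
          return df

      incidents = pd.DataFrame({"narrative": [
          "Pipeline ruptured after river flooding undermined the crossing",
          "Corrosion leak found during routine inspection"]})
      print(flag_natech(incidents)[["narrative", "natech_candidate"]])

    Records flagged this way would still go to the peer-review step the authors describe; the automation only narrows the set a reviewer has to read.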

  7. Accident Sequence Evaluation Program: Human reliability analysis procedure

    SciTech Connect

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  8. Shipping container response to severe highway and railway accident conditions: Appendices

    SciTech Connect

    Fischer, L.E.; Chou, C.K.; Gerhard, M.A.; Kimura, C.Y.; Martin, R.W.; Mensing, R.W.; Mount, M.E.; Witte, M.C.

    1987-02-01

    Volume 2 contains the following appendices: Severe accident data; truck accident data; railroad accident data; highway survey data and bridge column properties; structural analysis; thermal analysis; probability estimation techniques; and benchmarking for computer codes used in impact analysis. (LN)

  9. Probabilistic methods for accident-progression analysis

    SciTech Connect

    Jamali, K. M.

    1981-01-01

    Probabilistic methods that can be used as a basis for deterministic calculations of transients or accidents in nuclear power plants are described. They include obtaining initiator-dependent sequences at the component level and related analyses, propagation of primary event uncertainties in the ranking of sequences, and detailed treatment of dependent failures. The results are shown for protected transients in the short-term forced circulation phase of decay heat removal in the Clinch River Breeder Reactor. Higher unavailability values are obtained than in previous works as a result of more detailed common-cause/common-mode failure modeling. The unavailability of decay heat removal by forced circulation for the loss of off-site power and loss of main feedwater system initiators is estimated at 4 × 10⁻³/yr and 9 × 10⁻³/yr, respectively. 15 refs., 1 fig., 2 tabs.
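
    To make the common-cause point concrete, the sketch below shows how a simple beta-factor model raises the estimated unavailability of a two-train decay-heat-removal function relative to a purely independent-failure treatment. The train unavailability, initiator frequency, and beta values are illustrative placeholders, not the Clinch River numbers from the abstract.

      # Minimal sketch: system unavailability of n redundant trains with a
      # beta-factor common-cause contribution. All numbers are illustrative.
      def system_unavailability(q, n_trains, beta):
          """A fraction 'beta' of each train's unavailability q is assumed to be a
          common-cause contribution that fails all trains at once."""
          independent = ((1.0 - beta) * q) ** n_trains
          common_cause = beta * q
          return independent + common_cause

      q_train = 1.0e-2      # unavailability of one train (assumed)
      initiator_freq = 0.1  # initiating-event frequency per year (assumed)

      for beta in (0.0, 0.05, 0.1):
          unavail = system_unavailability(q_train, n_trains=2, beta=beta)
          print(f"beta={beta:4.2f}  unavailability={unavail:.2e}  "
                f"sequence frequency={initiator_freq * unavail:.2e} /yr")

    Even a modest beta drives the result well above the independent-failure value, which is the qualitative effect the abstract attributes to its more detailed common-cause modeling.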

  10. Accident analysis of the windowless target system

    SciTech Connect

    Bianchi, F.; Ferri, R.

    2006-07-01

    Transmutation systems are able to reduce the radio-toxicity and amount of High-Level Wastes (HLW), which are the main concerns related to the peaceful use of nuclear energy, and therefore they should make nuclear energy more easily acceptable to the public. A transmutation system consists of a sub-critical fast reactor, an accelerator and a Target System, where the spallation reactions needed to sustain the chain reaction take place. Three options were proposed for the Target System within the European project PDS-XADS (Preliminary Design Studies on an Experimental Accelerator Driven System): window, windowless and solid. This paper describes the constraints taken into account in the design of the windowless Target System for the large Lead-Bismuth-Eutectic cooled XADS and deals with the results of the calculations performed to assess the behaviour of the target during some accident sequences related to pump trips. (authors)

  11. MELCOR accident analysis for ARIES-ACT

    SciTech Connect

    Paul W. Humrickhouse; Brad J. Merrill

    2012-08-01

    We model a loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with LiPb as coolant and breeder, a helium cooled steel structural ring and tungsten divertors, a thin-walled, helium cooled vacuum vessel, and a room temperature water-cooled shield outside the vacuum vessel. The water heat transfer system is designed to remove heat by natural circulation during a LOFA. The MELCOR model uses time-dependent decay heats for each component determined by 1-D modeling. The MELCOR model shows that, despite periodic boiling of the water coolant, structures are kept adequately cool by the passive safety system.

  12. FSAR fire accident analysis for a plutonium facility

    SciTech Connect

    Lam, K.

    1997-06-01

    The Final Safety Analysis Report (FSAR) for a plutonium facility as required by DOE Orders 5480.23 and 5480.22 has recently been completed and approved. The facility processes and stores radionuclides such as Pu-238, Pu-239, enriched uranium, and to a lesser degree other actinides. This facility produces heat sources. DOE Order 5480.23 and DOE-STD-3009-94 require analysis of different types of accidents (operational accidents such as fires, explosions, spills, criticality events, and natural phenomena such as earthquakes). The accidents that were analyzed quantitatively, or the Evaluation Basis Accidents (EBAs), were selected based on a multi-step screening process that makes extensive use of the Hazards Analysis (HA) performed for the facility. In the HA, specific accident scenarios, with estimated frequency and consequences, were developed for each identified hazard associated with facility operations and activities. Analysis of the EBAs and comparison of their consequences to the evaluation guidelines established the safety envelope for the facility and identified the safety-class structures, systems, and components. This paper discusses the analysis of the fire EBA. This fire accident was analyzed in relatively great detail in the FSAR because its potential off-site consequences are more severe than those of other events. In the following, a description of the scenario is first given, followed by a brief summary of the methodology for calculating the source term. Finally, the author discusses how a key parameter affecting the source term, the leak path factor, was determined; this determination is the focus of the paper.

  13. Human factors review for Severe Accident Sequence Analysis (SASA)

    SciTech Connect

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper will discuss work being conducted during this human factors review including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate contributions to SASA analyses from human factors data and methods. A descriptive model was developed called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and which is useful for identifying needs and deficiencies in the area of accident management. The assessment of human factors issues related to ATWS required extensive coordination with SASA analysts. The analysis focused primarily on six operator actions identified in the Emergency Procedure Guidelines (EPGs) as the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitate a knowledge-based emergency response by the operators. The FOAM model provides a functionally-oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid or minimize core degradation. Operators must also respond to potential radiological release beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure.

  14. Code System for Toxic Gas Accident Analysis.

    2001-09-24

    Version 00 TOXRISK is an interactive program developed to aid in the evaluation of nuclear power plant control room habitability in the event of a nearby toxic material release. The program uses a model which is consistent with the approach described in the NRC Regulatory Guide 1.78. Release of the gas is treated as an initial puff followed by a continuous plume. The relative proportions of these as well as the plume release rate are supplied by the user. Transport of the gas is modeled as a Gaussian distribution and occurs through the action of a constant velocity, constant direction wind. Dispersion or diffusion of the gas during transport is described by modified Pasquill-Gifford dispersion coefficients. Great flexibility is afforded the user in specifying the release description, meteorological conditions, relative geometry of the accident and plant, and the plant ventilation system characteristics. Two types of simulation can be performed: multiple case (parametric) studies and probabilistic analyses.
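
    The transport model described above is the familiar Gaussian plume. As a rough sketch of that calculation (not the TOXRISK source), the function below evaluates the ground-reflected concentration at a receptor; the dispersion coefficients are passed in directly, whereas TOXRISK derives them from modified Pasquill-Gifford correlations not reproduced here, and the example numbers are arbitrary.

      # Hedged sketch: Gaussian plume concentration with ground reflection.
      import math

      def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
          """Concentration at a receptor: Q release rate (kg/s), u wind speed (m/s),
          y crosswind offset (m), z receptor height (m), H effective release height (m),
          sigma_y/sigma_z dispersion coefficients (m) at the receptor's downwind distance."""
          crosswind = math.exp(-y**2 / (2.0 * sigma_y**2))
          vertical = (math.exp(-(z - H)**2 / (2.0 * sigma_z**2)) +
                      math.exp(-(z + H)**2 / (2.0 * sigma_z**2)))  # ground reflection
          return Q / (2.0 * math.pi * u * sigma_y * sigma_z) * crosswind * vertical

      # Example: 1 kg/s ground-level release, 2 m/s wind, receptor on the centerline
      # at 1.5 m height, with assumed sigma_y = 36 m and sigma_z = 18 m.
      print(plume_concentration(Q=1.0, u=2.0, y=0.0, z=1.5, H=0.0,
                                sigma_y=36.0, sigma_z=18.0))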

  15. Criticality accident dosimetry by chromosomal analysis.

    PubMed

    Voisin, P; Roy, L; Hone, P A; Edwards, A A; Lloyd, D C; Stephan, G; Romm, H; Groer, P G; Brame, R

    2004-01-01

    The technique of measuring the frequency of dicentric chromosomal aberrations in blood lymphocytes was used to estimate doses in a simulated criticality accident. The simulation consisted of three exposures: approximately 5 Gy with a bare source and 1 and 2 Gy with a lead-shielded source. Three laboratories made separate estimates of the doses. These were made by the iterative method of apportioning the observed dicentric frequencies between the gamma and neutron components, taking account of a given gamma/neutron dose ratio, and referring the separated dicentric frequencies to dose-response calibration curves. An alternative method, based on Bayesian ideas, was also employed; it was developed for interpreting dicentric frequencies in situations where the gamma/neutron ratio is uncertain. Both methods gave very similar results. One laboratory produced dose estimates close to the eventual exercise reference doses and the other laboratories estimated slightly higher values. The main reason for the higher values was the calibration relationships for fission neutrons. PMID:15353688
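
    The iterative apportioning method mentioned above can be sketched as follows. The calibration coefficients, observed yield, and dose ratio below are assumed placeholders, not the laboratories' curves: the neutron dose-response is taken as linear and the gamma response as linear-quadratic, and the loop alternates between the two components until the split stabilizes.

      # Illustrative sketch of iterative gamma/neutron dose apportioning from a
      # dicentric yield. Coefficients are assumed values for demonstration only.
      ALPHA_N = 0.83   # dicentrics per cell per Gy, fission neutrons (assumed, linear)
      ALPHA_G = 0.015  # gamma linear term (assumed)
      BETA_G = 0.05    # gamma quadratic term (assumed)

      def apportion(y_obs, gamma_to_neutron_ratio, iterations=50):
          """Start by attributing the whole yield to neutrons, infer the gamma dose
          from the given dose ratio, subtract its contribution, and repeat."""
          d_n = y_obs / ALPHA_N
          for _ in range(iterations):
              d_g = gamma_to_neutron_ratio * d_n
              y_gamma = ALPHA_G * d_g + BETA_G * d_g**2
              d_n = max(y_obs - y_gamma, 0.0) / ALPHA_N
          return d_n, gamma_to_neutron_ratio * d_n

      d_neutron, d_gamma = apportion(y_obs=1.2, gamma_to_neutron_ratio=1.0)
      print(f"neutron dose ~ {d_neutron:.2f} Gy, gamma dose ~ {d_gamma:.2f} Gy")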

  16. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.

  17. Severe Accident Analysis Code SAMPSON Improvement for IMPACT Project

    NASA Astrophysics Data System (ADS)

    Ujita, Hiroshi; Ikeda, Takashi; Naitoh, Masanori

    SAMPSON is a modular integral code for detailed severe accident analysis, developed in the IMPACT project. Each module can run independently, and communication among multiple analysis modules, supervised by the analysis control module, makes an integral analysis possible. At the end of Phase 1 (1994-1997), demonstration simulations with combinations of up to 11 analysis modules had been performed, and the physical models in the code had been verified by separate-effect tests and validated by integral tests. Multi-dimensional mechanistic models and theory-based conservation equations were applied during Phase 2 (1998-2000). New models for Accident Management evaluation have also been developed. Verification and validation have been performed by analysing separate-effect tests and integral tests, while actual plant analyses are also in progress.

  18. Computational engine structural analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Johns, R. H.

    1986-01-01

    A significant research activity at the NASA Lewis Research Center is the computational simulation of complex multidisciplinary engine structural problems. This simulation is performed using computational engine structural analysis (CESA) which consists of integrated multidisciplinary computer codes in conjunction with computer post-processing for problem-specific application. A variety of the computational simulations of specific cases are described in some detail in this paper. These case studies include: (1) aeroelastic behavior of bladed rotors, (2) high velocity impact of fan blades, (3) blade-loss transient response, (4) rotor/stator/squeeze-film/bearing interaction, (5) blade-fragment/rotor-burst containment, and (6) structural behavior of advanced swept turboprops. These representative case studies are selected to demonstrate the breadth of the problems analyzed and the role of the computer, including post-processing and graphical display of voluminous output data.

  19. INDUSTRIAL/MILITARY ACTIVITY-INITIATED ACCIDENT SCREENING ANALYSIS

    SciTech Connect

    D.A. Kalinich

    1999-09-27

    Impacts due to nearby installations and operations were determined in the Preliminary MGDS Hazards Analysis (CRWMS M&O 1996) to be potentially applicable to the proposed repository at Yucca Mountain. This determination was conservatively based on limited knowledge of the potential activities ongoing on or off the Nevada Test Site (NTS). It is intended that the Industrial/Military Activity-Initiated Accident Screening Analysis provided herein will meet the requirements of the ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987) in establishing whether this external event can be screened from further consideration or must be included as a design basis event (DBE) in the development of accident scenarios for the Monitored Geologic Repository (MGR). This analysis only considers issues related to preclosure radiological safety. Issues important to waste isolation as related to impact from nearby installations will be covered in the MGR performance assessment.

  20. Cold Vacuum Drying facility design basis accident analysis documentation

    SciTech Connect

    CROWE, R.D.

    2000-08-08

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

  1. Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident

    NASA Technical Reports Server (NTRS)

    Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

    1994-01-01

    Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined the following: the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep; moreover, the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as in the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; also, the captain testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data support the hypothesis that fatigue was a factor that affected crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that the fatigue had an impact on specific actions involved in the occurrence of the accident.

  2. BNL severe-accident sequence experiments and analysis program. [PWR; BWR

    SciTech Connect

    Greene, G.A.; Ginsberg, T.; Tutu, N.K.

    1983-01-01

    In the analysis of degraded core accidents, the two major sources of pressure loading on light water reactor containments are: steam generation from core debris-water thermal interactions; and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described.

  3. Computation of the thermohydraulics in subassemblies for accident situations

    NASA Astrophysics Data System (ADS)

    Basque, G.

    Single-phase and two-phase flow models of sodium boiling in an LMFBR are described. Results for single-phase calculations with mixed and natural convection, and for voiding of a subassembly, are presented. A 2-D two-phase version of the computer code BACCHUS, based on a homogeneous flow model with disequilibrium, was tested. It yields results which are computationally stable and physically consistent. The validity of the porous body model for liquid flow is established by calculating representative single flows.

  4. [Retrospective Cytogenetic Dose Evaluation. II. Computer Data Processing in Persons Irradiated in Different Radiation Accidents].

    PubMed

    Nugis, V Yu; Khvostunov, I K; Goloub, E V; Kozlova, M G; Nadejinal, N M; Galstian, I A

    2015-01-01

    A method for retrospective dose assessment, based on analysis of the distribution of cells by the number of dicentrics and unstable aberrations using a special computer program, was developed earlier from data on persons irradiated as a result of the accident at the Chernobyl nuclear power plant. The method was applied for the same purpose to data from repeated cytogenetic studies of patients exposed to γ-, γ-β- or γ-neutron radiation in various situations. As a whole, this group was followed up over more distant periods (17-50 years) after exposure than the Chernobyl patients (up to 25 years). Use of the multiple regression equations obtained for the Chernobyl cohort for retrospective dose assessment showed that the equation including the computer-recovered dose estimate and the time elapsed after irradiation was generally unsatisfactory (r = 0.069 at p = 0.599). Similar equations using the recovered dose estimate and the frequency of abnormal chromosomes in a distant period, or all three parameters as variables, gave better results (r = 0.686 at p = 0.000000001 and r = 0.542 at p = 0.000008, respectively). PMID:26863777

  5. Nuclear fuel cycle facility accident analysis handbook

    SciTech Connect

    1998-03-01

    The purpose of this Handbook is to provide guidance on how to calculate the characteristics of releases of radioactive materials and/or hazardous chemicals from nonreactor nuclear facilities. In addition, the Handbook provides guidance on how to calculate the consequences of those releases. There are four major chapters: Hazard Evaluation and Scenario Development; Source Term Determination; Transport Within Containment/Confinement; and Atmospheric Dispersion and Consequences Modeling. These chapters are supported by Appendices, including: a summary of chemical and nuclear information that contains descriptions of various fuel cycle facilities; details on how to calculate the characteristics of source terms for releases of hazardous chemicals; a comparison of NRC, EPA, and OSHA programs that address chemical safety; a summary of the performance of HEPA and other filters; and a discussion of uncertainties. Several sample problems are presented: a free-fall spill of powder, an explosion with radioactive release; a fire with radioactive release; filter failure; hydrogen fluoride release from a tankcar; a uranium hexafluoride cylinder rupture; a liquid spill in a vitrification plant; and a criticality incident. Finally, this Handbook includes a computer model, LPF No.1B, that is intended for use in calculating Leak Path Factors. A list of contributors to the Handbook is presented in Chapter 6. 39 figs., 35 tabs.

  6. a Study of the Reconstruction of Accidents and Crime Scenes Through Computational Experiments

    NASA Astrophysics Data System (ADS)

    Park, S. J.; Chae, S. W.; Kim, S. H.; Yang, K. M.; Chung, H. S.

    Recently, with an increase in the number of studies of the safety of both pedestrians and passengers, computer software such as MADYMO, Pam-crash, and LS-dyna has been providing human models for computer simulation. Although such programs have been applied to make machines beneficial for humans, studies that analyze the reconstruction of accidents or crime scenes are rare. Therefore, through computational experiments, the present study presents reconstructions of two questionable accidents. In the first case, a car fell off the road and the driver was separated from it. The accident investigator was very confused because some circumstantial evidence suggested the possibility that the driver was murdered. In the second case, a woman died in her house and the police suspected foul play, with her boyfriend as a suspect. These two cases were reconstructed using the human model in MADYMO software. The first case was eventually confirmed as a traffic accident in which the driver was thrown out of the car when it fell off the road, and the second case was proved to be suicide rather than homicide.

  7. Accident analysis of heavy water cooled thorium breeder reactor

    NASA Astrophysics Data System (ADS)

    Yulianti, Yanti; Su'ud, Zaki; Takaki, Naoyuki

    2015-04-01

    power reactor has a peak value before the reactor reaches a new equilibrium condition. The analysis showed that the fuel and cladding temperatures during the accident remain below their limits, which corresponds to a safe condition.

  8. Accident analysis of heavy water cooled thorium breeder reactor

    SciTech Connect

    Yulianti, Yanti; Su’ud, Zaki; Takaki, Naoyuki

    2015-04-16

    power reactor has a peak value before the reactor reaches a new equilibrium condition. The analysis showed that the fuel and cladding temperatures during the accident remain below their limits, which corresponds to a safe condition.

  9. Guide for licensing evaluations using CRAC2: A computer program for calculating reactor accident consequences

    SciTech Connect

    White, J.E.; Roussin, R.W.; Gilpin, H.

    1988-12-01

    A version of the CRAC2 computer code applicable for use in analyses of consequences and risks of reactor accidents in case work for environmental statements has been implemented for use on the Nuclear Regulatory Commission Data General MV/8000 computer system. Input preparation is facilitated through the use of an interactive computer program which operates on an IBM personal computer. The resulting CRAC2 input deck is transmitted to the MV/8000 by using an error-free file transfer mechanism. To facilitate the use of CRAC2 at NRC, relevant background material on input requirements and model descriptions has been extracted from four reports: ''Calculations of Reactor Accident Consequences,'' Version 2, NUREG/CR-2326 (SAND81-1994); ''CRAC2 Model Descriptions,'' NUREG/CR-2552 (SAND82-0342); ''CRAC Calculations for Accident Sections of Environmental Statements,'' NUREG/CR-2901 (SAND82-1693); and ''Sensitivity and Uncertainty Studies of the CRAC2 Computer Code,'' NUREG/CR-4038 (ORNL-6114). When this background information is combined with instructions on the input processor, this report provides a self-contained guide for preparing CRAC2 input data with a specific orientation toward applications on the MV/8000. 8 refs., 11 figs., 10 tabs.

  10. Analysis of PWR RCS Injection Strategy During Severe Accident

    SciTech Connect

    Wang, S.-J.; Chiang, K.-S.; Chiang, S.-C.

    2004-05-15

    Reactor coolant system (RCS) injection is an important strategy for severe accident management of a pressurized water reactor (PWR) system. Maanshan is a typical Westinghouse PWR nuclear power plant (NPP) with a large, dry containment. The severe accident management guideline (SAMG) of Maanshan NPP is developed based on the Westinghouse Owners Group (WOG) SAMG. The purpose of this work is to analyze the RCS injection strategy of a PWR system in an overheated core condition. Power is assumed to be recovered as the vessel water level drops to the bottom of the active fuel. The Modular Accident Analysis Program version 4.0.4 (MAAP4) code is chosen as the tool for analysis. A postulated station blackout sequence for Maanshan NPP is cited as a reference case for this analysis. With the mitigation action of immediate injection after power recovery, as specified in the WOG SAMG, hot leg creep rupture occurs, which is not desired. This phenomenon was not considered while developing the WOG SAMG. Two other RCS injection methods are analyzed by using MAAP4, and the RCS injection strategy is modified accordingly in the Maanshan SAMG. These results can be applied to typical PWR NPPs.

  11. Calculation Notes for Subsurface Leak Resulting in Pool, TWRS FSAR Accident Analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Subsurface Leaks Resulting in Pool.

  12. Calculation notes for surface leak resulting in pool, TWRS FSAR accident analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Surface Leaks Resulting in Pool.

  13. NASA Accident Precursor Analysis Handbook, Version 1.0

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

    Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur." At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators." These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take steps that are necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate

  14. Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs

    NASA Technical Reports Server (NTRS)

    Tuomela, C. H.; Brennan, M. F.

    1980-01-01

    A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

  15. Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology

    NASA Astrophysics Data System (ADS)

    Bhalla, P.; Tripathi, S.; Palria, S.

    2014-12-01

    With advances in technology, new and sophisticated vehicle models are available and their numbers are increasing day by day. A traffic accident has multi-faceted characteristics associated with it. In India, 93% of crashes occur due to human-induced factors (wholly or partly). GIS technology has become an indispensable tool for proper traffic accident analysis. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote the location, type and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, the headquarters of Ajmer district, Rajasthan, has been selected as the study area. According to police records, 1,531 accidents occurred during 2009-2013. The maximum number of accidents occurred in 2009 and the maximum number of deaths in 2013. Cars, jeeps, autos, pickups and tempos are mostly responsible for accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. While road safety is a critical issue, it is handled in an ad hoc manner. This study is a demonstration of the application of GIS for developing an efficient database on road accidents, taking Ajmer City as a case study. If this type of database is developed for other cities, a proper analysis of accidents can be undertaken and suitable management strategies for traffic regulation can be successfully proposed.

  16. Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis

    SciTech Connect

    Gilles Youinou; R. Sonat Sen

    2013-09-01

    The severe accident at Fukushima Daiichi nuclear plants illustrates the need for continuous improvements through developing and implementing technologies that contribute to safe, reliable and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy – UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving the fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis related to most of these concepts. The potential impacts of these innovative LWR fuels on the front-end of the fuel cycle, on the reactor operation and on the back-end of the fuel cycle are succinctly described without having the pretension of being exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and could be updated as the designs converge on their respective final version.

  17. Comprehensive Analysis of Two Downburst-Related Aircraft Accidents

    NASA Technical Reports Server (NTRS)

    Shen, J.; Parks, E. K.; Bach, R. E.

    1996-01-01

    Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components, F1 and F2, representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F1 was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy from both an analytical and pilot's standpoint was to hold constant nose-up pitch attitude while operating at maximum engine thrust.
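
    For readers unfamiliar with the F-factor, the decomposition used above can be written as F = F1 + F2, with F1 proportional to the rate of change of the horizontal (tailwind) component along the flight path and F2 to the downdraft divided by airspeed. The sketch below uses that standard form; the sign convention and sample numbers are assumptions for illustration, not values from the DFW or Charlotte analyses.

      # Hedged sketch of the wind-shear hazard index (F-factor) decomposition.
      G = 9.81  # m/s^2

      def f_factor(tailwind_rate, vertical_wind, airspeed):
          """tailwind_rate: rate of change of tailwind along the path (m/s per s);
          vertical_wind: positive upward (m/s); airspeed: true airspeed (m/s).
          Positive F means climb performance is being eroded."""
          f1 = tailwind_rate / G          # horizontal wind gradient contribution
          f2 = -vertical_wind / airspeed  # downdraft contribution
          return f1, f2, f1 + f2

      # Example: tailwind increasing at 2 m/s per second, 5 m/s downdraft, 75 m/s airspeed
      f1, f2, f_total = f_factor(2.0, -5.0, 75.0)
      print(f"F1={f1:.3f}  F2={f2:.3f}  F={f_total:.3f}")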

  18. Calculations of reactor-accident consequences, Version 2. CRAC2: computer code user's guide

    SciTech Connect

    Ritchie, L.T.; Johnson, J.D.; Blond, R.M.

    1983-02-01

    The CRAC2 computer code is a revision of the Calculation of Reactor Accident Consequences computer code, CRAC, developed for the Reactor Safety Study. The CRAC2 computer code incorporates significant modeling improvements in the areas of weather sequence sampling and emergency response, and refinements to the plume rise, atmospheric dispersion, and wet deposition models. New output capabilities have also been added. This guide is to facilitate the informed and intelligent use of CRAC2. It includes descriptions of the input data, the output results, the file structures, control information, and five sample problems.

  19. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    SciTech Connect

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extraction complexes belong to the energy-intensive industry. Accidents there can result in catastrophes and great social, environmental and economic losses. Annually, according to official data, several dozen large accidents take place on pipelines in the USA and Russia. That is why prevention of accidents, analysis of the mechanisms of their development and prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually complicated and can be represented as a complex combination of natural, technical and human factors. Mathematical and computer simulations are safe, rather effective and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of a failure occurrence and development, to assess its consequences and to give recommendations to prevent it. Besides investigation of failure cases, numerical simulation techniques play an important role in the treatment of diagnostic results for the objects and in further construction of mathematical prognostic simulations of the object behavior in the period of time between two inspections. While solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, of the qualitative theory of differential equations, of mechanics of a continuous medium, of chemical macro-kinetics and optimization techniques are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for solution of such tasks. Almost all of them are calibrated on the calculations of the simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that in the long years of work there has been established a fruitful and effective

  20. Analysis of Three Mile Island-Unit 2 accident

    SciTech Connect

    Not Available

    1980-03-01

    The Nuclear Safety Analysis Center (NSAC) of the Electric Power Research Institute has analyzed the Three Mile Island-2 accident. Early results of this analysis were a brief narrative summary, issued in mid-May 1979, and an initial version of this report, issued later in 1979 as noted in the Foreword. The present report is a revised version of the 1979 report, containing summaries, a highly detailed sequence of events, a comparison of that sequence of events with those from other sources, 25 appendices, references and a list of abbreviations and acronyms. A matrix of equipment and system actions is included as a folded insert.

  1. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with. PMID:27090952

  2. Traffic accident analysis using GIS: a case study of Kyrenia City

    NASA Astrophysics Data System (ADS)

    Kara, Can; Akçit, Nuhcan

    2015-06-01

    Traffic accidents cause major deaths in urban environments, so analyzing the locations of traffic accidents and their causes is crucial. To this end, accident patterns and hotspot distributions are analyzed using geographic information technology. The locations of the traffic accidents in the years 2011, 2012 and 2013 are combined to generate the kernel density map of Kyrenia City. This analysis aims to find highly accident-dense intersections and segments within the city. Additionally, the spatial autocorrelation methods Local Moran's I and Getis-Ord Gi are employed. The results are discussed in detail for further analysis. Finally, required changes for numerous intersections are suggested to decrease the potential risks of high-density accident locations.
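
    As a toy illustration of the hotspot step (not the study's GIS workflow), the sketch below builds a kernel density estimate over synthetic accident coordinates and reports the densest grid cell; a real analysis would use projected crash locations and a spatial weights matrix for the Local Moran's I and Getis-Ord Gi statistics.

      # Illustrative kernel density hotspot sketch over made-up planar coordinates.
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      cluster = rng.normal(loc=[1000.0, 2000.0], scale=50.0, size=(80, 2))
      background = rng.uniform(low=[0.0, 0.0], high=[3000.0, 3000.0], size=(40, 2))
      points = np.vstack([cluster, background]).T   # shape (2, n) for gaussian_kde

      kde = gaussian_kde(points)
      xs, ys = np.meshgrid(np.linspace(0, 3000, 60), np.linspace(0, 3000, 60))
      density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
      i, j = np.unravel_index(np.argmax(density), density.shape)
      print(f"densest grid cell near x={xs[i, j]:.0f}, y={ys[i, j]:.0f}")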

  3. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

    Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.
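
    The "linear circuit analysis of fluid flows" mentioned above is analogous to nodal analysis of an electrical network: with linearized branch conductances, node pressures satisfy a linear system expressing mass conservation at each interior node. The sketch below solves such a system for a small invented network; the topology, conductances, and boundary pressures are assumptions, not CONFIG internals.

      # Minimal nodal-analysis sketch for a linearized flow network.
      import numpy as np

      # Branch conductances between nodes; node 0 is the supply, node 3 the drain.
      g = {(0, 1): 2.0, (1, 2): 1.0, (2, 3): 2.0, (1, 3): 0.5}
      p_supply, p_drain = 200.0, 100.0   # kPa, assumed boundary pressures

      # Conservation at interior nodes 1 and 2: sum of g*(p_neighbor - p_node) = 0.
      A = np.array([[-(g[(0, 1)] + g[(1, 2)] + g[(1, 3)]), g[(1, 2)]],
                    [g[(1, 2)], -(g[(1, 2)] + g[(2, 3)])]])
      b = np.array([-(g[(0, 1)] * p_supply + g[(1, 3)] * p_drain),
                    -(g[(2, 3)] * p_drain)])
      p1, p2 = np.linalg.solve(A, b)

      flow_into_node1 = g[(0, 1)] * (p_supply - p1)
      print(f"p1={p1:.1f} kPa, p2={p2:.1f} kPa, supply flow={flow_into_node1:.1f}")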

  4. Extension of ship accident analysis to multiple-package shipments

    SciTech Connect

    Mills, G.S.; Neuhauser, K.S.

    1997-11-01

    Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO containers may be placed in one of several holds of a cargo ship. In such cases, the size of a radioactive release resulting from a severe collision with another ship is determined not by the likelihood of compromising a single, robust package but by the probability that a certain fraction of 10s or 100s of individual packagings is compromised. The previous analysis involved a statistical estimation of the frequency of accidents which would result in damage to a cask located in one of seven cargo holds in a collision with another ship. The results were obtained in the form of probabilities (frequencies) of accidents of increasing severity and of release fractions for each level of severity. This paper describes an extension of the same general method in which the multiple packages are assumed to be compacted by an intruding ship's bow until there is no free space in the hold. At such a point, the remaining energy of the colliding ship is assumed to be dissipated by progressively crushing the RAM packagings and the probability of a particular fraction of package failures is estimated by adaptation of the statistical method used previously. The parameters of a common, well characterized packaging, the 6M with 2R inner containment vessel, were employed as an illustrative example of this analysis method. However, the method is readily applicable to other packagings for which crush strengths have been measured or can be estimated with satisfactory confidence.
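
    The progressive-crush idea lends itself to a simple Monte Carlo illustration. In the sketch below, the residual collision energy left after the hold's free space closes is sampled from an assumed distribution and divided by an assumed per-package crush energy to estimate the fraction of packagings failed; every number and the energy distribution are placeholders, not the paper's statistical model.

      # Hedged Monte Carlo sketch of package failures under progressive crushing.
      import numpy as np

      rng = np.random.default_rng(42)
      N_PACKAGES = 100        # packagings in the struck hold (assumed)
      CRUSH_ENERGY_MJ = 0.5   # energy absorbed by each failed packaging (assumed)
      TRIALS = 100_000

      # Residual energy per severe collision after the free space is consumed (MJ);
      # a lognormal spread is assumed purely for illustration.
      residual_energy = rng.lognormal(mean=1.0, sigma=1.2, size=TRIALS)

      failed = np.minimum(residual_energy / CRUSH_ENERGY_MJ, N_PACKAGES)
      fraction_failed = failed / N_PACKAGES
      for threshold in (0.1, 0.5, 1.0):
          p = np.mean(fraction_failed >= threshold)
          print(f"P(>= {threshold:.0%} of packagings fail | severe collision) ~ {p:.3f}")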

  5. [Personal computer interactive algorithm for estimating radiologic contamination and doses after a nuclear accident in Europe].

    PubMed

    Tabet, E

    2001-01-01

    The algorithm RANA (radiological assessment of nuclear accidents) is a tool which can be exploited to estimate the space and time structure of the radiological consequences of a radioactive release following a nuclear accident in Europe. The algorithm, formulated in the language of Mathematica, can be run on a personal computer. It uses simplified physical assumptions for the diffusion of the cloud and the transfer of the contamination to the food chain. The user gets the needed information by means of interactive windows that allow a fast evaluation of dose and contamination profiles. Calculations are performed either starting from the source terms or from the knowledge of experimental contamination data. Radiological consequences, such as individual or collective doses from several exposure paths, are parametrized in terms of the atmospheric diffusion categories. PMID:11758278

  6. A general approach to critical infrastructure accident consequences analysis

    NASA Astrophysics Data System (ADS)

    Bogalecka, Magda; Kołowrocki, Krzysztof; Soszyńska-Budny, Joanna

    2016-06-01

    A general probabilistic model of critical infrastructure accident consequences is presented, comprising the process of initiating events generated by the accident, the process of environmental threats, and the process of environmental degradation.

  7. PERSPECTIVES ON A DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    , K; Jonathan Lowrie, J; David Thoman , D; Austin Keller , A

    2008-07-30

    Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.
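
    The inputs listed above enter a dose estimate multiplicatively, so the relative effect of changing any one of them is easy to see. The sketch below chains a five-factor source term with a dispersion factor, breathing rate, and inhalation dose conversion factor and compares two breathing-rate assumptions; all numeric values are placeholders, not DOE-endorsed inputs.

      # Minimal sketch of a multiplicative consequence estimate.
      def source_term(mar_g, dr, arf, rf, lpf):
          """Respirable release (g) = material at risk x damage ratio x airborne
          release fraction x respirable fraction x leak path factor."""
          return mar_g * dr * arf * rf * lpf

      def inhalation_dose_rem(st_g, chi_over_q, breathing_rate, dcf_rem_per_g):
          """Dose (rem) = release (g) x chi/Q (s/m^3) x breathing rate (m^3/s)
          x dose conversion factor (rem per gram inhaled)."""
          return st_g * chi_over_q * breathing_rate * dcf_rem_per_g

      st = source_term(mar_g=1000.0, dr=1.0, arf=1e-3, rf=0.1, lpf=1.0)
      for br in (3.3e-4, 2.5e-4):   # m^3/s, two illustrative breathing-rate bases
          dose = inhalation_dose_rem(st, chi_over_q=1e-4, breathing_rate=br,
                                     dcf_rem_per_g=5.0e3)
          print(f"breathing rate {br:.1e} m^3/s -> dose {dose:.2e} rem")

    Because the chain is a product, any proportional change in the breathing-rate basis shifts the estimated dose by the same proportion, which is the kind of relative comparison the paper's simple dose calculation is meant to show.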

  8. Offsite radiological consequence analysis for the bounding aircraft crash accident

    SciTech Connect

    OBERG, B.D.

    2003-03-22

    The purpose of this calculation note is to quantitatively analyze a bounding aircraft crash accident for comparison to the DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', Appendix A, Evaluation Guideline of 25 rem. The potential of aircraft impacting a facility was evaluated using the approach given in DOE-STD-3014-96, ''Accident Analysis for Aircraft Crash into Hazardous Facilities''. The following aircraft crash frequencies were determined for the Tank Farms in RPP-11736, ''Assessment Of Aircraft Crash Frequency For The Hanford Site 200 Area Tank Farms'': (1) The total aircraft crash frequency is ''extremely unlikely.'' (2) The general aviation crash frequency is ''extremely unlikely.'' (3) The helicopter crash frequency is ''beyond extremely unlikely.'' (4) For the Hanford Site 200 Areas, the crash frequency of other aircraft types, commercial or military, into each aboveground facility or any other type of underground facility is ''beyond extremely unlikely.'' Because the potential of an aircraft crash into the 200 Area tank farms is more frequent than ''beyond extremely unlikely,'' consequence analysis of the aircraft crash is required.
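
    Aircraft-crash screening of this kind typically multiplies a handful of factors: the annual number of operations, the crash rate per operation, the probability per unit area that a crash lands at the facility, and the facility's effective area. The sketch below shows that structure with invented numbers and an assumed frequency-bin threshold; it is not the RPP-11736 calculation or the DOE-STD-3014 parameter set.

      # Hedged four-factor crash-frequency sketch with placeholder values.
      def crash_frequency(n_ops, p_crash_per_op, f_xy_per_sq_mile, area_sq_miles):
          return n_ops * p_crash_per_op * f_xy_per_sq_mile * area_sq_miles

      categories = {
          "general aviation": crash_frequency(1.0e5, 1.0e-5, 1.0e-3, 2.0e-3),
          "helicopters":      crash_frequency(1.0e3, 5.0e-5, 1.0e-3, 2.0e-3),
      }
      for name, freq in categories.items():
          # Qualitative bin threshold assumed at 1e-6 per year.
          label = "beyond extremely unlikely" if freq < 1.0e-6 else "extremely unlikely"
          print(f"{name}: {freq:.2e} /yr -> {label}")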

  9. Statistical analysis of sudden chemical leakage accidents reported in China between 2006 and 2011.

    PubMed

    Li, Yang; Ping, Hua; Ma, Zhi-Hong; Pan, Li-Gang

    2014-04-01

    Based on data from authoritative sources, 1,400 sudden leakage accidents that occurred in China from 2006 to 2011 were investigated, of which 666 accidents, involving no or little damage, were used for statistical characterization. The research results were as follows: (1) Time fluctuation: the yearly number of sudden leakage accidents decreased from 2006 to 2010 and increased slightly in 2011. Sudden leakage accidents occur mainly in summer, and more than half of the accidents occur from May to September. (2) Regional distribution: the accidents are highly concentrated in the coastal area, and accidents arise more easily from small and medium-sized enterprises than from larger ones. (3) Pollutants: hazardous chemicals account for up to 95% of sudden leakage accidents. (4) Steps: transportation represents almost half of the accidents, followed by production, usage, storage, and disposal. (5) Pollution and casualties: such accidents readily cause environmental pollution and casualties. (6) Causes: more than half of the cases were caused by human factors, followed by management failures and equipment failures; however, sudden chemical leakage may also be caused by high temperature, rain, wet roads, and terrain. (7) Principal component analysis: five factors are extracted by the principal component analysis, including pollution, casualties, regional distribution, steps, and month. Based on this analysis, the characteristics, causes, and damages of sudden leakage accidents can be investigated, and advice for prevention and rescue can be derived. PMID:24407779
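
    The principal component step in item (7) is straightforward to reproduce in outline. The sketch below applies scikit-learn's PCA to standardized, coded accident attributes; the feature set and random values are invented for demonstration and are not the paper's data.

      # Illustrative PCA over coded accident attributes (synthetic data).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      # Toy coded attributes per accident: pollution score, casualties, region code,
      # life-cycle step code, month.
      X = rng.integers(low=0, high=10, size=(200, 5)).astype(float)

      X_std = StandardScaler().fit_transform(X)   # PCA is scale-sensitive
      pca = PCA(n_components=5).fit(X_std)
      for k, ratio in enumerate(pca.explained_variance_ratio_, start=1):
          print(f"component {k}: {ratio:.1%} of variance")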

  10. Analysis of surface powered haulage accidents, January 1990--July 1996

    SciTech Connect

    Fesak, G.M.; Breland, R.M.; Spadaro, J.

    1996-12-31

    This report addresses surface haulage accidents that occurred between January 1990 and July 1996 involving haulage trucks (including over-the-road trucks), front-end-loaders, scrapers, utility trucks, water trucks, and other mobile haulage equipment. The study includes quarries, open pits and surface coal mines utilizing self-propelled mobile equipment to transport personnel, supplies, rock, overburden material, ore, mine waste, or coal for processing. A total of 4,397 accidents were considered. This report summarizes the major factors that led to the accidents and recommends accident prevention methods to reduce the frequency of these accidents.

  11. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  12. Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

    SciTech Connect

    Madni, I.K.; Eltawila, F.

    1994-01-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR.

  13. Systemic analysis of so-called 'accidents on the level' in a multi trade company.

    PubMed

    Leclercq, S; Thouy, S

    2004-10-10

    Slips, trips and falls on the level are considered commonplace and are rarely subjected to in-depth analysis. They occur in highly varied circumstances in an occupational situation. In-depth analysis of these accidents was conducted within a company with the aim of understanding them better, to be able to discuss prevention field possibilities and priorities for the company concerned. Firstly, available data on 'accidents on the level' occurring over the last 4 years were analysed and a typology for these accidents was derived, based on individual activity at the time of the accident and accident location. The three most serious accident-causing situations were analysed in-depth from interviews with injured persons, as well as from activity observation and activity-related verbal information obtained from operatives. These most serious situations involved accidents occurring when climbing down from trucks or when walking either in surroundings outside company premises or from (to) a vehicle to (from) a work location. In-depth accident analysis and characterization of accident-causing situations as a whole enhance our understanding of the accident process and allow us to envisage priorities for action in the prevention field, in operational terms. Each accident-causing situation reveals environmental factors that in fact constitute accident factors (obstacle, stone, etc.), when the individual walks or climbs down from a truck. Analysis shows that other events are necessary for accident occurrence. For example, the individual may be subjected to a time constraint or may be preoccupied. Results obtained here, in a company integrating different trades, are discussed and compared with those referred to in the literature. Generalization of some of these results is also considered. PMID:15370848

  14. An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)

    NASA Technical Reports Server (NTRS)

    Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

    2002-01-01

    A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots and those flown by professional pilots. Private pilots, flying low-cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that result from exacting missions or the use of specialized equipment. For both groups, judgement error is more likely to lead to a fatal accident than other types of causes. Several approaches to improving the rotorcraft accident rate are recommended. These mostly address improving the training of new pilots and the safety awareness of private pilots.

  15. Linguistic methodology for the analysis of aviation accidents

    NASA Technical Reports Server (NTRS)

    Goguen, J. A.; Linde, C.

    1983-01-01

    A linguistic method for the analysis of small-group discourse was developed, and its use on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or other linguistic concepts such as speech act and topic; it tests hypotheses that support the significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.

  16. Aircraft Accident Prevention: Loss-of-Control Analysis

    NASA Technical Reports Server (NTRS)

    Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

    2009-01-01

    The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.
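
    As a loose illustration of the envelope-and-constraint view of loss-of-control described above (not the paper's analysis, and not NASA's Generic Transport Model), the sketch below checks a simulated state history against a hypothetical box-shaped flight envelope and reports the first excursion. The limits and the trajectory are invented.

      # Hypothetical sketch: flag departure from a box-shaped flight envelope.
      import numpy as np

      # Invented envelope limits: airspeed [m/s], angle of attack [deg], bank angle [deg].
      limits = {"V": (60.0, 250.0), "alpha": (-5.0, 12.0), "phi": (-60.0, 60.0)}

      # Invented state history: columns V, alpha, phi over 30 seconds.
      t = np.linspace(0.0, 30.0, 301)
      states = np.column_stack([
          120.0 + 2.0 * t,          # airspeed slowly increasing
          2.0 + 0.45 * t,           # angle of attack ramping up (will exceed 12 deg)
          10.0 * np.sin(0.3 * t),   # benign bank oscillation
      ])

      names = list(limits)
      lo = np.array([limits[n][0] for n in names])
      hi = np.array([limits[n][1] for n in names])
      outside = (states < lo) | (states > hi)     # boolean mask of constraint violations
      first = np.argmax(outside.any(axis=1)) if outside.any() else None

      if first is not None:
          bad = [names[i] for i in np.where(outside[first])[0]]
          print(f"envelope exceeded at t = {t[first]:.1f} s in {bad}")
      else:
          print("trajectory stays inside the assumed envelope")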

  17. Radionuclide Analysis on Bamboos following the Fukushima Nuclear Accident

    PubMed Central

    Higaki, Takumi; Higaki, Shogo; Hirota, Masahiro; Akita, Kae; Hasezawa, Seiichiro

    2012-01-01

    In response to contamination from the recent Fukushima nuclear accident, we conducted radionuclide analysis on bamboos sampled from six sites within a 25 to 980 km radius of the Fukushima Daiichi nuclear power plant. Maximum activity concentrations of radiocesium 134Cs and 137Cs in samples from Fukushima city, 65 km away from the Fukushima Daiichi plant, were in excess of 71 and 79 kBq/kg, dry weight (DW), respectively. In Kashiwa city, 195 km away from the Fukushima Daiichi, the sample concentrations were in excess of 3.4 and 4.3 kBq/kg DW, respectively. In Toyohashi city, 440 km away from the Fukushima Daiichi, the concentrations were below the measurable limits of up to 4.5 Bq/kg DW. In the radiocesium contaminated samples, the radiocesium activity was higher in mature and fallen leaves than in young leaves, branches and culms. PMID:22496858

  18. Analysis of Waste Leak and Toxic Chemical Release Accidents from Waste Feed Delivery (WFD) Diluent System

    SciTech Connect

    WILLIAMS, J.C.

    2000-09-15

    Radiological and toxicological consequences are calculated for 4 postulated accidents involving the Waste Feed Delivery (WFD) diluent addition systems. Consequences for the onsite and offsite receptor are calculated. This analysis contains technical information used to determine the accident consequences for the River Protection Project (RPP) Final Safety Analysis Report (FSAR).

  19. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    NASA Astrophysics Data System (ADS)

    Reyes, S.; Latkowski, J. F.; Gomez del Rio, J.; Sanz, J.

    2001-05-01

    Previous studies of the safety and environmental aspects of the HYLIFE-II inertial fusion energy power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work, computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) have been applied for simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here we consider a severe loss of coolant accident (LOCA) in conjunction with simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the confinement) and of the two barriers surrounding the chamber (inner shielding and confinement building itself). Even though confinement failure would be a very unlikely event, it would be needed in order to produce significant off-site doses. CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics and fusion product transport and release. The results of these calculations show that the estimated off-site dose is less than 5 mSv (0.5 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.

  20. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    SciTech Connect

    Reyes, S; Gomez del Rio, J; Sanz, J

    2000-02-23

    Previous studies of the safety and environmental (S and E) aspects of the HYLIFE-II inertial fusion energy (IFE) power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work a set of computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) has been applied for simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here the authors consider a severe loss of coolant accident (LOCA) producing simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the containment) and of the two barriers surrounding the chamber (inner shielding and the containment building itself). Even though containment failure would be a very unlikely event, it would be needed in order to produce significant off-site doses. The CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics, and fusion product release and transport. The results of these calculations show that the estimated off-site dose is less than 6 mSv (0.6 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.

  1. The accident analysis of mobile mine machinery in Indian opencast coal mines.

    PubMed

    Kumar, R; Ghosh, A K

    2014-01-01

    This paper presents an analysis of large mining machinery related accidents in Indian opencast coal mines. The trends in coal production, the share of mining methods in production, machinery deployment in opencast mines, the size and population of machinery, accidents due to machinery, and the types and causes of accidents have been analysed for the years 1995 to 2008. The scrutiny of accidents during this period reveals that the most common contributing factors are machine reversal, haul road design, human fault, operator's fault, machine fault, visibility and dump design. Considering the types of machines, namely dumpers, excavators, dozers and loaders together, the maximum number of fatal accidents during 1995 to 2008 was caused jointly by operator's faults and human faults. The novel finding of this analysis is that large machines with state-of-the-art safety systems did not reduce fatal accidents in Indian opencast coal mines. PMID:23324038

  2. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    SciTech Connect

    Su'ud, Zaki; Anshari, Rio

    2012-06-06

    A loss-of-coolant accident (LOCA) in a boiling water reactor (BWR), specifically in the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown process was completed, cooling at a much smaller level than in normal operation is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens under this condition, fuel and other core temperatures rise, which can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Daiichi case. In this study, numerical simulations were performed to calculate the pressure, composition, water level, and temperature distribution in the reactors during the accident. Two coolant-regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valve (SRV) system. The average mass flow of steam to the IC system in this event is 10 kg/s, which could keep the reactor core covered for about 3.2 hours, with the core fully uncovered 4.7 hours later. Two coolant-regulating systems were operational on reactor unit 2: the Reactor Core Isolation Condenser (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow corresponding to this event is 20 kg/s, which could keep the core covered for about 73 hours, with the core fully uncovered 75 hours later. Three coolant-regulating systems were operational on reactor unit 3: the Reactor Core Isolation Condenser (RCIC) system, the High Pressure Coolant Injection (HPCI) system, and the Safety Relief Valves (SRV). The average water mass flow corresponding to this event is 15 kg/s, which could keep the core covered for about 37 hours, with the core fully uncovered 40 hours later.
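
    The few-hour core-uncovery times quoted above come from a plant simulation; a much cruder feel for the time scale can be had from a simple energy balance between decay heat and coolant boil-off. The sketch below uses a textbook Way-Wigner-type decay-heat approximation with invented inventory, latent-heat, and rated-power values; it is not the study's model.

      # Hypothetical sketch: back-of-envelope boil-off of a water inventory by decay heat.
      # Not the study's simulation; all plant parameters below are invented or approximate.
      P0 = 1380e6      # rated thermal power [W], of the order of a small BWR unit (assumed)
      h_fg = 1.5e6     # latent heat of vaporization at elevated pressure [J/kg] (assumed)
      M = 1.0e5        # water mass above the top of active fuel [kg] (assumed)
      m_in = 0.0       # makeup/condensate return flow [kg/s]; set > 0 to model injection

      def decay_power(t_s, t_op=1.0e7):
          """Way-Wigner-type decay-heat estimate after a long operating period (very rough)."""
          return 0.0622 * P0 * (t_s ** -0.2 - (t_s + t_op) ** -0.2)

      t, dt, depleted = 60.0, 10.0, False      # start 1 min after scram, 10 s steps
      while t < 3.0e5:                         # march up to roughly 83 h
          M += (m_in - decay_power(t) / h_fg) * dt
          t += dt
          if M <= 0.0:
              depleted = True
              break

      if depleted:
          print(f"water above the fuel boiled off after roughly {t / 3600.0:.1f} h (toy numbers)")
      else:
          print("makeup keeps up with boil-off for these toy numbers")

    For these invented values the estimate happens to land in the same few-hour range as the unit 1 figure quoted above, but it is only an energy balance, not a reproduction of the reported simulation.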

  3. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Su'ud, Zaki; Anshari, Rio

    2012-06-01

    A loss-of-coolant accident (LOCA) in a boiling water reactor (BWR), specifically in the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown process was completed, cooling at a much smaller level than in normal operation is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA happens under this condition, fuel and other core temperatures rise, which can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Daiichi case. In this study, numerical simulations were performed to calculate the pressure, composition, water level, and temperature distribution in the reactors during the accident. Two coolant-regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valve (SRV) system. The average mass flow of steam to the IC system in this event is 10 kg/s, which could keep the reactor core covered for about 3.2 hours, with the core fully uncovered 4.7 hours later. Two coolant-regulating systems were operational on reactor unit 2: the Reactor Core Isolation Condenser (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow corresponding to this event is 20 kg/s, which could keep the core covered for about 73 hours, with the core fully uncovered 75 hours later. Three coolant-regulating systems were operational on reactor unit 3: the Reactor Core Isolation Condenser (RCIC) system, the High Pressure Coolant Injection (HPCI) system, and the Safety Relief Valves (SRV). The average water mass flow corresponding to this event is 15 kg/s, which could keep the core covered for about 37 hours, with the core fully uncovered 40 hours later.

  4. Otorhinolaryngologic disorders and diving accidents: an analysis of 306 divers.

    PubMed

    Klingmann, Christoph; Praetorius, Mark; Baumann, Ingo; Plinkert, Peter K

    2007-10-01

    Diving is a very popular leisure activity with an increasing number of participants. As more than 80% of diving-related problems involve the head and neck region, every otorhinolaryngologist should be familiar with diving medical standards. Here we present an analysis of more than 300 patients treated in the past four years. Between January 2002 and October 2005, 306 patients presented to our department with otorhinological disorders after diving or after diving accidents. We collected the following data: name, sex, age, date of treatment, date of accident, diagnosis, special aspects of the diagnosis, number of dives, diving certification, whether and which surgery had been performed, history of acute diving accidents or follow-up treatment, assessment of fitness to dive, and special remarks. The study setting was a retrospective cohort study. The distribution of the disorders was as follows: 24 divers (8%) with external ear disorders, 140 divers (46%) with middle ear disorders, 56 divers (18%) with inner ear disorders, 53 divers (17%) with disorders of the nose and sinuses, 24 divers (8%) with decompression illness (DCI) and 9 divers (3%) who complained of various symptoms. Only 18% of the divers presented with acute disorders. The most common disorder (24%) was Eustachian tube dysfunction. Female divers were significantly more often affected. Chronic sinusitis was found to be associated with a significantly higher number of performed dives. Conservative treatment failed in 30% of the patients, but sinus surgery relieved symptoms in all patients of this group. The middle ear is the main problem area for divers. Middle ear ventilation problems due to Eustachian tube dysfunction can be treated conservatively with excellent results, whereas pathology of the tympanic membrane and ossicular chain often requires surgery. More than four out of five patients visited our department to re-establish their fitness to dive. Although the treatment of acute diving

  5. MACCS usage at Rocky Flats Plant for consequence analysis of postulated accidents

    SciTech Connect

    Foppe, T.L.; Peterson, V.L.

    1993-10-01

    The MELCOR Accident Consequence Code System (MACCS) has been applied to the radiological consequence assessment of potential accidents from a non-reactor nuclear facility. MACCS has been used in a variety of applications to evaluate radiological dose and health effects to the public from postulated plutonium releases and from postulated criticalities. These applications were conducted to support deterministic and probabilistic accident analyses for safety analysis reports, radiological sabotage studies, and other regulatory requests.

  6. Thermal analysis of an irradiated-fuel concrete integrated container under normal and fire-accident conditions. Report No. 89-242-K

    SciTech Connect

    Taralis, D.

    1990-01-01

    This study describes the development of a special-purpose three-dimensional heat transfer computer code for the thermal analysis of a Concrete Integrated Container (CIC) for the transportation of 10-year-cooled fuel under normal conditions and hypothetical fire-accident conditions. Results are given for comparisons of theoretical predictions with existing half-scale CIC experimental results, and for representative analytical results for the full-scale CIC under normal and fire-accident conditions.

  7. Computation of cross sections and dose conversion factors for criticality accident dosimetry.

    PubMed

    Devine, R T

    2004-01-01

    In the application of criticality accident dosemeters the cross sections and fluence-to-dose conversion factors have to be computed. The cross section and fluence-to-dose conversion factor for the thermal and epi-thermal contributions to neutron dose are well documented; for higher energy regions (>100 keV) these depend on the spectrum assumed. Fluence is determined using threshold detectors. The cross sections require the folding of an expected spectrum with the reaction cross sections. The fluence-to-dose conversion factors also require a similar computation. The true and effective thresholds are used to include the information on the expected spectrum. The spectra can either be taken from compendia or measured at the facility at which the exposures are to be expected. The cross sections can be taken from data computations or analytic representations and the fluence-to-dose conversion factors are determined by various standards making bodies. The problem remaining is the method of computation. The purpose of this paper is to compare two methods for computing these factors: analytic and Monte Carlo. PMID:15353697
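
    The folding of an expected spectrum with reaction cross sections and with fluence-to-dose conversion coefficients amounts to a spectrum-weighted average. The sketch below carries out that average on a coarse energy grid; the spectrum, cross-section values, and dose coefficients are invented placeholders rather than values from any compendium or standard.

      # Hypothetical sketch: fold an assumed neutron spectrum with a reaction cross section
      # and with fluence-to-dose coefficients. Every tabulated value below is invented.
      import numpy as np

      E = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])          # energy grid [MeV]
      phi = np.array([0.2, 0.8, 1.0, 0.6, 0.2, 0.05])        # relative fluence spectrum [1/MeV]
      sigma = np.array([0.0, 0.02, 0.11, 0.30, 0.55, 0.60])  # threshold-reaction cross section [barn]
      h = np.array([1.0, 2.5, 3.5, 4.2, 4.3, 4.4]) * 1e-10   # fluence-to-dose coefficient [Sv cm^2]

      def trap(y, x):
          """Simple trapezoidal integral (kept local to avoid NumPy version differences)."""
          return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

      sigma_eff = trap(phi * sigma, E) / trap(phi, E)   # spectrum-averaged cross section
      h_eff = trap(phi * h, E) / trap(phi, E)           # spectrum-averaged dose coefficient

      print(f"effective cross section         ~ {sigma_eff:.3f} barn")
      print(f"effective dose per unit fluence ~ {h_eff:.2e} Sv cm^2")

    A Monte Carlo variant of the same calculation would sample the spectrum instead of integrating it on a fixed grid, which is the comparison the paper describes.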

  8. GPHS-RTG launch accident analysis for Galileo and Ulysses

    SciTech Connect

    Bradshaw, C.T. )

    1991-01-01

    This paper presents the safety program conducted to determine the response of the General Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) to potential launch accidents of the Space Shuttle for the Galileo and Ulysses missions. The National Aeronautics and Space Administration (NASA) provided definition of the Shuttle potential accidents and characterized the environments. The Launch Accident Scenario Evaluation Program (LASEP) was developed by GE to analyze the RTG response to these accidents. RTG detailed response to Solid Rocket Booster (SRB) fragment impacts, as well as to other types of impact, was obtained from an extensive series of hydrocode analyses. A comprehensive test program was conducted also to determine RTG response to the accident environments. The hydrocode response analyses coupled with the test data base provided the broad range response capability which was implemented in LASEP.

  9. An analysis of three weather-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Fujita, T. T.; Caracena, F.

    1977-01-01

    Two aircraft accidents in 1975, one at John F. Kennedy International Airport in New York City on 24 June and the other at Stapleton International Airport in Denver on 7 August, were examined in detail. A third accident on 23 June 1976 at Philadelphia International Airport is being investigated. Amazingly, there was a spearhead echo just to the north of each accident site. The echoes formed from 5 to 50 min in advance of the accident and moved faster than other echoes in the vicinity. These echoes were photographed by National Weather Service radars, 130-205 km away. At closer ranges, however, one or more circular echoes were depicted by airborne and ground radars. These cells were only 3-5 km in diameter, but they were accompanied by downdrafts of extreme intensity, called downbursts. All accidents occurred as aircraft, either descending or climbing, lost altitude while experiencing strong wind shear inside downburst cells.

  10. A POTENTIAL APPLICATION OF UNCERTAINTY ANALYSIS TO DOE-STD-3009-94 ACCIDENT ANALYSIS

    SciTech Connect

    Palmrose, D E; Yang, J M

    2007-05-10

    The objective of this paper is to assess proposed transuranic waste accident analysis guidance and recent software improvements in a Windows-OS version of MACCS2 that allows the inputting of parameter uncertainty. With this guidance and code capability, there is the potential to perform a quantitative uncertainty assessment of unmitigated accident releases with respect to the 25 rem Evaluation Guideline (EG) of DOE-STD-3009-94 CN3 (STD-3009). Historically, the classification of safety systems in a U.S. Department of Energy (DOE) nuclear facility's safety basis has involved how subject matter experts qualitatively view uncertainty in the STD-3009 Appendix A accident analysis methodology; specifically, whether consequence uncertainty could be larger than previously evaluated, such that site-specific accident consequences might challenge the EG. This paper assesses whether a potential uncertainty capability for MACCS2 could provide a stronger technical basis as to when the consequences from a design basis accident (DBA) truly challenge the 25 rem EG.
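
    As a rough picture of what a parameter-uncertainty treatment of an unmitigated release could look like (this is not MACCS2 and not the proposed guidance), the sketch below propagates lognormal uncertainties in a release quantity and a dose-per-unit-release factor through a simple multiplicative consequence model and reports how often the 25 rem Evaluation Guideline is challenged. All distributions and parameter values are invented.

      # Hypothetical sketch: Monte Carlo dose uncertainty versus a 25 rem guideline.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Invented lognormal inputs: respirable release (Ci) and dose per unit release (rem/Ci).
      release = rng.lognormal(mean=np.log(5.0), sigma=0.7, size=n)
      dose_factor = rng.lognormal(mean=np.log(1.5), sigma=0.5, size=n)

      dose = release * dose_factor        # simple multiplicative consequence model
      eg = 25.0                           # Evaluation Guideline [rem]

      print(f"median dose       : {np.median(dose):.1f} rem")
      print(f"95th percentile   : {np.percentile(dose, 95):.1f} rem")
      print(f"P(dose > {eg:.0f} rem)   : {np.mean(dose > eg):.3f}")

    The fraction of samples exceeding the guideline is the kind of quantitative statement that a purely point-estimate analysis cannot make.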

  11. Progress in accident analysis of the HYLIFE-II inertial fusion energy power plant design

    SciTech Connect

    Reyes, S; Latkowski, J F; Gomez del Rio, J; Sanz, J

    2000-10-11

    The present work continues our effort to perform an integrated safety analysis for the HYLIFE-II inertial fusion energy (IFE) power plant design. Recently we developed a base case for a severe accident scenario in order to calculate accident doses for HYLIFE-II. It consisted of a total loss of coolant accident (LOCA) in which all the liquid flibe (Li2BeF4) was lost at the beginning of the accident. Results showed that the off-site dose was below the limit given by the DOE Fusion Safety Standards for public protection in case of accident, and that this dose was dominated by the tritium released during the accident.

  12. Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures

    NASA Technical Reports Server (NTRS)

    Batthauer, Byron E.

    1987-01-01

    This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire failure RTO's, The Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Accident Board that high-speed RTO's in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics with emphasis on failed-tire RTO's. This background information could enhance the split-second decision-making process that is required prior to initiating an RTO.

  13. GASFLOW: A computational model to analyze accidents in nuclear containment and facility buildings

    SciTech Connect

    Travis, J.R. ); Nichols, B.D.; Wilson, T.L.; Lam, K.L.; Spore, J.W.; Niederauer, G.F. )

    1993-01-01

    GASFLOW is a finite-volume computer code that solves the time-dependent, compressible Navier-Stokes equations for multiple gas species. The fluid-dynamics algorithm is coupled to the chemical kinetics of combusting liquids or gases to simulate diffusion or propagating flames in complex geometries of nuclear containment or confinement and facilities' buildings. Fluid turbulence is calculated to enhance the transport and mixing of gases in rooms and volumes that may be connected by a ventilation system. The ventilation system may consist of extensive ductwork, filters, dampers or valves, and fans. Condensation and heat transfer to walls, floors, ceilings, and internal structures are calculated to model the appropriate energy sinks. Solid and liquid aerosol behavior is simulated to give the time and space inventory of radionuclides. The solution procedure of the governing equations is a modified Los Alamos ICE'd-ALE methodology. Complex facilities can be represented by separate computational domains (multiblocks) that communicate through overlapping boundary conditions. The ventilation system is superimposed throughout the multiblock mesh. Gas mixtures and aerosols are transported through the free three-dimensional volumes and the restricted one-dimensional ventilation components as the accident and fluid flow fields evolve. Combustion may occur if sufficient fuel and reactant or oxidizer are present and have an ignition source. Pressure and thermal loads on the building, structural components, and safety-related equipment can be determined for specific accident scenarios. GASFLOW calculations have been compared with large oil-pool fire tests in the 1986 HDR containment test T52.14, which is a 3000-kW fire experiment. The computed results are in good agreement with the observed data.
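
    GASFLOW itself solves the full compressible Navier-Stokes equations with chemistry, aerosols, and ventilation networks; as a very reduced cartoon of the finite-volume species-transport idea, the sketch below advects a trace gas through a chain of connected volumes with a first-order upwind scheme. The volumes, flow rate, and time step are invented and bear no relation to the code described above.

      # Hypothetical sketch: upwind finite-volume transport of a trace gas species
      # through a chain of connected volumes (a cartoon of multi-room mixing, not GASFLOW).
      import numpy as np

      n_cells = 10
      vol = np.full(n_cells, 50.0)        # room volumes [m^3] (invented)
      q = 2.0                             # volumetric flow along the chain [m^3/s] (invented)
      c = np.zeros(n_cells)               # species concentration [kg/m^3]
      c[0] = 1.0                          # release in the first room

      dt = 1.0                            # time step [s]; q*dt/vol < 1 keeps the scheme stable
      for _ in range(300):                # march 300 s
          flux = q * c                    # upwind: each cell exports at its own concentration
          inflow = np.concatenate(([0.0], flux[:-1]))   # what each cell receives from upstream
          c += dt * (inflow - flux) / vol

      print("concentration profile along the chain:", np.round(c, 3))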

  14. Analysis of construction accidents in Turkey and responsible parties.

    PubMed

    Gürcanli, G Emre; Müngen, Uğur

    2013-01-01

    Construction is one of the world's biggest industries, encompassing jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972-2008. Accidents were classified by the consequence of the incident, the time and main causes of the accident, the construction type, the occupation of the victim, the activity at the time of the accident and the party responsible for the accident. Falls (54.1%), being struck by a thrown/falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) occupy the first four places. The accidents were most likely to occur between the hours of 15:00 and 17:00 (22.6%), 10:00-12:00 (18.7%) and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and what acts of negligence typically lead to accidents. Nearly two thirds of the faulty and negligent acts are carried out by employers, while employees are responsible for almost one third of all cases. PMID:24077446

  15. Analysis of Construction Accidents in Turkey and Responsible Parties

    PubMed Central

    GÜRCANLI, G. Emre; MÜNGEN, Uğur

    2013-01-01

    Construction is one of the world’s biggest industries, encompassing jobs as diverse as building, civil engineering, demolition, renovation, repair and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972–2008. Accidents were classified by the consequence of the incident, the time and main causes of the accident, the construction type, the occupation of the victim, the activity at the time of the accident and the party responsible for the accident. Falls (54.1%), being struck by a thrown/falling object (12.9%), structural collapses (9.9%) and electrocutions (7.5%) occupy the first four places. The accidents were most likely to occur between the hours of 15:00 and 17:00 (22.6%), 10:00–12:00 (18.7%) and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and what acts of negligence typically lead to accidents. Nearly two thirds of the faulty and negligent acts are carried out by employers, while employees are responsible for almost one third of all cases. PMID:24077446

  16. Thermal analysis of the 10-gallon and the 55-gallon DOT-6M containers with thermal boundary conditions corresponding to 10CFR71 normal transport and accident conditions

    SciTech Connect

    Sanchez, L.C.; Longenbaugh, R.S.; Moss, M.; Haseman, G.M.; Fowler, W.E.; Roth, E.P.

    1988-03-01

    This report describes the heat transfer analysis of the 10-gallon and 55-gallon 6M containers. The analysis was performed with boundary conditions corresponding to a normal transport condition and a hypothetical accident condition. Computational results indicated that the insulation material in the 6M containers will adequately protect the payload region of the 6M containers. 26 refs., 26 figs., 8 tabs.

  17. A method for modeling and analysis of directed weighted accident causation network (DWACN)

    NASA Astrophysics Data System (ADS)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Ding, Jing

    2015-11-01

    Using complex network theory to analyze accidents is effective for understanding the causes of accidents in complex systems. In this paper, a novel method is proposed to establish a directed weighted accident causation network (DWACN) for the Rail Accident Investigation Branch (RAIB) in the UK, based on complex networks and using the event chains of accidents. The DWACN is composed of 109 nodes, which denote causal factors, and 260 directed weighted edges, which represent complex interrelationships among factors. The statistical properties of directed weighted complex networks are applied to reveal the critical factors, the key event chains and the important classes in the DWACN. The analysis results demonstrate that the DWACN has the characteristics of small-world networks, with a short average path length and a high weighted clustering coefficient, and displays the properties of scale-free networks, in that the cumulative degree distribution follows an exponential function. This modeling and analysis method can help to discover the latent rules of accidents and the features of fault propagation in order to reduce accidents. This paper is a further development of research on accident analysis methods using complex networks.
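
    The network statistics mentioned above (weighted degree, average path length, weighted clustering) are straightforward to compute once the causation network is built. The sketch below does so for a tiny, invented directed weighted graph using networkx; the node names, edges, and weights are illustrative only and are not the RAIB-derived DWACN.

      # Hypothetical sketch: a tiny directed weighted causation network and a few of the
      # statistics mentioned above. The nodes, edges, and weights are invented, not RAIB data.
      import networkx as nx

      edges = [
          ("poor inspection", "broken rail", 2),
          ("broken rail", "derailment", 3),
          ("fatigue", "signal passed at danger", 4),
          ("signal passed at danger", "collision", 5),
          ("fatigue", "collision", 1),
          ("poor inspection", "signal failure", 1),
          ("signal failure", "collision", 2),
      ]
      G = nx.DiGraph()
      G.add_weighted_edges_from(edges)

      # Weighted out-degree (out-strength) highlights candidate critical causal factors.
      print("out-strength:", dict(G.out_degree(weight="weight")))

      # Path-length and clustering statistics are computed on the undirected view here,
      # because this toy directed graph is not strongly connected.
      U = G.to_undirected()
      print("average path length :", round(nx.average_shortest_path_length(U), 2))
      print("weighted clustering :", round(nx.average_clustering(U, weight="weight"), 3))

    Computing these measures on the undirected view is only one convention; a full analysis would also exploit edge directions, as the paper does for event chains.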

  18. Accident investigation: Analysis of aircraft motions from ATC radar recordings

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1976-01-01

    A technique was developed for deriving time histories of an aircraft's motion from air traffic control (ATC) radar records. This technique uses the radar range and azimuth data, along with the downlinked altitude data (from an onboard Mode-C transponder), to derive an expanded set of data which includes airspeed, lift, thrust-drag, attitude angles (pitch, roll, and heading), etc. This method of analyzing aircraft motions was evaluated through flight experiments which used the CV-990 research aircraft and recordings from both the enroute and terminal ATC radar systems. The results indicate that the values derived from the ATC radar records are for the most part in good agreement with the corresponding values obtained from airborne measurements. In an actual accident, this analysis of ATC radar records can complement the flight-data recorders, now onboard airliners, and provide a source of recorded information for other types of aircraft that are equipped with Mode-C transponders but not with onboard recorders.
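
    A minimal version of the track-reconstruction step described above is sketched here: radar range and azimuth are converted to site-centered coordinates, Mode-C altitude is appended, and finite differences give groundspeed, track angle, and flight-path angle. The data, the 4-second sweep interval, and the absence of smoothing are assumptions for illustration; the procedure in the report additionally derives airspeed, lift, thrust-drag, and attitude angles.

      # Hypothetical sketch: recover a ground track and groundspeed from radar
      # range/azimuth returns plus Mode-C altitude (invented data, not the NASA procedure).
      import numpy as np

      t = np.arange(0.0, 60.0, 4.0)                    # radar sweep times [s]
      rng_nm = 20.0 - 0.04 * t                         # range from the radar site [nmi] (invented)
      az_deg = 45.0 + 0.3 * t                          # azimuth from the radar site [deg] (invented)
      alt_ft = 10000.0 + 300.0 * np.sin(0.05 * t)      # Mode-C altitude [ft] (invented)

      # Radar-site-centered east/north coordinates [m].
      r_m = rng_nm * 1852.0
      az = np.radians(az_deg)
      east, north = r_m * np.sin(az), r_m * np.cos(az)

      # Finite-difference velocities; a real analysis would smooth or filter first.
      ve, vn = np.gradient(east, t), np.gradient(north, t)
      vd = -np.gradient(alt_ft * 0.3048, t)            # vertical speed [m/s], positive up gives -vd

      groundspeed = np.hypot(ve, vn)
      track_deg = np.degrees(np.arctan2(ve, vn)) % 360.0
      gamma_deg = np.degrees(np.arctan2(-vd, groundspeed))   # flight-path angle

      print("groundspeed [m/s]:", np.round(groundspeed, 1))
      print("track [deg]      :", np.round(track_deg, 1))
      print("flight-path [deg]:", np.round(gamma_deg, 2))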

  19. Analysis of Loss-of-Coolant Accidents in the NBSR

    SciTech Connect

    Baek J. S.; Cheng L.; Diamond, D.

    2014-05-23

    This report documents calculations of the fuel cladding temperature during loss-of-coolant accidents in the NBSR. The probability of a pipe failure is small and procedures exist to minimize the loss of water and assure emergency cooling water flows into the reactor core during such an event. Analysis in the past has shown that the emergency cooling water would provide adequate cooling if the water filled the flow channels within the fuel elements. The present analysis is to determine if there is adequate cooling if the water drains from the flow channels. Based on photographs of how the emergency water flows into the fuel elements from the distribution pan, it can be assumed that this water does not distribute uniformly across the flow channels but rather results in a liquid film flowing downward on the inside of one of the side plates in each fuel element and only wets the edges of the fuel plates. An analysis of guillotine breaks shows the cladding temperature remains below the blister temperature in fuel plates in the upper section of the fuel element. In the lower section, the fuel plates are also cooled by water outside the element that is present due to the hold-up pan and temperatures are lower than in the upper section. For small breaks, the simulation results show that the fuel elements are always cooled on the outside even in the upper section and the cladding temperature cannot be higher than the blister temperature. The above results are predicated on assumptions that are examined in the study to see their influence on fuel temperature.

  20. Offsite Radiological Consequence Analysis for the Bounding Flammable Gas Accident

    SciTech Connect

    CARRO, C.A.

    2003-07-30

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank. The calculation applies reasonably conservative input parameters in accordance with DOE-STD-3009, Appendix A, guidance. Revision 1 incorporates comments received from the Office of River Protection.

  1. Accidents at work and costs analysis: a field study in a large Italian company.

    PubMed

    Battaglia, Massimo; Frey, Marco; Passetti, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

  2. Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company

    PubMed Central

    BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. The field study, carried out in a large Italian company, illustrates technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

  3. Development of the simulation system "IMPACT" for analysis of nuclear power plant severe accidents

    SciTech Connect

    Naitoh, Masanori; Ujita, Hiroshi; Nagumo, Hiroichi

    1997-07-01

    The Nuclear Power Engineering Corporation (NUPEC) has initiated a long-term program to develop the simulation system "IMPACT" for analysis of hypothetical severe accidents in nuclear power plants. IMPACT employs advanced methods of physical modeling and numerical computation, and can simulate a wide spectrum of scenarios ranging from normal operation to hypothetical, beyond-design-basis-accident events. Designed as a large-scale system of interconnected, hierarchical modules, IMPACT's distinguishing features include mechanistic models based on first principles and high-speed simulation on parallel processing computers. The present plan is a ten-year program starting from 1993, consisting of an initial one year of preparatory work followed by three technical phases: Phase-1 for development of a prototype system; Phase-2 for completion of the simulation system, incorporating new achievements from basic studies; and Phase-3 for refinement through extensive verification and validation against test results and available real plant data.

  4. Computational Aerodynamics of Shuttle Orbiter Damage Scenarios in Support of the Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    Bibb, Karen L.; Prabhu, Ramadas K.

    2004-01-01

    In support of the Columbia Accident Investigation, inviscid computations of the aerodynamic characteristics for various Shuttle Orbiter damage scenarios were performed using the FELISA unstructured CFD solver. Computed delta aerodynamics were compared with the reconstructed delta aerodynamics in order to postulate a progression of damage through the flight trajectory. By performing computations at hypervelocity flight and CF4 tunnel conditions, a bridge was provided between wind tunnel testing in Langley's 20-Inch CF4 facility and the flight environment experienced by Columbia during re-entry. The rapid modeling capability of the unstructured methodology allowed the computational effort to keep pace with the wind tunnel and, at times, guide the wind tunnel efforts. These computations provided a detailed view of the flowfield characteristics and the contribution of orbiter components (such as the vertical tail and wing) to aerodynamic forces and moments that were unavailable from wind tunnel testing. The damage scenarios are grouped into three categories. Initially, single and multiple missing full RCC panels were analyzed to determine the effect of damage location and magnitude on the aerodynamics. Next is a series of cases with progressive damage, increasing in severity, in the region of RCC panel 9. The final group is a set of wing leading edge and windward surface deformations that model possible structural deformation of the wing skin due to internal heating of the wing structure. By matching the aerodynamics from selected damage scenarios to the reconstructed flight aerodynamics, a progression of damage that is consistent with the flight data, debris forensics, and wind tunnel data is postulated.

  5. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially in open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the possibility of potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by leakage of pollutants into the water. The risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, the model has been employed to describe the probability of accidents and the risk level. The sensitive causes of pollution accidents have been deduced, and the case in which the sensitive factors are in the state most likely to lead to accidents has also been simulated. PMID:26433361
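
    A Bayesian network of the general kind described above can be illustrated with a toy three-node chain evaluated by brute-force enumeration. The structure (traffic accident, spill, pollution) and all probabilities below are invented and are not the six-root-node model built for the Dianbei Bridge.

      # Hypothetical sketch: a three-node Bayesian network (traffic accident -> spill -> pollution)
      # evaluated by enumeration. Structure and probabilities are invented for illustration.
      from itertools import product

      p_accident = {True: 0.02, False: 0.98}                   # P(traffic accident)
      p_spill = {True: {True: 0.30, False: 0.70},              # P(spill | accident)
                 False: {True: 0.001, False: 0.999}}
      p_pollution = {True: {True: 0.85, False: 0.15},          # P(pollution | spill)
                     False: {True: 0.01, False: 0.99}}

      def joint(a, s, p):
          """Joint probability of one assignment (accident, spill, pollution)."""
          return p_accident[a] * p_spill[a][s] * p_pollution[s][p]

      # Marginal probability of a pollution event.
      p_poll = sum(joint(a, s, True) for a, s in product([True, False], repeat=2))
      # Posterior probability that a truck accident occurred given observed pollution.
      p_acc_given_poll = sum(joint(True, s, True) for s in [True, False]) / p_poll

      print(f"P(pollution)            = {p_poll:.4f}")
      print(f"P(accident | pollution) = {p_acc_given_poll:.3f}")

    Dedicated libraries perform the same enumeration (or faster inference) on much larger networks; the toy version is only meant to show the conditional structure.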

  6. Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives

    SciTech Connect

    Kohout, E.F.; Folga, S.; Mueller, C.; Nabelssi, B.

    1996-03-01

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

  7. Safety analysis results for cryostat ingress accidents in ITER

    SciTech Connect

    Merrill, B.J.; Cadwallader, L.C.; Petti, D.A.

    1997-06-01

    Accidents involving the ingress of air, helium, or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits. 6 refs., 2 figs., 1 tab.

  8. Safety analysis results for cryostat ingress accidents in ITER

    SciTech Connect

    Merrill, B.J.; Cadwallader, L.C.; Petti, D.A.

    1996-12-31

    Accidents involving the ingress of air or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

  9. Safety Analysis Results for Cryostat Ingress Accidents in ITER

    NASA Astrophysics Data System (ADS)

    Merrill, B. J.; Cadwallader, L. C.; Petti, D. A.

    1997-06-01

    Accidents involving the ingress of air, helium, or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

  10. Sensitivity analysis in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1984-01-01

    Information on sensitivity analysis in computational aerodynamics is given in outline, graphical, and chart form. The prediction accuracy of the MCAERO program, a perturbation analysis method, is discussed. A procedure for calculating the perturbation matrix, baseline wing paneling for perturbation analysis test cases, and applications of an inviscid sensitivity matrix are among the topics covered.

  11. Chiropractic treatment of patients in motor vehicle accidents: a statistical analysis

    PubMed Central

    Dies, Stephen; Strapp, J Walter

    1992-01-01

    Motor vehicle accidents (MVA) are a major cause of spinal injuries treated by chiropractors. In this study the files of one chiropractor were reviewed retrospectively to generate a data base on the MVA cases (n = 149). The effect of age, sex, vehicle damage, symptoms and concurrent physiotherapy on the dependent variables of number of treatments, improvement and requirement for ongoing treatment was computed using an analysis of variance. Overall the average number of treatments given was 14.2. Patients who complained of headache or low back pain required more treatments than average. Improvement level was lowered by delay in seeking treatment, the presence of uncomplicated nausea and advancing age. Ongoing treatment to relieve persistent pain was required in 40.2 percent of the cases. None of the factors studied had a significant effect on this variable. The results of this study are comparable to those reported in the medical literature.
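
    The analysis of variance referred to above can be illustrated with a minimal one-way example on invented data; the grouping by age, the Poisson-distributed treatment counts, and the use of scipy are assumptions for illustration, not the study's data or procedure.

      # Hypothetical sketch: one-way ANOVA of treatment counts across age groups (invented data).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Invented numbers of chiropractic treatments for three age groups of MVA patients.
      young = rng.poisson(12, size=50)
      middle = rng.poisson(14, size=60)
      older = rng.poisson(17, size=39)

      f_stat, p_value = stats.f_oneway(young, middle, older)
      print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
      print("group means:", [round(float(np.mean(g)), 1) for g in (young, middle, older)])

    The study's design also included sex, vehicle damage, symptoms, and concurrent physiotherapy as factors, which would call for a multi-factor analysis rather than this one-way toy case.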

  12. Conceptual design loss-of-coolant accident analysis for the Advanced Neutron Source reactor

    SciTech Connect

    Chen, N.C.J.; Wendel, M.W.; Yoder, G.L. Jr. )

    1994-01-01

    A RELAP5 system model of the Advanced Neutron Source Reactor has been developed for performing conceptual safety analysis report calculations. To better represent the thermal-hydraulic behavior of the core, three specific changes were implemented in the RELAP5 computer code: a turbulent forced-convection heat transfer correlation, a critical heat flux (CHF) correlation, and an interfacial drag correlation. The model consists of the core region, the heat exchanger loop region, and the pressurizing/letdown system region. Results for three loss-of-coolant accident analyses are presented: (a) an instantaneous double-ended guillotine (DEG) core outlet break with a cavitating venturi installed downstream of the core, (b) a core pressure boundary tube outer wall rupture, and (c) a DEG core inlet break with a finite break-formation time. The results show that the core can survive without exceeding the flow excursion or CHF thermal limits at a 95% probability level if the proper mitigation options are provided.

  13. Action Plan for updated Chapter 15 Accident Analysis in the SRS Production Reactor SAR

    SciTech Connect

    Hightower, N.T. III; Burnett, T.W.

    1989-11-15

    This report describes the Action Plan for the upgrade of the Chapter 15 Accident Analysis in the SRS Production Reactor SAR required for K-Restart. This Action Plan will be updated periodically to reflect task accomplishments and issue resolutions.

  14. School sports accidents: analysis of causes, modes, and frequencies.

    PubMed

    Kelm, J; Ahlhelm, F; Pape, D; Pitsch, W; Engel, C

    2001-01-01

    About 5% of all school children are seriously injured during physical education every year. Because of its influence on children's attitude toward sports and the economic aspects, an evaluation of causes and medical consequences is necessary. In this study, 213 school sports accidents were investigated. Besides diagnosis, the localization of injuries, as well as the duration of the sick leave were documented. Average age of injured students was 13 years. Most of the injured students blamed themselves for the accident. The most common injuries were sprains, contusions, and fractures. Main reasons for the accidents were faults in basic motion training. Playing soccer and basketball were the most frequent reasons for injuries. The upper extremity was more frequently involved than the lower extremity. Sports physicians and teachers should work out a program outlining the individual needs and capabilities of the injured students to reintegrate them into physical education. PMID:11242243

  15. Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

    2005-01-01

    NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most-likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the VTP from the aircraft. Additionally, analysis results indicated that failure initiates at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads that were at minimum 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times design limit load without catastrophic failure. The maximum loading during the accident was shown to significantly exceed the certification requirement. Thus, the structure appeared to perform in a manner consistent with its design and certification, and failure is attributed to VTP loads greater than expected.

  16. Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence

    NASA Technical Reports Server (NTRS)

    Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

    2004-01-01

    Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

  17. Computer analysis of railcar vibrations

    NASA Technical Reports Server (NTRS)

    Vlaminck, R. R.

    1975-01-01

    Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effects on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle are compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.

  18. ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS

    SciTech Connect

    WILLIAMS, J.C.

    2003-11-15

    This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, ''K Basins Safety Analysis Report'' and Revision 4 of HNF-SD-SNF-TSR-001, ''Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins''. These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

  19. Exploring the use of computer games and virtual reality in exposure therapy for fear of driving following a motor vehicle accident.

    PubMed

    Walshe, David G; Lewis, Elizabeth J; Kim, Sun I; O'Sullivan, Kathleen; Wiederhold, Brenda K

    2003-06-01

    Specific phobia, situational type (driving), induced by an accident (accident phobia) occurs in 18-38% of those involved in a vehicular accident of sufficient severity to warrant referral to the emergency department of a general hospital. The objective was to investigate, in an open study, the effectiveness of the combined use of computer-generated environments involving driving games (game reality [GR]) and a virtual reality (VR) driving environment in exposure therapy for the treatment of driving phobia following a motor vehicle accident (MVA). Fourteen subjects who met DSM-IV criteria for Simple Phobia/Accident Phobia and were referred from the emergency department of a general hospital were exposed to a Virtual Driving Environment (Hanyang University Driving Phobia Environment) and computer driving games (London Racer/Midtown Madness/Rally Championship). Patients who experienced "immersion" (i.e., a sense of presence with heightened anxiety) in one of the driving simulations (defined as an increase in SUD ratings of 3 and/or an increase of heart rate > 15 BPM in a 1-h trial session of computer simulation driving) were exposed to a cognitive behavioral program of up to 12 1-h sessions involving graded driving simulation tasks with self-monitoring, physiological feedback, diaphragmatic breathing, and cognitive reappraisal. Subjects were assessed at the beginning and end of therapy with measurements of physiological responsivity (heart rate), subjective ratings of distress (SUD), rating scales for severity of fear of driving (FDI), posttraumatic stress disorder (CAPS), and depression (HAM-D), and achievement of target behaviors. Of all patients, 7/14 (50%) became immersed in the driving environments. This immersed group (n = 7) completed the exposure program. Pre- and post-treatment comparisons showed significant post-treatment reductions on all measures: SUDS (p = 0.008), FDI (p = 0.008), CAPS (p = 0.008), HR (p = 0.008), CAPS (p = 0.008), HAM-D (p = 0
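
    The immersion criterion quoted in the abstract (an increase in SUD ratings of 3, read here as at least 3, and/or a heart-rate increase of more than 15 BPM during the 1-h trial session) is a simple decision rule. A minimal sketch of how it could be coded is shown below; the thresholds are taken directly from the abstract, while the function name and example values are illustrative only.

    ```python
    # Minimal sketch of the immersion screening rule described in the abstract:
    # a subject is treated as "immersed" if SUD rises by 3 or more units and/or
    # heart rate rises by more than 15 beats per minute during the trial session.

    def is_immersed(sud_baseline, sud_peak, hr_baseline_bpm, hr_peak_bpm):
        sud_increase = sud_peak - sud_baseline
        hr_increase = hr_peak_bpm - hr_baseline_bpm
        return sud_increase >= 3 or hr_increase > 15

    # Example: a subject whose SUD rose from 2 to 6 during simulated driving
    print(is_immersed(sud_baseline=2, sud_peak=6, hr_baseline_bpm=72, hr_peak_bpm=80))  # True
    ```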

  20. Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion

    SciTech Connect

    Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

    2012-09-30

    Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

  1. Accident sequence precursor analysis level 2/3 model development

    SciTech Connect

    Lui, C.H.; Galyean, W.J.; Brownson, D.A.

    1997-02-01

    The US Nuclear Regulatory Commission's Accident Sequence Precursor (ASP) program currently uses simple Level 1 models to assess the conditional core damage probability for operational events occurring in commercial nuclear power plants (NPP). Since not all accident sequences leading to core damage will result in the same radiological consequences, it is necessary to develop simple Level 2/3 models that can be used to analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude of the resulting radioactive releases to the environment, and calculate the consequences associated with these releases. The simple Level 2/3 model development work was initiated in 1995, and several prototype models have been completed. Once developed, these simple Level 2/3 models are linked to the simple Level 1 models to provide risk perspectives for operational events. This paper describes the methods implemented for the development of these simple Level 2/3 ASP models, and the linkage process to the existing Level 1 models.

  2. Advanced accident sequence precursor analysis level 2 models

    SciTech Connect

    Galyean, W.J.; Brownson, D.A.; Rempe, J.L.

    1996-03-01

    The U.S. Nuclear Regulatory Commission Accident Sequence Precursor program pursues the ultimate objective of performing risk significant evaluations on operational events (precursors) occurring in commercial nuclear power plants. To achieve this objective, the Office of Nuclear Regulatory Research is supporting the development of simple probabilistic risk assessment models for all commercial nuclear power plants (NPP) in the U.S. Presently, only simple Level 1 plant models have been developed, which estimate core damage frequencies. In order to provide a true risk perspective, the consequences associated with postulated core damage accidents also need to be considered. With the objective of performing risk evaluations in an integrated and consistent manner, a linked event tree approach that propagates the front-end results to the back end was developed. This approach utilizes simple plant models that analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude and timing of a radioactive release to the environment, and calculate the consequences for a given release. Detailed models and results from previous studies, such as the NUREG-1150 study, are used to quantify these simple models. These simple models are then linked to the existing Level 1 models, and are evaluated using the SAPHIRE code. To demonstrate the approach, prototypic models have been developed for a boiling water reactor, Peach Bottom, and a pressurized water reactor, Zion.
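
    The linked event tree approach described above propagates Level 1 core damage frequencies through containment (Level 2) branch probabilities to obtain release category frequencies. The sketch below shows the basic arithmetic of that linkage with invented, purely illustrative numbers; the actual plant models are quantified from detailed studies such as NUREG-1150 and evaluated with SAPHIRE, not with this code.

    ```python
    # Illustrative linkage of a Level 1 result to Level 2 containment event tree
    # branches.  All numbers are invented for illustration; real ASP models are
    # quantified from detailed studies (e.g., NUREG-1150) and run in SAPHIRE.

    core_damage_frequency = 2.0e-5  # per reactor-year (illustrative)

    # Conditional probabilities of containment outcomes given core damage
    containment_branches = {
        "early_containment_failure": 0.02,
        "late_containment_failure": 0.10,
        "containment_bypass": 0.01,
        "containment_intact": 0.87,
    }

    release_category_frequency = {
        outcome: core_damage_frequency * p for outcome, p in containment_branches.items()
    }

    for outcome, freq in release_category_frequency.items():
        print(f"{outcome:28s} {freq:.2e} /yr")
    ```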

  3. Approaches to accident analysis in recent US Department of Energy environmental impact statements

    SciTech Connect

    Mueller, C.; Folga, S.; Nabelssi, B.

    1996-12-31

    A review of accident analyses in recent US Department of Energy (DOE) Environmental Impact Statements (EISs) was conducted to evaluate the consistency among approaches and to compare these approaches with existing DOE guidance. The review considered several components of an accident analysis: the overall scope, which in turn should reflect the scope of the EIS; the spectrum of accidents considered; the methods and assumptions used to determine frequencies or frequency ranges for the accident sequences; and the assumption and technical bases for developing radiological and chemical atmospheric source terms and for calculating the consequences of airborne releases. The review also considered the range of results generated with respect to impacts on various worker and general populations. In this paper, the findings of these reviews are presented and methods recommended for improving consistency among EISs and bringing them more into line with existing DOE guidance.

  4. Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

    1996-12-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.
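
    For readers unfamiliar with how facility accident source terms such as those summarized above are typically built up, the sketch below applies the standard five-factor formula used in DOE facility safety analysis (source term = MAR x DR x ARF x RF x LPF). The factor values are placeholders; the WM PEIS used scenario-specific values documented in its appendices, and its exact formulation may differ.

    ```python
    # Standard five-factor source term formula commonly used in DOE facility
    # accident analysis.  The numerical values below are placeholders only; the
    # WM PEIS source terms were built from scenario-specific data.

    def source_term(mar, dr, arf, rf, lpf):
        """
        mar : material at risk (e.g., curies or grams of a nuclide/chemical)
        dr  : damage ratio (fraction of MAR affected by the accident)
        arf : airborne release fraction
        rf  : respirable fraction
        lpf : leak path factor (fraction escaping the facility)
        """
        return mar * dr * arf * rf * lpf

    # Illustrative: 1000 Ci at risk, 10% damaged, 1e-3 airborne, 50% respirable,
    # 10% escaping confinement.
    print(source_term(mar=1000.0, dr=0.1, arf=1e-3, rf=0.5, lpf=0.1), "Ci released (respirable)")
    ```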

  5. BESAFE II: Accident safety analysis code for MFE reactor designs

    NASA Astrophysics Data System (ADS)

    Sevigny, Lawrence Michael

    The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from an economic and an environmental and safety standpoint. It is the latter which is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns equate to a design's behavior during a worst-case accident scenario which is the loss of coolant accident (LOCA). In this dissertation, we examine the behavior of MFE devices during a LOCA and how this behavior relates to the safety characteristics of the machine; in particular the acute, whole-body, early dose. In doing so, we have produced an accident safety code, BESAFE II, now available to the fusion reactor design community. The Appendix constitutes the User's Manual for BESAFE II. The theory behind early dose calculations including the mobilization of activation products is presented in Chapter 2. Since mobilization of activation products is a strong function of temperature, it becomes necessary to calculate the thermal response of a design during a LOCA in order to determine the fraction of the activation products which are mobilized and thus become the source for the dose. The code BESAFE II is designed to determine the temperature history of each region of a design and determine the resulting mobilization of activation products at each point in time during the LOCA. The BESAFE II methodology is discussed in Chapter 4, followed by demonstrations of its use for two reference design cases: a PCA-Li tokamak and a SiC-He tokamak. Of these two cases, it is shown that the SiC-He tokamak is a better design from an accident safety standpoint than the PCA-Li tokamak. It is also found that doses derived from temperature-dependent mobilization data are different than those predicted using set mobilization categories such as those that involve Piet fractions. This demonstrates the need for more experimental data on fusion materials. The possibility for future improvements and modifications

  6. Analysis of fission product revaporization in a BWR Reactor Coolant System during a station blackout accident

    SciTech Connect

    Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

    1988-01-01

    This paper presents an analysis of fission product revaporization from the Reactor Coolant System (RCS) following the Reactor Pressure Vessel (RPV) failure. The station blackout accident in a BWR Mark I Power Plant was considered. The TRAPMELT3 models for vaporization, chemisorption, and the decay heating of RCS structures and gases were used and extended beyond the RPV failure in the analysis. The RCS flow models based on the density-difference or pressure-difference between the RCS and containment pedestal region were developed to estimate the RCS outflow which carries the revaporized fission product to the containment. A computer code called REVAP was developed for the analysis. The REVAP code was incorporated with the MARCH, TRAPMELT3 and NAUA codes from the Source Term Code Package (STCP) to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors in determining the magnitude of revaporization and subsequent release of the volatile fission product into the environment. 6 refs., 8 figs.

  7. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of the main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  8. Injury patterns of seniors in traffic accidents: A technical and medical analysis

    PubMed Central

    Brand, Stephan; Otte, Dietmar; Mueller, Christian Walter; Petri, Maximilian; Haas, Philipp; Stuebig, Timo; Krettek, Christian; Haasper, Carl

    2012-01-01

    AIM: To investigate the actual injury situation of seniors in traffic accidents and to evaluate the different injury patterns. METHODS: Injury data, environmental circumstances and crash circumstances of accidents were collected shortly after the accident event at the scene. With these data, a technical and medical analysis was performed, including Injury Severity Score, Abbreviated Injury Scale and Maximum Abbreviated Injury Scale. The method of data collection is named the German In-Depth Accident Study and can be seen as representative. RESULTS: A total of 4430 injured seniors in traffic accidents were evaluated. The incidence of sustaining severe injuries to the extremities, head and maxillofacial region was significantly higher in the group of elderly people compared to younger age groups (P < 0.05). The number of accident-related injuries was higher in the group of seniors compared to other groups. CONCLUSION: Seniors are more likely to be involved in traffic accidents and to sustain serious to severe injuries compared to other groups. PMID:23173111

  9. Analysis of occupational accidents: prevention through the use of additional technical safety measures for machinery

    PubMed Central

    Dźwiarek, Marek; Latała, Agata

    2016-01-01

    This article presents an analysis of results of 1035 serious and 341 minor accidents recorded by Poland's National Labour Inspectorate (PIP) in 2005–2011, in view of their prevention by means of additional safety measures applied by machinery users. Since the analysis aimed at formulating principles for the application of technical safety measures, the analysed accidents should bear additional attributes: the type of machine operation, technical safety measures and the type of events causing injuries. The analysis proved that the executed tasks and injury-causing events were closely connected and there was a relation between casualty events and technical safety measures. In the case of tasks consisting of manual feeding and collecting materials, the injuries usually occur because of the rotating motion of tools or crushing due to a closing motion. Numerous accidents also happened in the course of supporting actions, like removing pollutants, correcting material position, cleaning, etc. PMID:26652689

  10. Analysis of occupational accidents: prevention through the use of additional technical safety measures for machinery.

    PubMed

    Dźwiarek, Marek; Latała, Agata

    2016-01-01

    This article presents an analysis of results of 1035 serious and 341 minor accidents recorded by Poland's National Labour Inspectorate (PIP) in 2005-2011, in view of their prevention by means of additional safety measures applied by machinery users. Since the analysis aimed at formulating principles for the application of technical safety measures, the analysed accidents should bear additional attributes: the type of machine operation, technical safety measures and the type of events causing injuries. The analysis proved that the executed tasks and injury-causing events were closely connected and there was a relation between casualty events and technical safety measures. In the case of tasks consisting of manual feeding and collecting materials, the injuries usually occur because of the rotating motion of tools or crushing due to a closing motion. Numerous accidents also happened in the course of supporting actions, like removing pollutants, correcting material position, cleaning, etc. PMID:26652689

  11. Risk-based Analysis of Construction Accidents in Iran During 2007-2011-Meta Analyze Study

    PubMed Central

    AMIRI, Mehran; ARDESHIR, Abdollah; FAZEL ZARANDI, Mohammad Hossein

    2014-01-01

    Background: The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. Methods: The Iranian Social Security Organization (ISSO) accident database, containing 21,864 cases between the years 2007-2011, was applied in this study. In the next step, Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of the day, seasonal analysis, regional considerations, type of accident, and body parts affected. Results: Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. Conclusion: It is recommended that laborers, young workers, and weekend and night shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be given to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions. PMID:26005662

  12. [Comparative analysis of the radionuclide composition in fallout after the Chernobyl and the Fukushima accidents].

    PubMed

    Kotenko, K V; Shinkarev, S M; Abramov, Iu V; Granovskaia, E O; Iatsenko, V N; Gavrilin, Iu I; Margulis, U Ia; Garetskaia, O S; Imanaka, T; Khoshi, M

    2012-01-01

    The nuclear accident that occurred at the Fukushima Dai-ichi Nuclear Power Plant (NPP) (March 11, 2011), like the accident at the Chernobyl NPP (April 26, 1986), is rated at level 7 of the INES. It is of interest to analyze the radionuclide composition of the fallout following both accidents. The results of spectrometric measurements were used in this comparative analysis. Two areas following the Chernobyl accident were considered: (1) the near zone of the fallout, the Belarusian part of the central spot extending up to 60 km around the Chernobyl NPP, and (2) the far zone of the fallout, the "Gomel-Mogilev" spot centered 200 km to the north-northeast of the damaged reactor. In the case of the Fukushima accident, the near zone up to about 60 km was considered. The comparative analysis has been done with respect to refractory radionuclides (95Zr, 95Nb, 141Ce, 144Ce), as well as to the intermediate and volatile radionuclides 103Ru, 106Ru, 131I, 134Cs, 137Cs, 140La, 140Ba, and the results of this comparison are discussed. With respect to exposure of the public, the most important radionuclides are 131I and 137Cs. For both accidents, the ratios of 131I/137Cs in the considered soil samples are in similar ranges: (3-50) for the Chernobyl samples and (5-70) for the Fukushima samples. As for the Chernobyl accident, a clear tendency has been identified for the Fukushima accident that the ratio of 131I/137Cs in the fallout decreases as the ground deposition density of 137Cs increases within the trace of the radioactive cloud. This appears to be a universal tendency for the ratio of 131I/137Cs versus the 137Cs ground deposition density in the fallout along the trace of a radioactive cloud resulting from a severe accident at an NPP with radionuclide releases into the environment. This tendency is important for an objective reconstruction of 131I fallout based on the results of 137Cs measurements of soil samples carried out at

  13. Application of Latin hypercube sampling to RADTRAN 4 truck accident risk sensitivity analysis

    SciTech Connect

    Mills, G.S.; Neuhauser, K.S.; Kanipe, F.L.

    1994-12-31

    The sensitivity of calculated dose estimates to various RADTRAN 4 inputs is an available output for incident-free analysis because the defining equations are linear and sensitivity to each variable can be calculated in closed mathematical form. However, the necessary linearity is not characteristic of the equations used in calculation of accident dose risk, making a similar tabulation of sensitivity for RADTRAN 4 accident analysis impossible. Therefore, a study of the sensitivity of accident risk results to variation of input parameters was performed using representative routes, isotopic inventories, and packagings. It was determined that, of the approximately two dozen RADTRAN 4 input parameters pertinent to accident analysis, only a subset of five or six has significant influence on typical analyses or is subject to random uncertainties. These five or six variables were selected as candidates for Latin Hypercube Sampling applications. To make the effect of input uncertainties on calculated accident risk more explicit, distributions and limits were determined for two variables which had approximately proportional effects on calculated doses: Pasquill Category probability (PSPROB) and link population density (LPOPD). These distributions and limits were used as input parameters to Sandia's Latin Hypercube Sampling code to generate 50 sets of RADTRAN 4 input parameters, used together with point estimates of other necessary inputs to calculate 50 observations of estimated accident dose risk. Tabulations of the RADTRAN 4 accident risk input variables and their influence on output, plus illustrative examples of the LHS calculations for truck transport situations that are typical of past experience, will be presented.
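
    As an illustration of the sampling step described above, the sketch below draws 50 Latin hypercube samples of two input variables standing in for Pasquill category probability (PSPROB) and link population density (LPOPD). The distributions and bounds are invented placeholders, not the ones developed in the study, and the sampled values would still have to be merged with the other point-estimate RADTRAN 4 inputs.

    ```python
    import numpy as np

    # Minimal Latin hypercube sampler for the two RADTRAN 4 inputs discussed
    # above.  Distribution bounds are placeholders; the study derived its own
    # distributions and limits.
    rng = np.random.default_rng(42)
    n_samples = 50

    def latin_hypercube(n, low, high):
        """Draw n stratified-uniform samples on [low, high], one per equal-width stratum."""
        strata = (np.arange(n) + rng.random(n)) / n      # one point in each of n strata
        return low + rng.permutation(strata) * (high - low)

    psprob = latin_hypercube(n_samples, 0.05, 0.35)      # Pasquill category probability (placeholder range)
    lpopd  = latin_hypercube(n_samples, 10.0, 500.0)     # link population density, persons/km^2 (placeholder)

    input_sets = np.column_stack([psprob, lpopd])        # 50 input sets for repeated RADTRAN 4 runs
    print(input_sets.shape)                              # (50, 2)
    ```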

  14. Computer vision in microstructural analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  15. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

  16. MELCOR code analysis of a severe accident LOCA at Peach Bottom Plant

    SciTech Connect

    Carbajo, J.J.

    1993-01-01

    A design-basis loss-of-coolant accident (LOCA) concurrent with complete loss of the emergency core cooling systems (ECCSs) has been analyzed for the Peach Bottom Atomic Power Station Unit 2 using the MELCOR code, version 1.8.1. The purpose of this analysis is to calculate best-estimate times for the important events of this accident sequence and best-estimate source terms. Calculated pressures and temperatures at the beginning of the transient have been compared to results from the Peach Bottom final safety analysis report (FSAR). MELCOR-calculated source terms have been compared to source terms reported in the NUREG-1465 draft.

  17. Fission product transport analysis in a loss of decay heat removal accident at Browns Ferry

    SciTech Connect

    Wichner, R.P.; Weber, C.F.; Hodge, S.A.; Beahm, E.C.; Wright, A.L.

    1984-01-01

    This paper summarizes an analysis of the movement of noble gases, iodine, and cesium fission products within the Mark-I containment BWR reactor system represented by Browns Ferry Unit 1 during a postulated accident sequence initiated by a loss of decay heat removal (DHR) capability following a scram. The event analysis showed that this accident could be brought under control by various means, but the sequence with no operator action ultimately leads to containment (drywell) failure followed by loss of water from the reactor vessel, core degradation due to overheating, and reactor vessel failure with attendant movement of core debris onto the drywell floor.

  18. A computational tool based on voxel geometry for dose reconstruction of a radiological accident due to external exposure.

    PubMed

    Lemosquet, A; Clairand, I; de Carlan, L; Franck, D; Aubineau-Lanièce, I; Bottollier-Depois, J-F

    2004-01-01

    In the case of overexposure to ionising radiation, estimation of the absorbed dose in the organism is an important indicator for evaluating the biological consequences of this exposure. The physical dosimetry approach is based either on real reconstruction of the accident, using physical phantoms, or on calculation techniques. Tools using Monte Carlo simulations associated with geometric models are very powerful since they offer the possibility to simulate faithfully the victim and the environment for dose calculations in various accidental situations. This work presents a new computational tool, called SESAME, dedicated to dose reconstruction of radiological accidents based on anthropomorphic voxel phantoms built from real medical images of the victim in association with the MCNP Monte Carlo code. The tool was, as a first step, validated for neutrons by experimental means using a physical tissue-equivalent phantom. PMID:15353689

  19. RELAP5 Application to Accident Analysis of the NIST Research Reactor

    SciTech Connect

    Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.; Diamond, D.

    2012-03-18

    Detailed safety analyses have been performed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent analysis of the primary system is determined with a RELAP5 transient analysis model that includes the reactor vessel, the pump, heat exchanger, fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results has been conducted to evaluate minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and no damage to the fuel will occur because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.
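
    The minimum-CHFR evaluation described above is a post-processing step: the local heat flux from RELAP5 is compared against a critical heat flux predicted by the Sudo-Kaminaga correlation at each axial node and time step, and the minimum ratio over the transient is reported. The sketch below shows only the bookkeeping of that search; the critical heat flux values are placeholders standing in for the correlation, which is not reproduced here.

    ```python
    # Sketch of the minimum-CHFR bookkeeping described above.  The critical heat
    # flux values are placeholders standing in for the Sudo-Kaminaga correlation,
    # which is not reproduced here.

    def minimum_chfr(local_heat_flux, critical_heat_flux):
        """
        Both arguments are dicts keyed by (time_step, axial_node), values in W/m^2.
        Returns the minimum critical heat flux ratio (CHFR) over the transient.
        """
        return min(critical_heat_flux[key] / q
                   for key, q in local_heat_flux.items() if q > 0.0)

    # Illustrative data: two time steps, three axial nodes (numbers are invented).
    q_local = {(0, 0): 1.2e6, (0, 1): 1.8e6, (0, 2): 1.5e6,
               (1, 0): 1.4e6, (1, 1): 2.1e6, (1, 2): 1.6e6}
    q_chf   = {key: 5.0e6 for key in q_local}   # placeholder for correlation output

    print(f"minimum CHFR = {minimum_chfr(q_local, q_chf):.2f}")   # ~2.38 for these numbers
    ```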

  20. Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the "Maximum Credible Accident" concept

    SciTech Connect

    Ricci, E.; McLean, R.B.

    1988-09-01

    The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactivities of uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.

  1. Computer program predicts thermal and flow transients experienced in a reactor loss- of-flow accident

    NASA Technical Reports Server (NTRS)

    Hale, C. J.

    1967-01-01

    Program analyzes the consequences of a loss-of-flow accident in the primary cooling system of a heterogeneous light-water moderated and cooled nuclear reactor. It produces a 36 x 41 (x,y) temperature matrix which includes fuel surface temperatures relative to the time the pump power was lost.

  2. Analysis of fission product release behavior during the TMI-2 accident

    SciTech Connect

    Petti, D. A.; Adams, J. P.; Anderson, J. L.; Hobbins, R. R.

    1987-01-01

    An analysis of fission product release during the Three Mile Island Unit 2 (TMI-2) accident has been initiated to provide an understanding of fission product behavior that is consistent with both the best estimate accident scenario and fission product results from the ongoing sample acquisition and examination efforts. ''First principles'' fission product release models are used to describe release from intact, disrupted, and molten fuel. Conclusions relating to fission product release, transport, and chemical form are drawn. 35 refs., 12 figs., 7 tabs.

  3. RADTRAN 5: A computer code for transportation risk analysis

    SciTech Connect

    Neuhauser, K. S.; Kanipe, F. L.

    1991-01-01

    RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air.
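
    The accident-risk module mentioned above is, in essence, an expectation over accident severity categories: each route segment contributes accident probability times conditional severity probability times the consequence for that severity. The sketch below shows that generic structure; it is not the RADTRAN 5 implementation, and all numbers are placeholders.

    ```python
    # Generic expected-value structure of a transportation accident risk module.
    # This is a sketch of the concept only, not the RADTRAN 5 implementation;
    # all numbers are placeholders.

    def segment_accident_risk(accident_rate_per_km, length_km, severity_fractions, doses_person_sv):
        """Expected population dose risk (person-Sv) for one route segment."""
        accidents = accident_rate_per_km * length_km
        return accidents * sum(f * d for f, d in zip(severity_fractions, doses_person_sv))

    # Three illustrative severity categories: minor, moderate, severe.
    risk = segment_accident_risk(
        accident_rate_per_km=3.0e-7,            # accidents per truck-km (placeholder)
        length_km=250.0,                         # segment length
        severity_fractions=[0.9, 0.09, 0.01],    # conditional severity split (placeholder)
        doses_person_sv=[0.0, 0.5, 20.0],        # consequence per accident by severity (placeholder)
    )
    print(f"segment dose risk ~ {risk:.2e} person-Sv per shipment")
    ```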

  4. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    SciTech Connect

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  5. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
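
    The record above describes a PERL script; as a language-neutral illustration, the sketch below does the analogous job in Python: it prompts for a directory root, an output file name, and a depth limit, then writes one tab-separated line of metadata per file (skipping directories) so the result can be loaded into a spreadsheet. The field names and prompts are chosen to mirror the description, not taken from the original script.

    ```python
    import os

    # Python analogue of the described forensic collection step: walk a directory
    # tree to a fixed depth and record per-file metadata (no directories) as
    # tab-separated text suitable for import into a spreadsheet.

    def collect_file_metadata(root, out_path, max_depth):
        root = os.path.abspath(root)
        base_depth = root.rstrip(os.sep).count(os.sep)
        with open(out_path, "w", encoding="utf-8") as out:
            out.write("path\tsize_bytes\towner_uid\tctime\tatime\tmtime\n")
            for dirpath, dirnames, filenames in os.walk(root):
                depth = dirpath.rstrip(os.sep).count(os.sep) - base_depth
                if depth >= max_depth:
                    dirnames[:] = []          # do not descend past the requested level
                for name in filenames:
                    full = os.path.join(dirpath, name)
                    try:
                        st = os.lstat(full)   # lstat: do not follow symlinks
                    except OSError:
                        continue              # unreadable entries are skipped
                    out.write(f"{full}\t{st.st_size}\t{st.st_uid}\t"
                              f"{st.st_ctime}\t{st.st_atime}\t{st.st_mtime}\n")

    if __name__ == "__main__":
        collect_file_metadata(
            root=input("Root of directory tree: "),
            out_path=input("Output file name: "),
            max_depth=int(input("Number of subtree levels: ")),
        )
    ```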

  6. Analysis of concrete containment structures under severe accident loading conditions

    SciTech Connect

    Porter, V.L.

    1993-12-31

    One of the areas of current interest in the nuclear power industry is the response of containment buildings to internal pressures that may exceed design pressure levels. Evaluating the response of structures under these conditions requires computing beyond the design load to the ultimate load of the containment. For concrete containments, this requirement means computing through severe concrete cracking and into the regime of widespread plastic rebar and/or tendon response. In this regime of material response, an implicit code can have trouble converging. This paper describes some of the author's experiences with Version 5.2 of ABAQUS Standard and the ABAQUS concrete model in computing the axisymmetric response of a prestressed concrete containment to ultimate global structural failure under high internal pressures. The effects of varying the tension stiffening parameter in the concrete material model and variations of the parameters for the CONTROLS option are discussed.

  7. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.

  8. SACO-1: a fast-running LMFBR accident-analysis code

    SciTech Connect

    Mueller, C.J.; Cahalan, J.E.; Vaurio, J.K.

    1980-01-01

    SACO is a fast-running computer code that simulates hypothetical accidents in liquid-metal fast breeder reactors to the point of permanent subcriticality or to the initiation of a prompt-critical excursion. In the tradition of the SAS codes, each subassembly is modeled by a representative fuel pin with three distinct axial regions to simulate the blanket and core regions. However, analytic and integral models are used wherever possible to cut down the computing time and storage requirements. The physical models and basic equations are described in detail. Comparisons of SACO results to analogous SAS3D results comprise the qualifications of SACO and are illustrated and discussed.

  9. Computational Assessment of the GT-MHR Graphite Core Support Structural Integrity in Air-Ingress Accident Condition

    SciTech Connect

    Jong B. Lim; Eung S. Kim; Chang H. Oh; Richard R. Schultz; David A. Petti

    2008-10-01

    The objective of this project was to perform stress analysis for the graphite support structures of the General Atomics' 600 MWth GT-MHR prismatic core design using ABAQUS® (ver. 6.75) to assess their structural integrity in air-ingress accident conditions, where the structure weakens over time due to oxidation damage. The graphite support structures of the prismatic-type GT-MHR were analyzed based on the change of temperature, burn-off, and corrosion depth during the accident period predicted by GAMMA, a multi-dimensional gas multi-component mixture analysis code developed in the Republic of Korea (ROK)/United States (US) International Nuclear Energy Research Initiative (I-NERI) project. Both the loading and thermal stresses were analyzed, but the thermal stress was not significant, leaving the loading stress as the major factor. The mechanical strengths are exceeded between 11 and 11.5 days after the loss-of-coolant accident (LOCA), corresponding to 5.5 to 6 days after the start of natural convection.

  10. Analysis of Accidents at the Pakistan Research Reactor-1 Using Proposed Mixed-Fuel (HEU and LEU) Core

    SciTech Connect

    Bokhari, Ishtiaq H.

    2004-12-15

    The Pakistan Research Reactor-1 (PARR-1) was converted from highly enriched uranium (HEU) to low-enriched uranium (LEU) fuel in 1991. The reactor is running successfully, with an upgraded power level of 10 MW. To save money on the purchase of costly fresh LEU fuel elements, the use of less burnt HEU spent fuel elements along with the present LEU fuel elements is being considered. The proposal calls for the HEU fuel elements to be placed near the thermal column to gain the required excess reactivity. In the present study the safety analysis of a proposed mixed-fuel core has been carried out at a calculated steady-state power level of 9.8 MW. Standard computer codes and correlations were employed to compute various parameters. Initiating events in reactivity-induced accidents involve various modes of reactivity insertion, namely, start-up accident, accidental drop of a fuel element on the core, flooding of a beam tube with water, and removal of an in-pile experiment during reactor operation. For each of these transients, time histories of reactor power, energy released, temperature, and reactivity were determined.
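
    Power time histories for reactivity-insertion transients of the kind listed above are classically obtained from the point kinetics equations with delayed neutrons. The sketch below integrates a one-delayed-group version for a small step insertion; it is a pedagogical illustration with placeholder kinetics data, and it is not the standard codes or correlations actually employed in the PARR-1 analysis.

    ```python
    import numpy as np

    # One-delayed-group point kinetics for a small step reactivity insertion.
    # Pedagogical sketch only, with placeholder kinetics data; it is not the
    # methodology or data of the PARR-1 safety analysis.

    beta = 0.0065          # delayed neutron fraction (placeholder)
    lam = 0.08             # effective precursor decay constant, 1/s (placeholder)
    Lambda = 5.0e-5        # prompt neutron generation time, s (placeholder)
    rho = 0.001            # step reactivity insertion (~0.15 dollars)

    dt, t_end = 1.0e-4, 5.0
    n_steps = int(t_end / dt)

    P = 1.0                          # relative power
    C = beta * P / (lam * Lambda)    # equilibrium precursor concentration

    power_history = np.empty(n_steps)
    for i in range(n_steps):
        dP = ((rho - beta) / Lambda) * P + lam * C
        dC = (beta / Lambda) * P - lam * C
        P += dP * dt                 # explicit Euler; adequate for this small step
        C += dC * dt
        power_history[i] = P

    print(f"relative power after {t_end:.0f} s: {power_history[-1]:.2f}")
    ```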

  11. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines

    PubMed Central

    Baka, Aikaterini D.; Uzunoglu, Nikolaos K.

    2014-01-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

  12. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines.

    PubMed

    Baka, Aikaterini D; Uzunoglu, Nikolaos K

    2014-09-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

  13. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
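
    The sensitivity techniques named above (Latin hypercube sampling followed by partial correlation and stepwise regression) reduce, in their simplest form, to regressing the rank-transformed output on the rank-transformed inputs and comparing standardized coefficients. The sketch below shows that reduced form on synthetic data; it is not the MACCS analysis itself, stepwise variable selection is omitted, and the variable names are invented.

    ```python
    import numpy as np

    # Reduced form of the LHS + rank-regression sensitivity idea described above:
    # rank-transform inputs and output, fit a linear model, and compare the
    # standardized coefficients.  Synthetic data; not the MACCS study itself.

    rng = np.random.default_rng(0)
    n = 200

    # Three invented input variables and an output that depends mostly on x1 and x3
    x = rng.uniform(size=(n, 3))
    y = 5.0 * x[:, 0] + 0.5 * x[:, 1] + 3.0 * x[:, 2] ** 2 + rng.normal(scale=0.2, size=n)

    def ranks(a):
        """Return the rank transform of each column (1..n)."""
        return np.argsort(np.argsort(a, axis=0), axis=0) + 1

    xr = ranks(x)
    yr = ranks(y.reshape(-1, 1)).ravel()

    # Standardize the ranks and fit ordinary least squares
    xs = (xr - xr.mean(axis=0)) / xr.std(axis=0)
    ys = (yr - yr.mean()) / yr.std()
    coeffs, *_ = np.linalg.lstsq(xs, ys, rcond=None)

    for name, c in zip(["x1", "x2", "x3"], coeffs):
        print(f"standardized rank regression coefficient for {name}: {c:+.2f}")
    ```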

  14. Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents.

    SciTech Connect

    Majumdar, S.; Diercks, D. R.; Shack, W. J.; Energy Technology

    2002-05-01

    This report summarizes analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes and erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests on tubes with machined flaws. A three-dimensional computational fluid dynamics code was used to calculate the jet velocity impinging on neighboring tubes as a function of tube spacing and crack-opening area. Erosion tests were conducted in a high-temperature, high-velocity erosion rig at the University of Cincinnati, using micrometer-sized nickel particles mixed in with high-temperature gas from a burner. The erosion results, together with analytical models, were used to estimate the erosive effects of superheated steam with entrained aerosols from the core during severe accidents.

  15. Experimental assessment of computer codes used for safety analysis of integral reactors

    SciTech Connect

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B.

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for localization of LOCAs, and a passive RHRS through in-reactor HXs. These features defined the main trends in the experimental investigations and verification efforts for the computer codes applied. The paper briefly reviews the experimental investigations performed on the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. An assessment of the applicability of RELAP5/mod3 for accident analysis in integral reactors is presented.

  16. A Petaflops Era Computing Analysis

    NASA Technical Reports Server (NTRS)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10(exp 15) floating point operations per second) computing. This study was performed within the year 1996 and should be considered as the first step in an on-going effort. The analysis concludes that a petaflop system is technically feasible but not feasible with today's state-of-the-art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflop performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflop systems at about 2010. Several years before that date, it is projected that the resolution of chips will reach the now-known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflop systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  17. Computer analysis of cardiovascular parameters.

    PubMed

    Mass, H J; Gean, J T; Gwirtz, P A

    1987-01-01

    A computer program is described for the analysis of several cardiovascular parameters frequently measured or derived in the chronically instrumented dog model. Data are stored on magnetic tape and are subsequently analyzed with the Apple IIe microcomputer equipped with the ADALAB (Interactive Microware, Inc.) analog-to-digital convertor. Not limited to the chronically instrumented animal model, the program is capable of analyzing left ventricular pressure, three channels of regional myocardial segment length, coronary flow velocity as measured by the Doppler ultrasonic flow technique, and two channels of systemic arterial pressure. Derived data include: left ventricular dP/dtmax, left ventricular pressure-heart rate product, left ventricular ejection time, tension time index; percent segment length shortening and velocity of shortening, dL/dt(s)max, regional stroke work and power, duration of systole and diastole; mean coronary flow velocity, peak diastolic and systolic flow velocity, and true mean systemic arterial pressure. PMID:3581809
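
    Two of the derived quantities listed above, left ventricular dP/dt_max and the pressure-heart rate (rate-pressure) product, are straightforward to compute from a digitized pressure trace once the sampling rate is known. The sketch below shows those two calculations on a synthetic waveform; it illustrates the definitions only and is not the original Apple IIe/ADALAB program, and the sampling rate and waveform are assumptions.

    ```python
    import numpy as np

    # Two of the derived indices mentioned above, computed from a digitized left
    # ventricular pressure trace.  Synthetic waveform; this is an illustration of
    # the definitions, not the original analysis program.

    fs = 500.0                              # sampling rate, Hz (assumed)
    t = np.arange(0, 10, 1 / fs)            # 10 s record
    heart_rate_bpm = 90.0
    # Crude synthetic LV pressure: baseline 10 mmHg, systolic peaks near 130 mmHg
    lv_pressure = 10 + 120 * np.clip(np.sin(2 * np.pi * heart_rate_bpm / 60 * t), 0, None) ** 3

    # dP/dt_max: maximum time derivative of LV pressure (mmHg/s)
    dp_dt = np.gradient(lv_pressure, 1 / fs)
    dp_dt_max = dp_dt.max()

    # Rate-pressure product: peak systolic pressure x heart rate (mmHg * beats/min)
    rate_pressure_product = lv_pressure.max() * heart_rate_bpm

    print(f"dP/dt_max ~ {dp_dt_max:.0f} mmHg/s")
    print(f"pressure-heart rate product ~ {rate_pressure_product:.0f} mmHg*bpm")
    ```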

  18. Probabilistic analysis of accident precursors in the nuclear industry.

    PubMed

    Hulsmans, M; De Gelder, P

    2004-07-26

    Feedback of operating experience has always been an important issue in the nuclear industry. A probabilistic safety analysis (PSA) can be used as a tool to analyse how an operational event might have developed adversely in order to obtain a quantitative assessment of the safety significance of the event. This process is called PSA-based event analysis (PSAEA). A comprehensive set of PSAEA guidelines was developed by an international project. The main characteristics of this methodology are summarised. This approach to analyse incidents can be used to meet different objectives of utilities or nuclear regulators. The paper describes the main objectives and the experiences of the Belgian nuclear regulatory organisation AVN with the application of PSA-based event analysis. Some interesting aspects of the process of PSAEA are further developed and underlined. Several case studies are discussed and an overview of the obtained results is given. Finally, the interest of a broad and interactive forum on PSAEA is highlighted. PMID:15231351

  19. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 3: Appendixes C-H

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report contains the Appendices for the Analysis of Accident Sequences and Source Terms at Waste Treatment and Storage Facilities for Waste Generated by the U.S. Department of Energy Waste Management Operations. The main report documents the methodology, computational framework, and results of facility accident analyses performed as a part of the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  20. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Gregg L. Sharp; R. T. McCracken

    2003-06-01

    The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  1. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Sharp, G.L.; McCracken, R.T.

    2003-05-13

    The regulatory requirement to develop an upgraded safety basis for a DOE Nuclear Facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, ''Safety Basis Requirements,'' requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, ''Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants'' as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  2. Personal Computer Transport Analysis Program

    NASA Technical Reports Server (NTRS)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
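
    The inlet-dependency ordering described above is, in effect, a topological sort of the flow network followed by a fixed-order sweep at every time step. The sketch below is a hypothetical illustration of that idea, not PCTAP code; the component names and the toy update rule are invented for the example.

        from collections import deque

        # Hypothetical components and the components feeding their inlets
        inlets = {
            "pump":      [],
            "tube_1":    ["pump"],
            "coldplate": ["tube_1"],
            "tube_2":    ["coldplate"],
            "heat_exch": ["tube_2"],
        }

        def build_solution_vector(inlet_map):
            """Order components so each appears after the components feeding it."""
            deps = {c: set(d) for c, d in inlet_map.items()}
            ready = deque(c for c, d in deps.items() if not d)
            ordered = []
            while ready:
                comp = ready.popleft()
                ordered.append(comp)
                for other, d in deps.items():
                    if comp in d:
                        d.remove(comp)
                        if not d:
                            ready.append(other)
            return ordered

        solution_vector = build_solution_vector(inlets)

        # Time-march: update every component in dependency order at each step
        state = {c: 20.0 for c in solution_vector}          # e.g., outlet temperatures, C
        for step in range(3):
            for comp in solution_vector:
                upstream = inlets[comp]
                inlet_t = state[upstream[0]] if upstream else 25.0   # assumed source
                state[comp] = 0.9 * state[comp] + 0.1 * inlet_t      # toy transient update

        print(solution_vector)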

  3. Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.

    PubMed

    Doytchev, Doytchin E; Szwillus, Gerd

    2009-11-01

    Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand the human behaviour in incident occurrence we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform the tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the use of the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian Hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the afore-mentioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation. PMID:19819365
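
    On the FTA side, such an analysis ultimately reduces to propagating basic-event probabilities through AND/OR gates up to the top event. The sketch below is a hypothetical miniature fault tree for a basement-flooding top event, with invented event names and probabilities; it is not taken from the case study.

        # Hypothetical basic-event probabilities (per demand); illustrative only
        p = {
            "drain_blocked":         0.05,
            "sump_pump_fails":       0.02,
            "operator_misses_alarm": 0.10,
            "level_sensor_fails":    0.01,
        }

        def p_and(*probs):
            """AND gate: all inputs occur (independence assumed)."""
            out = 1.0
            for x in probs:
                out *= x
            return out

        def p_or(*probs):
            """OR gate: at least one input occurs (independence assumed)."""
            out = 1.0
            for x in probs:
                out *= (1.0 - x)
            return 1.0 - out

        # Flooding requires blocked drainage AND loss of the pumping function;
        # the pumping function is lost if the pump fails OR it is never started
        # (operator misses the alarm AND the level sensor fails).
        no_start = p_and(p["operator_misses_alarm"], p["level_sensor_fails"])
        pump_function_lost = p_or(p["sump_pump_fails"], no_start)
        top_event = p_and(p["drain_blocked"], pump_function_lost)
        print(f"P(basement flooding) ~ {top_event:.2e}")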

  4. Reconstruction of the 1994 Pittsburgh Airplane Accident Using a Computer Simulation

    NASA Technical Reports Server (NTRS)

    Parks, Edwin K.; Bach, Ralph E., Jr.; Shin, Jae Ho

    1998-01-01

    On September 8, 1994, a Boeing 737-300 passenger airplane was on a downwind approach to the Pittsburgh International Airport at an altitude of 5000 feet above ground level (6000 feet MSL). While in a shallow left turn onto a downwind approach heading, the airplane crossed into the vortex trail of a Boeing 727 flying in the same approach pattern about 4 miles ahead. The B-737 airplane rolled and turned sharply to the left, exited the vortex wake and plunged into the ground. Weather was not a factor in the accident. The airplane was equipped with an 11+ channel digital Flight Data Recorder (FDR) and a multiple channel Cockpit Voice Recorder (CVR). Both recorders were recovered from the crash site and provided excellent data for the development of an accident scenario. Radar tracking of the two airplanes as well as the indicated air speed (IAS) perturbations clearly visible on the B-737 FDR recordings indicates that the upset was apparently initiated by the airplane's crossing into the wake of the B-727 flying ahead in the same traffic pattern. A 6 degree-of-freedom simulation program for the B-737 airplane using MATLAB and SIMULINK was constructed. The simulation was initialized at the stabilized flight conditions of the airplane about 13 seconds prior to its entry into the vortex trail of the B-727 airplane. By assuming a certain combination of control inputs, it was possible to produce a simulated motion that closely matched that recorded on the FDR.
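
    A full six-degree-of-freedom reconstruction is beyond a short example, but the basic idea of driving a simulated motion with an assumed external disturbance can be shown on a single (roll) axis. The sketch below uses invented inertia, damping and vortex-moment values, not B-737 data, and is only a schematic of the approach.

        import numpy as np

        # Illustrative single-axis (roll) parameters -- not actual B-737 values
        Ixx = 1.5e6          # roll moment of inertia, slug-ft^2 (assumed)
        L_p = -2.0e6         # roll damping, ft-lb per rad/s (assumed)

        def vortex_moment(time_s):
            """Assumed wake-vortex rolling moment: a short left-rolling pulse."""
            return -3.0e6 if 2.0 < time_s < 4.0 else 0.0   # ft-lb

        dt = 0.01
        p = 0.0      # roll rate, rad/s
        phi = 0.0    # bank angle, rad
        bank_history = []
        for ti in np.arange(0.0, 10.0, dt):
            p_dot = (vortex_moment(ti) + L_p * p) / Ixx   # simplified roll equation
            p += p_dot * dt
            phi += p * dt
            bank_history.append(np.degrees(phi))

        print(f"peak bank angle ~ {min(bank_history):.0f} deg (left wing down)")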

  5. Coupled thermal analysis applied to the study of the rod ejection accident

    SciTech Connect

    Gonnet, M.

    2012-07-01

    An advanced methodology for the assessment of fuel-rod thermal margins under RIA conditions has been developed by AREVA NP SAS. With the emergence of RIA analytical criteria, the study of the Rod Ejection Accident (REA) would normally require the analysis of each fuel rod, slice by slice, over the whole core. Up to now, the strategy used to overcome this difficulty has been to perform separate analyses of sampled fuel pins with conservative hypotheses for thermal properties and boundary conditions. In the advanced methodology, the evaluation model for the Rod Ejection Accident (REA) integrates the node average fuel and coolant properties calculation for neutron feedback purposes as well as the peak fuel and coolant time-dependent properties for criteria checking. The calculation grid for peak fuel and coolant properties can be specified from the assembly pitch down to the cell pitch. The comparative analysis of methodologies shows that the coupled methodology reduces the excessive conservatism of the uncoupled approach. (authors)

  6. Analysis of the FeCrAl Accident Tolerant Fuel Concept Benefits during BWR Station Blackout Accidents

    SciTech Connect

    Robb, Kevin R

    2015-01-01

    Iron-chromium-aluminum (FeCrAl) alloys are being considered for fuel concepts with enhanced accident tolerance. FeCrAl alloys have very slow oxidation kinetics and good strength at high temperatures. FeCrAl could be used for fuel cladding in light water reactors and/or as channel box material in boiling water reactors (BWRs). To estimate the potential safety gains afforded by the FeCrAl concept, the MELCOR code was used to analyze a range of postulated station blackout severe accident scenarios in a BWR/4 reactor employing FeCrAl. The simulations utilize the most recently known thermophysical properties and oxidation kinetics for FeCrAl. Overall, when compared to the traditional Zircaloy-based cladding and channel box, the FeCrAl concept provides a few extra hours of time for operators to take mitigating actions and/or for evacuations to take place. A coolable core geometry is retained longer, enhancing the ability to stabilize an accident. Finally, due to the slower oxidation kinetics, substantially less hydrogen is generated, and the generation is delayed in time. This decreases the amount of non-condensable gases in containment and the potential for deflagrations to inhibit the accident response.

  7. Computer graphics in aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Cozzolongo, J. V.

    1984-01-01

    The use of computer graphics and its application to aerodynamic analyses on a routine basis is outlined. The mathematical modelling of the aircraft geometries and the shading technique implemented are discussed. Examples of computer graphics used to display aerodynamic flow field data and aircraft geometries are shown. A future need in computer graphics for aerodynamic analyses is addressed.

  8. Human and organisational factors in maritime accidents: analysis of collisions at sea using the HFACS.

    PubMed

    Chauvin, Christine; Lardjane, Salim; Morel, Gaël; Clostermann, Jean-Pierre; Langard, Benoît

    2013-10-01

    Over the last decade, the shipping industry has implemented a number of measures aimed at improving its safety level (such as new regulations or new forms of team training). Despite this evolution, shipping accidents, and particularly collisions, remain a major concern. This paper presents a modified version of the Human Factors Analysis and Classification System, which has been adapted to the maritime context and used to analyse human and organisational factors in collisions reported by the Marine Accident and Investigation Branch (UK) and the Transportation Safety Board (Canada). The analysis shows that most collisions are due to decision errors. At the precondition level, it highlights the importance of the following factors: poor visibility and misuse of instruments (environmental factors), loss of situation awareness or deficit of attention (conditions of operators), deficits in inter-ship communications or Bridge Resource Management (personnel factors). At the leadership level, the analysis reveals the frequent planning of inappropriate operations and non-compliance with the Safety Management System (SMS). The Multiple Accident Analysis provides an important finding concerning three classes of accidents. Inter-ship communications problems and Bridge Resource Management deficiencies are closely linked to collisions occurring in restricted waters and involving pilot-carrying vessels. Another class of collisions is associated with situations of poor visibility, in open sea, and shows deficiencies at every level of the socio-technical system (technical environment, condition of operators, leadership level, and organisational level). The third class is characterised by non-compliance with the SMS. This study shows the importance of Bridge Resource Management for situations of navigation with a pilot on board in restricted waters. It also points out the necessity to investigate, for situations of navigation in open sea, the masters' decisions in critical conditions

  9. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.
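
    A minimal sketch of the sampling-and-ranking idea behind such a study is shown below, using Latin hypercube sampling and Spearman rank correlations on a toy consequence function; the three input ranges and the toy model are invented for illustration and are not MACCS variables.

        import numpy as np
        from scipy.stats import qmc, spearmanr

        n_samples = 200
        sampler = qmc.LatinHypercube(d=3, seed=42)
        u = sampler.random(n_samples)              # uniform samples in [0, 1)^3
        lower = np.array([0.1, 1.0e-3, 0.2])       # assumed lower bounds
        upper = np.array([1.0, 1.0e-2, 0.8])       # assumed upper bounds
        x = qmc.scale(u, lower, upper)             # dispersion, deposition, shielding

        # Toy consequence model standing in for a full MACCS calculation
        dose = 50.0 / x[:, 0] * (1.0 + 100.0 * x[:, 1]) * x[:, 2]

        # Spearman rank correlation as a simple importance measure
        names = ["dispersion_scale", "dry_dep_velocity", "shielding_factor"]
        for i, name in enumerate(names):
            rho, _ = spearmanr(x[:, i], dose)
            print(f"{name:18s} rank correlation with dose: {rho:+.2f}")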

  10. An association between dietary habits and traffic accidents in patients with chronic liver disease: A data-mining analysis

    PubMed Central

    KAWAGUCHI, TAKUMI; SUETSUGU, TAKURO; OGATA, SHYOU; IMANAGA, MINAMI; ISHII, KUMIKO; ESAKI, NAO; SUGIMOTO, MASAKO; OTSUYAMA, JYURI; NAGAMATSU, AYU; TANIGUCHI, EITARO; ITOU, MINORU; ORIISHI, TETSUHARU; IWASAKI, SHOKO; MIURA, HIROKO; TORIMURA, TAKUJI

    2016-01-01

    The incidence of traffic accidents in patients with chronic liver disease (CLD) is high in the USA. However, the characteristics of patients, including dietary habits, differ between Japan and the USA. The present study investigated the incidence of traffic accidents in CLD patients and the clinical profiles associated with traffic accidents in Japan using a data-mining analysis. A cross-sectional study was performed and 256 subjects [148 CLD patients (CLD group) and 106 patients with other digestive diseases (disease control group)] were enrolled; 2 patients were excluded. The incidence of traffic accidents was compared between the two groups. Independent factors for traffic accidents were analyzed using logistic regression and decision-tree analyses. The incidence of traffic accidents did not differ between the CLD and disease control groups (8.8 vs. 11.3%). The results of the logistic regression analysis showed that yoghurt consumption was the only independent risk factor for traffic accidents (odds ratio, 0.37; 95% confidence interval, 0.16–0.85; P=0.0197). Similarly, the results of the decision-tree analysis showed that yoghurt consumption was the initial divergence variable. In patients who consumed yoghurt habitually, the incidence of traffic accidents was 6.6%, while that in patients who did not consume yoghurt was 16.0%. CLD was not identified as an independent factor in the logistic regression and decision-tree analyses. In conclusion, the difference in the incidence of traffic accidents in Japan between the CLD and disease control groups was insignificant. Furthermore, yoghurt consumption was an independent negative risk factor for traffic accidents in patients with digestive diseases, including CLD. PMID:27123257
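
    The reported odds ratio can be illustrated with a simple 2x2 calculation; the counts below are hypothetical, chosen only so the incidences roughly match the 6.6% and 16.0% quoted above, and do not reproduce the study's multivariable analysis.

        import math

        # Hypothetical 2x2 table: traffic accident (yes/no) vs. habitual yoghurt (yes/no)
        a = 10    # yoghurt consumers with a traffic accident
        b = 142   # yoghurt consumers without an accident
        c = 16    # non-consumers with a traffic accident
        d = 84    # non-consumers without an accident

        odds_ratio = (a / b) / (c / d)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
        ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
        ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

        print(f"incidence (yoghurt)    = {a / (a + b):.1%}")
        print(f"incidence (no yoghurt) = {c / (c + d):.1%}")
        print(f"crude odds ratio = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")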

  11. Resolve! Version 2.5: Flammable Gas Accident Analysis Tool Acceptance Test Plan and Test Results

    SciTech Connect

    LAVENDER, J.C.

    2000-10-17

    RESOLVE! Version 2.5 is designed to quantify the risk and uncertainty of combustion accidents in double-shell tanks (DSTs) and single-shell tanks (SSTs). The purpose of the acceptance testing is to ensure that all of the options and features of the computer code run; to verify that the calculated results are consistent with each other; and to evaluate the effects of the changes to the parameter values on the frequency and consequence trends associated with flammable gas deflagrations or detonations.

  12. Space Shuttle Columbia Post-Accident Analysis and Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

    Although the loss of the Space Shuttle Columbia and its crew was tragic, the circumstances offered a unique opportunity to examine a multitude of components which had experienced one of the harshest environments ever encountered by engineered materials: a breakup at a velocity in excess of Mach 18 and an altitude exceeding 200,000 feet (63 km), resulting in a debris field 645 miles/1,038 km long and 10 miles/16 km wide. Various analytical tools were employed to ascertain the sequence of events leading to the disintegration of the Orbiter and to characterize the features of the debris. The testing and analyses all indicated that a breach in a left wing reinforced carbon/carbon composite leading edge panel was the access point for hot gases generated during re-entry to penetrate the structure of the vehicle and compromise the integrity of the materials and components in that area of the Shuttle. The analytical and elemental testing utilized such techniques as X-Ray Diffraction (XRD), Energy Dispersive X-Ray (EDX) dot mapping, Electron Micro Probe Analysis (EMPA), and X-Ray Photoelectron Spectroscopy (XPS) to characterize the deposition of intermetallics adjacent to the suspected location of the plasma breach in the leading edge of the left wing, Fig. 1.

  13. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  14. Grid computing in image analysis

    PubMed Central

    2011-01-01

    Diagnostic surgical pathology or tissue–based diagnosis still remains the most reliable and specific diagnostic medical procedure. The development of whole slide scanners permits the creation of virtual slides and work on so-called virtual microscopes. In addition to interactive work on virtual slides, approaches have been reported that introduce automated virtual microscopy, which is composed of several tools focusing on quite different tasks. These include evaluation of image quality and image standardization, analysis of potentially useful thresholds for object detection and identification (segmentation), dynamic segmentation procedures, adjustable magnification to optimize feature extraction, and texture analysis including image transformation and evaluation of elementary primitives. Grid technology seems to possess all features to efficiently target and control the specific tasks of image information and detection in order to obtain a detailed and accurate diagnosis. Grid technology is based upon so-called nodes that are linked together and share certain communication rules in using open standards. Their number and functionality can vary according to the needs of a specific user at a given point in time. When implementing automated virtual microscopy with Grid technology, all of the five different Grid functions have to be taken into account, namely 1) computation services, 2) data services, 3) application services, 4) information services, and 5) knowledge services. Although all mandatory tools of automated virtual microscopy can be implemented in a closed or standardized open system, Grid technology offers a new dimension to acquire, detect, classify, and distribute medical image information, and to assure quality in tissue–based diagnosis. PMID:21516880

  15. Speech analysis as an index of alcohol intoxication--the Exxon Valdez accident.

    PubMed

    Brenner, M; Cash, J R

    1991-09-01

    As part of its investigation of the EXXON VALDEZ tankship accident and oil spill, the National Transportation Safety Board (NTSB) examined the master's speech for alcohol-related effects. Recorded speech samples were obtained from marine radio communications tapes. The samples were tested for four effects associated with alcohol consumption in the available scientific literature: slowed speech, speech errors, misarticulation of difficult sounds ("slurring"), and audible changes in speech quality. It was found that speech immediately before and after the accident displayed large changes of the sort associated with alcohol consumption. These changes were not readily explained by fatigue, psychological stress, drug effects, or medical problems. Speech analysis appears to be a useful technique to provide secondary evidence of alcohol impairment. PMID:1930083

  16. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  17. Thermodynamic analysis of cesium and iodine behavior in severe light water reactor accidents

    NASA Astrophysics Data System (ADS)

    Minato, Kazuo

    1991-11-01

    In order to understand the release and transport behavior of cesium (Cs) and iodine (I) in severe light water reactor accidents, chemical forms of Cs and I in steam-hydrogen mixtures were analyzed thermodynamically. In the calculations reactions of boron (B) with Cs were taken into consideration. The analysis showed that B plays an important role in determining chemical forms of Cs. The main Cs-containing species are CsBO 2(g) and CsBO 2(l), depending on temperature. The contribution of CsOH(g) is minor. The main I-containing species are HI(g) and CsI(g) over the wide ranges of the parameters considered. Calculations were also carried out under the conditions of the Three Mile Island Unit 2 accident.

  18. Preliminary analysis of graphite dust releasing behavior in accident for HTR

    SciTech Connect

    Peng, W.; Yang, X. Y.; Yu, S. Y.; Wang, J.

    2012-07-01

    The behavior of graphite dust is important to the safety of High Temperature Gas-cooled Reactors. This study investigated the flow of graphite dust in the helium mainstream. Analysis of the forces acting on the graphite dust indicated that gas drag plays the dominant role. Based on this understanding, an experimental system was set up to study dust releasing behavior in accidents. Air driven by a centrifugal fan is used as the working fluid instead of helium, because helium is expensive and prone to leakage, which makes the loop difficult to seal. Graphite particles with the same size distribution as in the HTR are added to the experimental loop. The graphite dust releasing behavior during a loss-of-coolant accident will be investigated using a sonic nozzle. (authors)
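
    The dominance of gas drag can be checked with an order-of-magnitude comparison of Stokes drag and particle weight for a micron-scale graphite particle. The gas properties and slip velocity in the sketch below are assumed round numbers, not values from the experiment.

        import math

        # Illustrative values (assumed), SI units
        d_p = 1.0e-6        # particle diameter, m (~1 micron graphite dust)
        rho_p = 1.7e3       # graphite particle density, kg/m^3
        mu_gas = 2.0e-5     # gas dynamic viscosity, Pa*s (order of magnitude)
        v_rel = 1.0         # assumed gas-particle slip velocity, m/s
        g = 9.81

        volume = math.pi / 6.0 * d_p**3
        weight = rho_p * volume * g                           # gravitational force, N
        stokes_drag = 3.0 * math.pi * mu_gas * d_p * v_rel    # Stokes drag, N

        print(f"weight        ~ {weight:.2e} N")
        print(f"Stokes drag   ~ {stokes_drag:.2e} N")
        print(f"drag / weight ~ {stokes_drag / weight:.1e}")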

  19. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  20. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.; Grupa, J.B.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  1. Safety culture and accident analysis--a socio-management approach based on organizational safety social capital.

    PubMed

    Rao, Suman

    2007-04-11

    One of the biggest challenges for organizations in today's competitive business environment is to create and preserve a self-sustaining safety culture. Typically, the key drivers of safety culture in many organizations are regulation, audits, safety training, various types of employee exhortations to comply with safety norms, etc. However, less evident factors like networking relationships and social trust amongst employees, as also extended networking relationships and social trust of organizations with external stakeholders like government, suppliers, regulators, etc., which constitute the safety social capital in the Organization--seem to also influence the sustenance of organizational safety culture. Can erosion in safety social capital cause deterioration in safety culture and contribute to accidents? If so, how does it contribute? As existing accident analysis models do not provide answers to these questions, CAMSoC (Curtailing Accidents by Managing Social Capital), an accident analysis model, is proposed. As an illustration, five accidents: Bhopal (India), Hyatt Regency (USA), Tenerife (Canary Islands), Westray (Canada) and Exxon Valdez (USA) have been analyzed using CAMSoC. This limited cross-industry analysis provides two key socio-management insights: the biggest source of motivation that causes deviant behavior leading to accidents is 'Faulty Value Systems'. The second biggest source is 'Enforceable Trust'. From a management control perspective, deterioration in safety culture and resultant accidents is more due to the 'action controls' rather than explicit 'cultural controls'. Future research directions to enhance the model's utility through layering are addressed briefly. PMID:16911855

  2. Analysis of pedestrian accident costs in Sudan using the willingness-to-pay method.

    PubMed

    Mofadal, Adam I A; Kanitpong, Kunnawee; Jiwattanakulpaisarn, Piyapong

    2015-05-01

    The willingness-to-pay (WTP) with contingent valuation (CV) method has been proven to be a valid tool for the valuation of non-market goods or socio-economic costs of road traffic accidents among communities in developed and developing countries. Research on accident costing tends to estimate the value of statistical life (VOSL) for all road users by providing a principle for the evaluation of road safety interventions in cost-benefit analysis. As in many other developing countries, the economic loss of traffic accidents in Sudan is noticeable; however, analytical research to estimate the magnitude and impact of that loss is lacking. Reports have shown that pedestrians account for more than 40% of the total number of fatalities. In this study, the WTP-CV approach was used to determine the amount of money that pedestrians in Sudan are willing to pay to reduce the risk of their own death. The impact of the socioeconomic factors, risk levels, and walking behaviors of pedestrians on their WTP for fatality risk reduction was also evaluated. Data were collected from two cities-Khartoum and Nyala-using a survey questionnaire that included 1400 respondents. The WTP-CV Payment Card Questionnaire was designed to ensure that Sudan pedestrians can easily determine the amount of money that would be required to reduce the fatality risk from a pedestrian-related accident. The analysis results show that the estimated VOSL for Sudanese pedestrians ranges from US$0.019 to US$0.101 million. In addition, the willingness-to-pay by Sudanese pedestrians to reduce their fatality risk tends to increase with age, household income, educational level, safety perception, and average time spent on social activities with family and community. PMID:25794921
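
    The link between stated WTP and the VOSL is a simple ratio: the mean amount respondents are willing to pay divided by the fatality-risk reduction it buys. The sketch below uses made-up payment-card responses and an assumed risk reduction, not the Sudan survey data; with these numbers the result happens to fall inside the reported US$0.019-0.101 million range.

        # Hypothetical payment-card responses (US$ per year) for a stated
        # reduction in annual fatality risk; values are illustrative only.
        wtp_responses_usd = [5, 8, 10, 12, 15, 20, 25, 30, 40, 60]
        risk_reduction = 5.0e-4          # assumed reduction in annual fatality risk

        mean_wtp = sum(wtp_responses_usd) / len(wtp_responses_usd)
        vosl = mean_wtp / risk_reduction

        print(f"mean WTP                  = US${mean_wtp:.2f} per year")
        print(f"value of statistical life ~ US${vosl:,.0f}")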

  3. Computer-Based Linguistic Analysis.

    ERIC Educational Resources Information Center

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  4. Biomolecular dynamics by computer analysis

    SciTech Connect

    Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.

    1984-01-01

    As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well developed study of the hydrogen bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.

  5. Calculation notes in support of TWRS FSAR spray leak accident analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document contains the detailed calculations that support the spray leak accident analysis in the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The consequence analyses in this document form the basis for the selection of controls to mitigate or prevent spray leaks throughout TWRS. Pressurized spray leaks can occur due to a breach in containment barriers along transfer routes, during waste transfers. Spray leaks are of particular safety concern because, depending on leak dimensions and waste pressure, they can be relatively efficient generators of dispersible sized aerosols that can transport downwind to onsite and offsite receptors. Waste is transferred between storage tanks and between processing facilities and storage tanks in TWRS through a system of buried transfer lines. Pumps for transferring waste and jumpers and valves for rerouting waste are located inside below grade pits and structures that are normally covered. Pressurized spray leaks can emanate to the atmosphere due to breaches in waste transfer associated equipment inside these structures should the structures be uncovered at the time of the leak. Pressurized spray leaks can develop through holes or cracks in transfer piping, valve bodies or pump casings caused by such mechanisms as corrosion, erosion, thermal stress, or water hammer. Leaks through degraded valve packing, jumper gaskets, or pump seals can also result in pressurized spray releases. Mechanisms that can degrade seals, packing and gaskets include aging, radiation hardening, thermal stress, etc. Another common cause of spray leaks inside transfer enclosures is misaligned jumpers caused by human error. A spray leak inside a DST valve pit during a transfer of aging waste was selected as the bounding, representative accident for detailed analysis. Sections 2 through 5 below develop this representative accident using the DOE-STD-3009 format. Section 2 describes the unmitigated and mitigated accident

  6. Analysis 320 coal mine accidents using structural equation modeling with unsafe conditions of the rules and regulations as exogenous variables.

    PubMed

    Zhang, Yingyu; Shao, Wei; Zhang, Mengjia; Li, Hejun; Yin, Shijiu; Xu, Yingjun

    2016-07-01

    Mining has historically been considered a naturally high-risk industry worldwide. Deaths caused by coal mine accidents exceed the combined deaths from all other types of accidents in China. Statistics of 320 coal mine accidents in Shandong province show that all accidents contain indicators of "unsafe conditions of the rules and regulations" with a frequency of 1590, accounting for 74.3% of the total frequency of 2140. "Unsafe behaviors of the operator" is another important contributory factor, which mainly includes "operator error" and "venturing into dangerous places." A systems analysis approach was applied by using structural equation modeling (SEM) to examine the interactions between the contributory factors of coal mine accidents. The analysis of results leads to three conclusions. (i) "Unsafe conditions of the rules and regulations" affect the "unsafe behaviors of the operator," "unsafe conditions of the equipment," and "unsafe conditions of the environment." (ii) The three influencing factors of coal mine accidents (with the frequency of effect relation in descending order) are "lack of safety education and training," "rules and regulations of safety production responsibility," and "rules and regulations of supervision and inspection." (iii) The three influenced factors (with the frequency in descending order) of coal mine accidents are "venturing into dangerous places," "poor workplace environment," and "operator error." PMID:27085591

  7. Analysis on the Density Driven Air-Ingress Accident in VHTRs

    SciTech Connect

    Eung Soo Kim; Chang Oh; Richard Schultz; David Petti

    2008-11-01

    Air ingress following a pipe rupture is considered the most serious accident in VHTRs because of potential consequences such as core heat-up, loss of structural integrity, and toxic gas release. Previously, it was believed that the main air-ingress mechanism of this accident is the molecular diffusion process between the reactor core and the cavity. However, according to some recent studies, there is another, faster air-ingress process that has not been considered before: density-driven stratified flow. The potential for density-driven stratified air ingress into the VHTR following a large-break LOCA was first described in the NGNP Methods Technical Program, based on stratified flow studies performed with liquid. Studies on density-gradient driven stratified flow in advanced reactor systems have been the subject of active research for well over a decade, since density-gradient dominated stratified flow is an inherent characteristic of passive systems used in advanced reactors. Recently, Oh et al. performed a CFD analysis of the stratified flow in the VHTR and showed that this effect can significantly accelerate the air-ingress process. They also proposed replacing the original air-ingress scenario based on molecular diffusion with one based on stratified flow. This paper focuses on the effect of stratified flow on the results of the air-ingress accident in the VHTR.

  8. Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)

    SciTech Connect

    Johnson, E.W.

    1988-10-01

    The purposes of this volume of the LWRHU SAR, the Accident Model Document (AMD), are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; Provide estimates of occurrence probabilities associated with these various accidents; Evaluate the response of the LWRHU (or its components) to the resultant accident environments; and Associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  9. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    SciTech Connect

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  10. Analysis of National Major Work Safety Accidents in China, 2003–2012

    PubMed Central

    YE, Yunfeng; ZHANG, Siheng; RAO, Jiaming; WANG, Haiqing; LI, Yang; WANG, Shengyong; DONG, Xiaomei

    2016-01-01

    Background: This study provides a national profile of major work safety accidents in China, defined as those causing more than 10 fatalities per accident, and is intended to provide a scientific basis for prevention measures and strategies to reduce major work safety accidents and deaths. Methods: Data from the 2003–2012 census of major work safety accidents were collected from the State Administration of Work Safety System (SAWS). Published literature and statistical yearbooks were also reviewed to supplement the information. We analyzed the frequency of accidents and deaths, trends, geographic distribution and injury types. Additionally, we discussed the severity and urgency of emergency rescue by type of accident. Results: A total of 877 major work safety accidents were reported, resulting in 16,795 deaths and 9,183 injuries. The numbers of accidents and deaths, mortality rate and incidence of major accidents have declined in recent years. The mortality rate and incidence were 0.71 and 1.20 per 10^6 population in 2012, respectively. Transportation and mining contributed the highest numbers of major accidents and deaths. Major aviation and railway accidents caused more casualties per incident, while collapse, machinery, electrical shock and tailing dam accidents were the most severe situations, resulting in a larger proportion of deaths. Conclusion: Ten years of major work safety accident data indicate that the frequency of accidents and the number of deaths have declined, although several safety concerns persist in some segments. PMID:27057515

  11. Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident]

    SciTech Connect

    Aldrich, D C; Blond, R M

    1980-01-01

    An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.

  12. Little impact of tsunami-stricken nuclear accident on awareness of radiation dose of cardiac computed tomography: A questionnaire study

    PubMed Central

    2013-01-01

    Background With the increased use of cardiac computed tomography (CT), radiation dose remains a major issue, although physicians are trying to reduce the substantial risks associated with use of this diagnostic tool. This study was performed to investigate recognition of the level of radiation exposure from cardiac CT and the differences in the level of awareness of radiation before and after the Fukushima nuclear plant accident. Methods We asked 30 physicians who were undergoing training in internal medicine to determine the equivalent doses of radiation for common radiological examinations when a normal chest X-ray is accepted as one unit; questions about the absolute radiation dose of cardiac CT data were also asked. Results According to the results, 86.6% of respondents believed the exposure to be 1 mSv at most, and 93.3% thought that the exposure was less than that of 100 chest X-rays. This finding indicates that their perceptions were far lower than the actual amounts. Even after the occurrence of such a large nuclear disaster in Fukushima, there were no significant differences in the same subjects’ overall awareness of radiation amounts. Conclusions Even after such a major social issue as the Fukushima nuclear accident, the level of awareness of the accurate radiation amount used in 64-channel multidetector CT (MDCT) by clinical physicians who order this test was not satisfactory. Thus, there is a need for the development of effective continuing education programs to improve awareness of radiation from ionizing radiation devices, including cardiac CT, and emphasis on risk-benefit evaluation based on accurate knowledge during medical training. PMID:23631688

  13. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  14. Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document

    SciTech Connect

    Johnson, E.W.

    1985-10-01

    The purposes of this volume (AMD) are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  15. A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2007-01-01

    In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports, and compares the method of analysis proposed therein to the methods used today.

  16. Analysis of accident sequences and source terms at waste treatment and storage facilities for waste generated by U.S. Department of Energy Waste Management Operations, Volume 1: Sections 1-9

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.

    1995-04-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the U.S. Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies are assessed, and the resultant radiological and chemical source terms are evaluated. A personal computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for calculation of human health risk impacts. The methodology is in compliance with the most recent guidance from DOE. It considers the spectrum of accident sequences that could occur in activities covered by the WM PEIS and uses a graded approach emphasizing the risk-dominant scenarios to facilitate discrimination among the various WM PEIS alternatives. Although it allows reasonable estimates of the risk impacts associated with each alternative, the main goal of the accident analysis methodology is to allow reliable estimates of the relative risks among the alternatives. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also provide discussion of specific accident analysis data and guidance used or consulted in this report.

  17. Hypothetical accident condition thermal analysis and testing of a Type B drum package

    SciTech Connect

    Hensel, S.J.; Alstine, M.N. Van; Gromada, R.J.

    1995-07-01

    A thermophysical property model developed to analytically determine the thermal response of cane fiberboard when exposed to temperatures and heat fluxes associated with the 10 CFR 71 hypothetical accident condition (HAC) has been benchmarked against two Type B drum package fire test results. The model 9973 package was fire tested after a 30 ft. top down drop and puncture, and an undamaged model 9975 package containing a heater (21W) was fire tested to determine content heat source effects. Analysis results using a refined version of a previously developed HAC fiberboard model compared well against the test data from both the 9973 and 9975 packages.
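
    The benchmarking above rests on a transient conduction calculation through the fiberboard overpack. The sketch below is a drastically simplified 1-D explicit finite-difference version of that kind of calculation, with assumed fiberboard properties, a fixed 800 C fire-side boundary and an adiabatic inner face; it is not the benchmarked model.

        import numpy as np

        # Assumed cane-fiberboard properties (illustrative only)
        k = 0.06        # thermal conductivity, W/m-K
        rho = 240.0     # density, kg/m^3
        cp = 1500.0     # specific heat, J/kg-K
        alpha = k / (rho * cp)

        L = 0.10                       # fiberboard thickness, m
        nx = 51
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / alpha       # below the explicit stability limit
        t_end = 30.0 * 60.0            # 30-minute hypothetical fire, s

        T = np.full(nx, 38.0)          # initial temperature, C
        T[0] = 800.0                   # fire-side boundary temperature, C

        t = 0.0
        while t < t_end:
            T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
            T[-1] = T[-2]              # adiabatic inner face (simplification)
            t += dt

        print(f"inner-face temperature after 30 min ~ {T[-1]:.0f} C")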

  18. Accident sequence analysis for a BWR (Boiling Water Reactor) during low power and shutdown operations

    SciTech Connect

    Whitehead, D.W.; Hake, T.M.

    1990-01-01

    Most previous Probabilistic Risk Assessments have excluded consideration of accidents initiated in low power and shutdown modes of operation. A study of the risk associated with operation in low power and shutdown is being performed at Sandia National Laboratories for a US Boiling Water Reactor (BWR). This paper describes the proposed methodology for the analysis of the risk associated with the operation of a BWR during low power and shutdown modes and presents preliminary information resulting from the application of the methodology. 2 refs., 2 tabs.

  19. Modeling of BWR core meltdown accidents - for application in the MELRPI. MOD2 computer code

    SciTech Connect

    Koh, B R; Kim, S H; Taleyarkhan, R P; Podowski, M Z; Lahey, Jr, R T

    1985-04-01

    This report summarizes improvements and modifications made in the MELRPI computer code. A major difference between this new, updated version of the code, called MELRPI.MOD2, and the one reported previously, concerns the inclusion of a model for the BWR emergency core cooling systems (ECCS). This model and its computer implementation, the ECCRPI subroutine, account for various emergency injection modes, for both intact and rubblized geometries. Other changes to MELRPI deal with an improved model for canister wall oxidation, rubble bed modeling, and numerical integration of system equations. A complete documentation of the entire MELRPI.MOD2 code is also given, including an input guide, list of subroutines, sample input/output and program listing.

  20. Creation of Computational Benchmarks for LEU and MOX Fuel Assemblies Under Accident Conditions

    SciTech Connect

    Pavlovitchev, A M; Kalashnikov, A G; Kalugin, M A; Lazarenko, A P; Maiorov, L V; Sidorenko, V D

    1999-11-01

    The results of VVER-1000 computational benchmark calculations obtained with various Russian codes (such as MCU-RFFI/A, TVS-M and WIMS-ABBN) are presented. The list of benchmarks includes LEU and MOX cells with fresh and spent fuel under various conditions (for calculation of kinetic parameters, the Doppler coefficient, and the reactivity effect of decreasing the water density). The calculation results are compared with each other and the results of this comparison are discussed.

  1. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    SciTech Connect

    Pasichnyk, I.; Perin, Y.; Velkov, K.

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

  2. Analysis of Maximum Reasonably Foreseeable Accidents for the Yucca Mountain Draft Environmental Impact Statement (DEIS)

    SciTech Connect

    S.B. Ross; R.E. Best; S.J. Maheras; T.I. McSweeney

    2001-08-17

    Accidents could occur during the transportation of spent nuclear fuel and high-level radioactive waste. This paper describes the risks and consequences to the public from accidents that are highly unlikely but that could have severe consequences. The impact of these accidents would include those to a collective population and to hypothetical maximally exposed individuals (MEIs). This document discusses accidents with conditions that have a chance of occurring more often than 1 in 10 million times in a year, called "maximum reasonably foreseeable accidents". Accidents and conditions less likely than this are not considered to be reasonably foreseeable.

  3. A systemic approach to accident analysis: a case study of the Stockwell shooting.

    PubMed

    Jenkins, Daniel P; Salmon, Paul M; Stanton, Neville A; Walker, Guy H

    2010-01-01

    This paper uses a systemic approach to accident investigation, based upon AcciMaps, to model the events leading up to the shooting of Jean Charles de Menezes at Stockwell Underground station in July 2005. The model captures many of the findings of the Independent Police Complaints Commission's report in a single representation, modelling their interdependencies and the causal flow. Furthermore, by taking a systemic approach, the analysis identifies further considerations related to the suitability of the Metropolitan Police Service's organisational structure to support rapid-paced operations, where reliable identification of a suspect is not possible. Based upon the analysis, the paper questions the division of functions between teams and the suitability of an organisational structure that relies upon the complex flow of information between separate teams for surveillance and for controlling the suspect. A dynamic organisational structure is proposed that changes in response to operation type and unfolding events. STATEMENT OF RELEVANCE: This paper provides much needed and called for validation for a systemic approach to accident analysis. A widely reported case study is used to illustrate the process. The paper shows how such an approach can consolidate the key findings of much larger reports as well as draw out additional recommendations. PMID:20069477

  4. Computer applications for engineering/structural analysis

    NASA Astrophysics Data System (ADS)

    Zaslawsky, M.; Samaddar, S. K.

    1991-10-01

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequence of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconductor supercollider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  5. Computer applications for engineering/structural analysis

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-01-01

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconductor supercollider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  6. Bicycle accidents often cause disability--an analysis of medical and social consequences of nonfatal bicycle accidents.

    PubMed

    Olkkonen, S; Lahdenranta, U; Slätis, P; Honkanen, R

    1993-06-01

    Social and medical consequences of 278 children and 264 adults injured in bicycle accidents and seen in two hospitals in Helsinki in 1985-86 were analyzed. Information was collected from patient records, by means of a special questionnaire and by telephone interview. A child outpatient required 1.7 and a child inpatient 3.0 physician visits on average, while adults required 2.2 and 4.9 visits, respectively. The average duration of hospital stay was 8 days for hospitalized adults and 6 days for children. Rehabilitative care outside the hospital was received by 6% of the adult outpatients and 25% of the inpatients, but none of the injured children. The mean duration of work disability was 82 days among inpatients, 11 days among outpatients, 127 days among the inpatients injured in motor vehicle collisions and 65 days among inpatients injured in other bicycle accidents. Of the inpatients 32% and of the outpatients 5% reported persistent (> 6 months) disability. Persistent disability was recorded in 11% of children, in 47% of adults and in 67% of elderly inpatients. The most serious consequences were due to intracranial injuries in motor vehicle-bicycle collisions. Of the hospitalized bicyclists 4% suffered from severe cognitive and behavioural changes or sense impairment, and 3% of adult inpatients suffered from permanent work disability. The average costs of health and social services were about FIM 1000 per adult outpatient and FIM 13000 per adult inpatient. In prevention, high priority should be given to motor vehicle collisions, head injuries and injuries among elderly bicyclists. PMID:8367689

  7. NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S; Glaessgen, Edward H.; Mason, Brian H; Krishnamurthy, Thiagarajan; Davila, Carlos G

    2005-01-01

    A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately. The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a

  8. IUE Data Analysis Software for Personal Computers

    NASA Technical Reports Server (NTRS)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton, P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  9. Computer Programming in a Spatial Analysis Course.

    ERIC Educational Resources Information Center

    Gesler, Wilbert; Kaplan, Abram

    1993-01-01

    Contends that students in spatial analysis courses generally are familiar with computer use and programs but lack basic computer programing skills. Describes four exercises in which students learn programing using BASIC and dBASE. Asserts that programming exercises help students clarify concepts, understand the rationale behind calculations, use…

  10. Discourse Analysis of Teaching Computing Online

    ERIC Educational Resources Information Center

    Bower, Matt

    2009-01-01

    This paper analyses the teaching and learning of computing in a Web-conferencing environment. A discourse analysis of three introductory programming learning episodes is presented to demonstrate issues and effects that arise when teaching computing using such an approach. The subject of discussion, the interactive nature of discussion and any…

  11. A Computer Language for ECG Contour Analysis

    PubMed Central

    McConnochie, John W.

    1982-01-01

    The purpose of this paper is to demonstrate constructively that criteria for ECG contour analysis can be interpreted directly by a computer. Thereby, the programming task is greatly reduced. Direct interpretation is achieved by the creation of a computer language that is well-suited for the expression of such criteria. Further development of the language is planned.

  12. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on one-dimensional injury variables, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), considered one at a time in relation to various traffic accident factors. Such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as the injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with more severe than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. PMID:21094332

  13. Development of integrated core disruptive accident analysis code for FBR - ASTERIA-FBR

    SciTech Connect

    Ishizu, T.; Endo, H.; Tatewaki, I.; Yamamoto, T.; Shirakawa, N.

    2012-07-01

    The evaluation of the consequences of severe accidents is the most important safety licensing issue for the core of a liquid metal cooled fast breeder reactor (LMFBR), since the LMFBR core is not in an optimum condition from the viewpoint of reactivity. This characteristic might induce a super-prompt criticality due to core geometry change during a core disruptive accident (CDA). Previous CDA analysis codes have been modeled in plural phases depending on the mechanism driving a super-prompt criticality, and the subsequent event is then calculated by connecting different codes. This scheme, however, introduces uncertainty and/or arbitrariness into the calculation results. To resolve these issues and obtain consistent calculation results without arbitrariness, JNES is developing the ASTERIA-FBR code for the purpose of providing the cross-check analysis code, which is another required scheme to confirm the validity of the evaluation results prepared by applicants, in the safety licensing procedure of the planned high-performance core of Monju. ASTERIA-FBR consists of three major calculation modules: CONCORD, dynamic-GMVP, and FEMAXI-FBR. CONCORD is a three-dimensional thermal-hydraulics calculation module with a multi-phase, multi-component, multi-velocity field model. Dynamic-GMVP is a space-time neutronics calculation module. FEMAXI-FBR calculates the fuel pellet deformation behavior and fuel pin failure behavior. This paper describes the need for ASTERIA-FBR development, outlines the major modules, and reports the model validation status. (authors)

  14. Analysis of Occupational Accident Fatalities and Injuries Among Male Group in Iran Between 2008 and 2012

    PubMed Central

    Alizadeh, Seyed Shamseddin; Mortazavi, Seyed Bagher; Sepehri, Mohammad Mehdi

    2015-01-01

    Background: Because of occupational accidents, permanent disabilities and deaths occur and economic and workday losses emerge. Objectives: The purpose of the present study was to investigate the factors responsible for occupational accidents that occurred in Iran. Patients and Methods: The current study analyzed 1464 occupational accidents recorded by the Ministry of Labor and Social Affairs’ offices in Iran during 2008 - 2012. At first, a general understanding of the accidents was obtained using descriptive statistics. Afterwards, the chi-square test and Cramer’s V statistic (Vc) were used to determine the association between factors influencing the type of injury as occupational accident outcomes. Results: There was no significant association between marital status and time of day with the type of injury. However, activity sector, cause of accident, victim’s education, age of victim and victim’s experience were significantly associated with the type of injury. Conclusions: Successful accident prevention relies largely on knowledge about the causes of accidents. In any accident control activity, particularly in occupational accidents, correctly identifying high-risk groups and factors influencing accidents is the key to successful interventions. Results of this study can increase accident awareness and enable workplace management to select and prioritize problem areas and safety system weaknesses in workplaces. PMID:26568848
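
    The association measures named in the abstract, the chi-square test and Cramer's V, can be reproduced from any cross-tabulation of an accident factor against injury type; the sketch below uses a small hypothetical contingency table rather than the Iranian dataset:

      import numpy as np
      from scipy.stats import chi2_contingency

      # Hypothetical counts: rows = activity sector, columns = injury type.
      table = np.array([[120,  45,  30],
                        [ 60,  80,  25],
                        [ 40,  35,  90]])

      chi2, p, dof, expected = chi2_contingency(table)
      n = table.sum()
      cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
      print(f"chi2 = {chi2:.1f}, p = {p:.3g}, Cramer's V = {cramers_v:.2f}")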

  15. A Content Analysis of News Media Coverage of the Accident at Three Mile Island.

    ERIC Educational Resources Information Center

    Stephens, Mitchell; Edison, Nadyne G.

    A study was conducted for the President's Commission on the Accident at Three Mile Island to analyze coverage of the accident by ten news organizations: two wire services, three commercial television networks, and five daily newspapers. Copies of all stories and transcripts of news programs during the first week of the accident were examined from…

  16. Computer aided analysis of phonocardiogram.

    PubMed

    Singh, J; Anand, R S

    2007-01-01

    In the present paper, an analysis of phonocardiogram (PCG) records is presented. The analysis has been carried out in both time and frequency domains with the aim of detecting certain correlations between the time and frequency domain representations of PCG. The analysis is limited to the first and second heart sounds (S1 and S2) only. In the time domain analysis the moving window averaging technique is used to determine the occurrence of S1 and S2, which helps in determination of the cardiac interval and the absolute and relative time duration of individual S1 and S2, as well as the absolute and relative duration between them. In the frequency domain, fast Fourier transform (FFT) of the complete PCG record, and short time Fourier transform (STFT) and wavelet transform of individual heart sounds have been carried out. The frequency domain analysis gives an idea about the dominant frequency components in individual records and the frequency spectrum of individual heart sounds. A comparative observation on both the analyses gives some correlation between time domain and frequency domain representations of PCG. PMID:17701776
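
    The two parts of the method described above, moving-window averaging to locate S1 and S2 in time and Fourier analysis for the dominant frequency content, can be sketched in a few lines; the signal here is synthetic and merely stands in for a digitized PCG record:

      import numpy as np

      fs = 2000                                    # assumed sampling rate, Hz
      t = np.arange(0, 2.0, 1.0 / fs)
      # Synthetic PCG: two bursts per cardiac cycle standing in for S1 and S2.
      pcg = (np.sin(2 * np.pi * 60 * t) * (np.mod(t, 0.8) < 0.10) +
             0.6 * np.sin(2 * np.pi * 90 * t) * (np.abs(np.mod(t, 0.8) - 0.35) < 0.04))

      # Moving-window average of the rectified signal gives a smooth energy envelope.
      win = int(0.02 * fs)                         # 20 ms window
      envelope = np.convolve(np.abs(pcg), np.ones(win) / win, mode="same")
      sound_mask = envelope > 0.2 * envelope.max() # crude S1/S2 occurrence detector

      # Frequency domain: dominant component of the whole record.
      spectrum = np.abs(np.fft.rfft(pcg))
      freqs = np.fft.rfftfreq(len(pcg), 1.0 / fs)
      print("dominant frequency: %.0f Hz" % freqs[spectrum.argmax()])
      print("fraction of samples inside heart sounds: %.2f" % sound_mask.mean())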

  17. Radiological Safety Analysis Computer Program

    2001-08-28

    RSAC-6 is the latest version of the RSAC program. It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from ground surface and plume gamma pathways is calculated. New and exciting updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, an updated and enhanced radionuclide inventory and inclusion of the dose-conversion factors from FGR 11 and 12.
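
    RSAC-6 itself is a licensed code, but the downwind-dispersion and inhalation-dose step it performs can be illustrated with a textbook ground-level Gaussian plume; the release quantity, dispersion coefficients and dose conversion factor below are assumed illustrative values, not RSAC data:

      import numpy as np

      def chi_over_q(sigma_y, sigma_z, u, h=0.0):
          """Ground-level centerline atmospheric dilution factor, s/m^3."""
          return np.exp(-h**2 / (2 * sigma_z**2)) / (np.pi * sigma_y * sigma_z * u)

      # Assumed release and receptor parameters (illustrative only).
      Q = 3.7e10                      # Bq released, treated as a short puff
      u = 2.0                         # wind speed, m/s
      sigma_y, sigma_z = 35.0, 18.0   # dispersion coefficients at the receptor, m
      breathing = 3.3e-4              # breathing rate, m^3/s
      dcf = 5.0e-8                    # Sv/Bq inhaled (assumed nuclide-dependent value)

      time_integrated_conc = Q * chi_over_q(sigma_y, sigma_z, u)   # Bq-s/m^3
      dose = time_integrated_conc * breathing * dcf                # Sv
      print(f"inhalation dose at receptor: {dose:.2e} Sv")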

  18. Effects of improved modeling on best estimate BWR severe accident analysis

    SciTech Connect

    Hyman, C.R.; Ott, L.J.

    1984-01-01

    Since 1981, ORNL has completed best estimate studies analyzing several dominant BWR accident scenarios. These scenarios were identified by early Probabilistic Risk Assessment (PRA) studies and detailed ORNL analysis complements such studies. In performing these studies, ORNL has used the MARCH code extensively. ORNL investigators have identified several deficiencies in early versions of MARCH with regard to BWR modeling. Some of these deficiencies appear to have been remedied by the most recent release of the code. It is the purpose of this paper to identify several of these deficiencies. All the information presented concerns the degraded core thermal/hydraulic analysis associated with each of the ORNL studies. This includes calculations of the containment response. The period of interest is from the time of permanent core uncovery to the end of the transient. Specific objectives include the determination of the extent of core damage and timing of major events (i.e., onset of Zr/H2O reaction, initial clad/fuel melting, loss of control blade structure, etc.). As mentioned previously the major analysis tool used thus far was derived from an early version of MARCH. BWRs have unique features which must be modeled for best estimate severe accident analysis. ORNL has developed and incorporated into its version of MARCH several improved models. These include (1) channel boxes and control blades, (2) SRV actuations, (3) vessel water level, (4) multi-node analysis of in-vessel water inventory, (5) comprehensive hydrogen and water properties package, (6) first order correction to the ideal gas law, and (7) separation of fuel and cladding. Ongoing and future modeling efforts are required. These include (1) detailed modeling for the pressure suppression pool, (2) incorporation of B4C/steam reaction models, (3) phenomenological model of corium mass transport, and (4) advanced corium/concrete interaction modeling. 10 references, 17 figures, 1 table.

  19. Distributed computing and nuclear reactor analysis

    SciTech Connect

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-03-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations.

  20. Narrative text analysis of accident reports with tractors, self-propelled harvesting machinery and materials handling machinery in Austrian agriculture from 2008 to 2010 - a comparison.

    PubMed

    Mayrhofer, Hannes; Quendler, Elisabeth; Boxberger, Josef

    2014-01-01

    The aim of this study was the identification of accident scenarios and causes by analysing existing accident reports of recognized agricultural occupational accidents with tractors, self-propelled harvesting machinery and materials handling machinery from 2008 to 2010. As a result of a literature-based evaluation of past accident analyses, narrative text analysis was chosen as an appropriate method. A narrative analysis of the text fields of accident reports that farmers used to report accidents to insurers was conducted to obtain detailed information about the scenarios and causes of accidents. This narrative analysis of reports was conducted for the first time and yielded initial insights for identifying antecedents of accidents and potential opportunities for technically based intervention. A literature and internet search was done to discuss and confirm the findings. The narrative text analysis showed that in more than one third of the accidents with tractors and materials handling machinery the vehicle rolled or tipped over. The most relevant accident scenarios with harvesting machinery were being trapped and falling down. The direct comparison of the analysed machinery categories showed that more than 10% of the accidents in each category were caused by technical faults, slippery or muddy terrain and incorrect or inappropriate operation of the vehicle. Accidents with tractors, harvesting machinery and materials handling machinery showed similarities in terms of causes, circumstances and consequences. Certain technical and communicative measures for accident prevention could be used for all three machinery categories. Nevertheless, some individual solutions for accident prevention, which suit each specific machine type, would be necessary. PMID:24738521

  1. Computer aided cogeneration feasibility analysis

    SciTech Connect

    Anaya, D.A.; Caltenco, E.J.L.; Robles, L.F.

    1996-12-31

    A successful cogeneration system design depends on several factors, and the optimal configuration can be found using steam and power simulation software. The key characteristics of one such software package are described below, and its application to a process plant cogeneration feasibility analysis is shown in this paper. Finally, a case study is illustrated. 4 refs., 2 figs.

  2. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  3. Analysis of Kuosheng Large-Break Loss-of-Coolant Accident with MELCOR 1.8.4

    SciTech Connect

    Wang, T.-C.; Wang, S.-J.; Chien, C.-S

    2000-09-15

    The MELCOR code, developed by Sandia National Laboratories, is capable of simulating the severe accident phenomena of light water reactor nuclear power plants (NPPs). A specific large-break loss-of-coolant accident (LOCA) for Kuosheng NPP is simulated with the use of the MELCOR 1.8.4 code. This accident is induced by a double-ended guillotine break of one of the recirculation pipes concurrent with complete failure of the emergency core cooling system. The MELCOR input deck for the Kuosheng NPP is established based on the design data of the Kuosheng NPP and the MELCOR users' guides. The initial steady-state conditions are generated with a developed self-initialization algorithm. The effect of the MELCOR 1.8.4-provided initialization process is demonstrated. The main severe accident phenomena and the corresponding fission product released fractions associated with the large-break LOCA sequences are simulated. The MELCOR 1.8.4 predicts a longer time interval between the core collapse and vessel failure and a higher source term. This MELCOR 1.8.4 input deck will be applied to the probabilistic risk assessment, the severe accident analysis, and the severe accident management study of the Kuosheng NPP in the near future.

  4. Electrical equipment performance under severe accident conditions (BWR/Mark 1 plant analysis): Summary report

    SciTech Connect

    Bennett, P.R.; Kolaczkowski, A.M.; Medford, G.T.

    1986-09-01

    The purpose of the Performance Evaluation of Electrical Equipment during Severe Accident States Program is to determine the performance of electrical equipment, important to safety, under severe accident conditions. In FY85, a method was devised to identify important electrical equipment and the severe accident environments in which the equipment was likely to fail. This method was used to evaluate the equipment and severe accident environments for Browns Ferry Unit 1, a BWR/Mark I. Following this work, a test plan was written in FY86 to experimentally determine the performance of one selected component in two severe accident environments.

  5. Analysis of station blackout accidents for the Bellefonte pressurized water reactor

    SciTech Connect

    Gasser, R D; Bieniarz, P P; Tills, J L

    1986-09-01

    An analysis has been performed for the Bellefonte PWR Unit 1 to determine the containment loading and the radiological releases into the environment from a station blackout accident. A number of issues have been addressed in this analysis which include the effects of direct heating on containment loading, and the effects of fission product heating and natural convection on releases from the primary system. The results indicate that direct heating which involves more than about 50% of the core can fail the Bellefonte containment, but natural convection in the RCS may lead to overheating and failure of the primary system piping before core slump, thus, eliminating or mitigating direct heating. Releases from the primary system are significantly increased before vessel breach due to natural circulation and after vessel breach due to reevolution of retained fission products by fission product heating of RCS structures.

  6. Analysis of the SL-1 Accident Using RELAP5-3D

    SciTech Connect

    Francisco, A.D. and Tomlinson, E. T.

    2007-11-08

    On January 3, 1961, at the National Reactor Testing Station, in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people, and destroying the reactor core. The SL-1 reactor, a 3 MW(t) boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are only known with a high level of uncertainty).
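
    The excursion analyzed above was driven by a super-prompt-critical reactivity insertion. As a greatly simplified stand-in for the RELAP5-3D neutronics, the sketch below integrates one-delayed-group point kinetics for an assumed step insertion; the parameter values are illustrative and are not the SL-1 analysis inputs:

      # One-delayed-group point kinetics with an assumed super-prompt step insertion.
      beta, lam, Lambda = 0.0070, 0.08, 1.0e-4   # delayed fraction, decay constant (1/s), generation time (s)
      rho = 0.020                                # assumed step reactivity (> beta: super-prompt critical)
      dt, t_end = 1.0e-6, 0.05                   # time step and simulated interval, s
      n, C = 1.0, beta / (lam * Lambda)          # normalized power and equilibrium precursor concentration

      steps = int(t_end / dt)
      for _ in range(steps):
          dn = ((rho - beta) / Lambda * n + lam * C) * dt
          dC = (beta / Lambda * n - lam * C) * dt
          n, C = n + dn, C + dC

      print(f"power after {t_end * 1e3:.0f} ms: {n:.3e} x initial")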

  7. Model of the reliability analysis of the distributed computer systems with architecture "client-server"

    NASA Astrophysics Data System (ADS)

    Kovalev, I. V.; Zelenkov, P. V.; Karaseva, M. V.; Tsarev, M. Yu; Tsarev, R. Yu

    2015-01-01

    The paper considers the problem of reliability analysis of distributed computer systems with a client-server architecture. A distributed computer system is a set of hardware and software implementing the following main functions: processing, storage, transmission and protection of data. This paper discusses the "client-server" architecture of distributed computer systems. The paper presents a scheme of distributed computer system functioning represented as a graph whose vertices are the functional states of the system and whose arcs are transitions from one state to another depending on the prevailing conditions. In the reliability analysis we consider reliability indicators such as the probability of the system transitioning into the stopping state or an accident state, as well as the intensities of these transitions. The proposed model allows relations for the reliability parameters of the distributed computer system to be obtained without any assumptions about the distribution laws of the random variables or the number of elements in the system.
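
    The state-graph description above can be turned into numbers in several ways; one common stand-in, used here only for illustration, is a continuous-time Markov chain with assumed constant transition intensities (the cited model explicitly avoids such distributional assumptions):

      import numpy as np
      from scipy.linalg import expm

      # States: 0 = operational, 1 = stopping state, 2 = accident (absorbing).
      # Assumed constant transition intensities (1/h); each row of the generator sums to zero.
      Q = np.array([[-0.011,  0.010, 0.001],
                    [ 0.500, -0.520, 0.020],
                    [ 0.000,  0.000, 0.000]])

      T = 1000.0                       # mission time, hours
      p0 = np.array([1.0, 0.0, 0.0])   # start in the operational state
      pT = p0 @ expm(Q * T)            # transient state probabilities p(T) = p(0) exp(QT)
      print(f"P(stopping state at T) = {pT[1]:.4f}, P(accident by T) = {pT[2]:.4f}")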

  8. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  9. DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS

    SciTech Connect

    Wu, T

    2008-04-30

    Large fuel casks present challenges when evaluating their performance in the Hypothetical Accident Conditions (HAC) specified in the Code of Federal Regulations Title 10 part 71 (10CFR71). Testing is often limited by cost, difficulty in preparing test units and the limited availability of facilities which can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the hypothetical accident conditions of a 30-foot lateral drop followed by a 40-inch lateral puncture as specified in 10CFR71. The structural integrity of the containment vessel is justified based on the analytical results in comparison with the stress criteria, specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damages caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture are compared with the package test data. The analytical results are in good agreement with the test results.

  10. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  11. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  12. Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors

    SciTech Connect

    Pate-Cornell, M.E.

    1993-04-01

    The accident that occurred on board the offshore platform Piper Alpha in July 1988 killed 167 people and cost billions of dollars in property damage. It was caused by a massive fire, which was not the result of an unpredictable 'act of God' but of an accumulation of errors and questionable decisions. Most of them were rooted in the organization, its structure, procedures, and culture. This paper analyzes the accident scenario using the risk analysis framework, determines which human decisions and actions influenced the occurrence of the basic events, and then identifies the organizational roots of these decisions and actions. These organizational factors are generalizable to other industries and engineering systems. They include flaws in the design guidelines and design practices (e.g., tight physical couplings or insufficient redundancies), misguided priorities in the management of the tradeoff between productivity and safety, mistakes in the management of the personnel on board, and errors of judgement in the process by which financial pressures are applied on the production sector (i.e., the oil companies' definition of profit centers) resulting in deficiencies in inspection and maintenance operations. This analytical approach allows identification of risk management measures that go beyond the purely technical (e.g., add redundancies to a safety system) and also include improvements of management practices. 18 refs., 4 figs.

  13. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  14. Review of the status of validation of the computer codes used in the severe accident source term reassessment study (BMI-2104). [PWR; BWR]

    SciTech Connect

    Kress, T. S.

    1985-04-01

    The determination of severe accident source terms must, by necessity it seems, rely heavily on the use of complex computer codes. Source term acceptability, therefore, rests on the assessed validity of such codes. Consequently, one element of NRC's recent efforts to reassess LWR severe accident source terms is to provide a review of the status of validation of the computer codes used in the reassessment. The results of this review are the subject of this document. The separate review documents compiled in this report were used as a resource along with the results of the BMI-2104 study by BCL and the QUEST study by SNL to arrive at a more-or-less independent appraisal of the status of source term modeling at this time.

  15. Statistical energy analysis computer program, user's guide

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1981-01-01

    A high frequency random vibration analysis, (statistical energy analysis (SEA) method) is examined. The SEA method accomplishes high frequency prediction of arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and complete program listing are presented.
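
    At its core, SEA balances the power injected into each subsystem against internal dissipation and coupling losses. The sketch below solves the steady-state two-subsystem power balance for the subsystem energies, with assumed loss factors that are not taken from the referenced program:

      import numpy as np

      omega = 2 * np.pi * 1000.0        # band centre frequency, rad/s
      eta1, eta2 = 0.01, 0.02           # internal (damping) loss factors, assumed
      eta12, eta21 = 0.003, 0.001       # coupling loss factors, assumed
      P = np.array([1.0, 0.0])          # band-limited input power, W (only subsystem 1 driven)

      # Power balance: P_i = omega * (eta_i + eta_ij) E_i - omega * eta_ji * E_j
      A = omega * np.array([[eta1 + eta12, -eta21],
                            [-eta12,        eta2 + eta21]])
      E = np.linalg.solve(A, P)         # subsystem vibrational energies, J
      print("subsystem energies:", E)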

  16. Use of Some "Discriminant Analysis" Computer Programs

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    1977-01-01

    The objective of this paper is to review the outputs of selected computer programs often used to carry out a "discriminant analysis" with respect to two purposes of such an analysis, discrimination and classification. The programs selected are three BMD programs. (Author/JKS)

  17. A computational image analysis glossary for biologists.

    PubMed

    Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M

    2012-09-01

    Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies. PMID:22872081

  18. Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.

    PubMed

    Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

    2015-01-01

    Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method based on the concepts of task and accident mechanisms for an initial risk assessment by taking into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. By using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified by using those variables included in the European Statistics of Accidents at Work methodology. As main results, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low and the only accident mechanisms with medium risk are accidents when lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling for tasks involving movements. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended. PMID:25179119
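
    The semi-quantitative scheme described above combines the prevalence of each accident mechanism within a task (likelihood) with the share of serious outcomes (severity) into a risk category; the sketch below reproduces that logic on hypothetical counts rather than the Andalusian data:

      # Hypothetical counts for one maintenance task, by accident mechanism.
      mechanisms = {
          "physical stress when lifting/pushing": {"total": 420, "serious_or_fatal": 6},
          "impact after slipping or stumbling":   {"total": 310, "serious_or_fatal": 4},
          "contact with sharp agent":             {"total": 150, "serious_or_fatal": 1},
      }
      task_accidents = sum(m["total"] for m in mechanisms.values())

      def band(value, cuts):
          """Map a number onto 1..len(cuts)+1 using ascending cut points."""
          return sum(value > c for c in cuts) + 1

      for name, m in mechanisms.items():
          likelihood = band(m["total"] / task_accidents, [0.10, 0.30])        # 1 = low .. 3 = high
          severity = band(m["serious_or_fatal"] / m["total"], [0.005, 0.02])  # 1 = low .. 3 = high
          risk = likelihood * severity
          label = "low" if risk <= 2 else "medium" if risk <= 4 else "high"
          print(f"{name}: likelihood {likelihood}, severity {severity}, risk {label}")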

  19. Automating sensitivity analysis of computer models using computer calculus

    SciTech Connect

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs.
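
    GRESS generates derivative code by transforming FORTRAN source. The underlying idea can be seen without a compiler through forward-mode automatic differentiation with dual numbers; the sketch below is a conceptual stand-in, not the GRESS implementation:

      import math

      class Dual:
          """Value plus first derivative, propagated through arithmetic."""
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
          __rmul__ = __mul__

      def dual_exp(x):
          return Dual(math.exp(x.val), math.exp(x.val) * x.der)

      # Sensitivity of a hypothetical model response r(k) = k*exp(k) + 2*k at k = 1.5.
      k = Dual(1.5, 1.0)                 # seed derivative dk/dk = 1
      r = k * dual_exp(k) + 2 * k
      print(f"r = {r.val:.4f}, dr/dk = {r.der:.4f}")   # analytic dr/dk = (1 + k) e^k + 2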

  20. The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2012-01-01

    In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities are actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model is comprised of causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human related causal factors. The preliminary results from the baseline LOCAF model are also presented.
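
    Computationally, a Bayesian belief network of this kind reduces to products of conditional probability tables; the toy network below, with hypothetical nodes and probabilities rather than the LOCAF model, shows how a marginal LOC probability is obtained by enumeration:

      from itertools import product

      # Toy parents of LOC: a crew-error factor (HFACS-style) and a system failure.
      p_crew_error = 0.02
      p_system_failure = 0.005
      # Hypothetical CPT: P(LOC | crew_error, system_failure)
      p_loc = {(True, True): 0.30, (True, False): 0.03,
               (False, True): 0.05, (False, False): 0.0001}

      total = 0.0
      for crew, system in product([True, False], repeat=2):
          p_parents = ((p_crew_error if crew else 1 - p_crew_error) *
                       (p_system_failure if system else 1 - p_system_failure))
          total += p_parents * p_loc[(crew, system)]
      print(f"marginal P(LOC) = {total:.2e}")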

  1. Computer aided nonlinear electrical networks analysis

    NASA Technical Reports Server (NTRS)

    Slapnicar, P.

    1977-01-01

    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.

  2. Modeling & analysis of criticality-induced severe accidents during refueling for the Advanced Neutron Source Reactor

    SciTech Connect

    Georgevich, V.; Kim, S.H.; Taleyarkhan, R.P.; Jackson, S.

    1992-10-01

    This paper describes work done at the Oak Ridge National Laboratory (ORNL) for evaluating the potential and resulting consequences of a hypothetical criticality accident during refueling of the 330-MW Advanced Neutron Source (ANS) research reactor. The development of an analytical capability is described. Modeling and problem formulation were conducted using concepts of reactor neutronic theory for determining power level escalation, coupled with ORIGEN and MELCOR code simulations for radionuclide buildup and containment transport. Gaussian plume transport modeling was done for determining off-site radiological consequences. Nuances associated with modeling this blast-type scenario are described. Analysis results for ANS containment response under a variety of postulated scenarios and containment failure modes are presented. It is demonstrated that individuals at the reactor site boundary will not receive doses beyond regulatory limits for any of the containment configurations studied.

  3. Multifractal analysis of the 137Cs fallout pattern in Austria resulting from the Chernobyl accident.

    PubMed

    Pausch, G; Bossew, P; Hofmann, W; Steger, F

    1998-06-01

    The cumulative deposition of the 137Cs fallout in Austria resulting from the passage of the Chernobyl cloud has been investigated by applying correlation dimension and hyperbolic frequency distribution methods. For the analysis, a total of 1,881 deposition values were used, which were collected by the Federal Environmental Agency of Austria and the Federal Ministry of Health, representing all available measurements of 137Cs in soil made in Austria after the Chernobyl accident. From these data a hyperbolic exponent for the frequency distribution of 4.0 and a set of fractal correlation dimensions, which decrease from 1.426 +/- 0.022 (for the whole network) to 0.706 +/- 0.047 (for 137Cs values ≥ 100 kBq m⁻²), were derived, thus confirming that the fallout pattern can be described as a multifractal. PMID:9600299
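
    The correlation dimension quoted above is the slope of log C(r) versus log r for the correlation integral of the sampling-point set. A Grassberger-Procaccia style sketch on synthetic two-dimensional coordinates (not the Austrian monitoring network):

      import numpy as np

      rng = np.random.default_rng(0)
      pts = rng.random((800, 2))                      # synthetic site coordinates in a unit square

      # Correlation integral C(r): fraction of point pairs closer than r.
      d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
      pair_d = d[np.triu_indices(len(pts), k=1)]
      radii = np.logspace(-2, -0.5, 12)
      C = np.array([(pair_d < r).mean() for r in radii])

      # Correlation dimension = slope of log C versus log r in the scaling region.
      slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
      print(f"estimated correlation dimension: {slope:.2f}")   # about 2 for uniform 2-D points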

  4. PTSD Symptom Severity and Psychiatric Comorbidity in Recent Motor Vehicle Accident Victims: A Latent Class Analysis

    PubMed Central

    Hruska, Bryce; Irish, Leah A.; Pacella, Maria L.; Sledjeski, Eve M.; Delahanty, Douglas L.

    2014-01-01

    We conducted a latent class analysis (LCA) on 249 recent motor vehicle accident (MVA) victims to examine subgroups that differed in posttraumatic stress disorder (PTSD) symptom severity, current major depressive disorder and alcohol/other drug use disorders (MDD/AoDs), gender, and interpersonal trauma history 6-weeks post-MVA. A 4-class model best fit the data with a resilient class displaying asymptomatic PTSD symptom levels/low levels of comorbid disorders; a mild psychopathology class displaying mild PTSD symptom severity and current MDD; a moderate psychopathology class displaying severe PTSD symptom severity and current MDD/AoDs; and a severe psychopathology class displaying extreme PTSD symptom severity and current MDD. Classes also differed with respect to gender composition and history of interpersonal trauma experience. These findings may aid in the development of targeted interventions for recent MVA victims through the identification of subgroups distinguished by different patterns of psychiatric problems experienced 6-weeks post-MVA. PMID:25124501

  5. PTSD symptom severity and psychiatric comorbidity in recent motor vehicle accident victims: a latent class analysis.

    PubMed

    Hruska, Bryce; Irish, Leah A; Pacella, Maria L; Sledjeski, Eve M; Delahanty, Douglas L

    2014-10-01

    We conducted a latent class analysis (LCA) on 249 recent motor vehicle accident (MVA) victims to examine subgroups that differed in posttraumatic stress disorder (PTSD) symptom severity, current major depressive disorder and alcohol/other drug use disorders (MDD/AoDs), gender, and interpersonal trauma history 6-weeks post-MVA. A 4-class model best fit the data with a resilient class displaying asymptomatic PTSD symptom levels/low levels of comorbid disorders; a mild psychopathology class displaying mild PTSD symptom severity and current MDD; a moderate psychopathology class displaying severe PTSD symptom severity and current MDD/AoDs; and a severe psychopathology class displaying extreme PTSD symptom severity and current MDD. Classes also differed with respect to gender composition and history of interpersonal trauma experience. These findings may aid in the development of targeted interventions for recent MVA victims through the identification of subgroups distinguished by different patterns of psychiatric problems experienced 6-weeks post-MVA. PMID:25124501

  6. Visualization of Traffic Accidents

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Shen, Yuzhong; Khattak, Asad

    2010-01-01

    Traffic accidents have tremendous impact on society. Annually approximately 6.4 million vehicle accidents are reported by police in the US and nearly half of them result in catastrophic injuries. Visualizations of traffic accidents using geographic information systems (GIS) greatly facilitate handling and analysis of traffic accidents in many aspects. Environmental Systems Research Institute (ESRI), Inc. is the world leader in GIS research and development. ArcGIS, a software package developed by ESRI, has the capabilities to display events associated with a road network, such as accident locations, and pavement quality. But when event locations related to a road network are processed, the existing algorithm used by ArcGIS does not utilize all the information related to the routes of the road network and produces erroneous visualization results of event locations. This software bug causes serious problems for applications in which accurate location information is critical for emergency responses, such as traffic accidents. This paper aims to address this problem and proposes an improved method that utilizes all relevant information of traffic accidents, namely, route number, direction, and mile post, and extracts correct event locations for accurate traffic accident visualization and analysis. The proposed method generates a new shape file for traffic accidents and displays them on top of the existing road network in ArcGIS. Visualization of traffic accidents along Hampton Roads Bridge Tunnel is included to demonstrate the effectiveness of the proposed method.
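
    The fix described above amounts to correct linear referencing: locating an event along a route from its route identifier, direction and milepost before plotting it. The sketch below interpolates a crash location from a milepost on a hypothetical route geometry; it is not a description of the ArcGIS internals:

      import bisect

      # Route centreline as (milepost, x, y) vertices, in increasing milepost order (hypothetical).
      route = [(0.0, 0.0, 0.0), (1.2, 1.0, 0.5), (2.7, 2.2, 0.9), (4.0, 3.5, 1.0)]

      def locate(milepost, route, direction="EB"):
          """Interpolate the coordinate of an event from its milepost.
          For the opposing direction the milepost is measured from the far end."""
          length = route[-1][0]
          mp = milepost if direction == "EB" else length - milepost
          mps = [v[0] for v in route]
          i = max(1, bisect.bisect_left(mps, mp))
          (m0, x0, y0), (m1, x1, y1) = route[i - 1], route[min(i, len(route) - 1)]
          f = 0.0 if m1 == m0 else (mp - m0) / (m1 - m0)
          return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))

      print(locate(1.0, route))            # accident at milepost 1.0, eastbound
      print(locate(1.0, route, "WB"))      # same milepost reported in the opposing direction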

  7. Computer aided stress analysis of long bones utilizing computer tomography

    SciTech Connect

    Marom, S.A.

    1986-01-01

    A computer aided analysis method, utilizing computed tomography (CT), has been developed, which together with a finite element program determines the stress-displacement pattern in a long bone section. The CT data file provides the geometry, the density and the material properties for the generated finite element model. A three-dimensional finite element model of a tibial shaft is automatically generated from the CT file by a pre-processing procedure for a finite element program. The developed pre-processor includes an edge detection algorithm which determines the boundaries of the reconstructed cross-sectional images of the scanned bone. A mesh generation procedure then automatically generates a three-dimensional mesh of a user-selected refinement. The elastic properties needed for the stress analysis are individually determined for each model element using the radiographic density (CT number) of each pixel within the elemental borders. The elastic modulus is determined from the CT radiographic density by using an empirical relationship from the literature. The generated finite element model, together with applied loads, determined from existing gait analysis and initial displacements, comprise a formatted input for the SAP IV finite element program. The output of this program, stresses and displacements at the model elements and nodes, is sorted and displayed by a developed post-processor to provide maximum and minimum values at selected locations in the model.
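
    The material-mapping step, CT number to apparent density to elastic modulus through an empirical relationship, is easy to sketch; the coefficients below are placeholders of the right general form and are not the relationship used in the cited work:

      import numpy as np

      hu = np.array([250.0, 600.0, 1100.0, 1600.0])   # element-averaged CT numbers (HU)

      # Assumed linear HU-to-density calibration and power-law density-to-modulus relation.
      rho = 1.0 + 0.0008 * hu          # apparent density, g/cm^3 (placeholder calibration)
      E = 2.0 * rho ** 3               # elastic modulus, GPa (placeholder coefficients)

      for h, r, e in zip(hu, rho, E):
          print(f"HU {h:6.0f} -> rho {r:.2f} g/cm^3 -> E {e:5.1f} GPa")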

  8. Computer analysis of foetal monitoring signals.

    PubMed

    Nunes, Inês; Ayres-de-Campos, Diogo

    2016-01-01

    Five systems for computer analysis of foetal monitoring signals are currently available, incorporating the evaluation of cardiotocographic (CTG) or combined CTG with electrocardiographic ST data. All systems have been integrated with central monitoring stations, allowing the simultaneous monitoring of several tracings on the same computer screen in multiple hospital locations. Computer analysis elicits real-time visual and sound alerts for health care professionals when abnormal patterns are detected, with the aim of prompting a re-evaluation and subsequent clinical action, if considered necessary. Comparison between the CTG analyses provided by the computer and clinical experts has been carried out in all systems, and in three of them, the accuracy of computer alerts in predicting newborn outcomes was evaluated. Comparisons between these studies are hampered by the differences in selection criteria and outcomes. Two of these systems have just completed multicentre randomised clinical trials comparing them with conventional CTG monitoring, and their results are awaited shortly. For the time being, there is limited evidence regarding the impact of computer analysis of foetal monitoring signals on perinatal indicators and on health care professionals' behaviour. PMID:26211832

  9. Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document

    SciTech Connect

    Not Available

    1988-12-15

    The Accident Model Document (AMD) is the second volume of the three-volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.

  10. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
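
    The validation step described above (sampling from the elicited distributions and pushing the samples through the dispersion model) can be sketched with the standard ground-level, centerline Gaussian plume formula. The distributions and parameter values below are invented for illustration and are not the elicited distributions or the MACCS/COSYMA implementation.

        # Monte Carlo propagation of uncertain dispersion parameters through a
        # ground-level, centerline Gaussian plume model (illustrative values only).
        import numpy as np

        def gaussian_plume_conc(Q, u, sigma_y, sigma_z, H):
            """Ground-level centerline concentration for a continuous point release."""
            return Q / (np.pi * u * sigma_y * sigma_z) * np.exp(-H**2 / (2.0 * sigma_z**2))

        rng = np.random.default_rng(0)
        n = 10_000
        Q = 1.0                                                     # source strength [Bq/s], fixed
        u = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=n)      # wind speed [m/s]
        sigma_y = rng.lognormal(np.log(200.0), 0.4, n)              # lateral spread [m]
        sigma_z = rng.lognormal(np.log(50.0), 0.5, n)               # vertical spread [m]
        H = 30.0                                                    # effective release height [m]

        conc = gaussian_plume_conc(Q, u, sigma_y, sigma_z, H)
        print(np.percentile(conc, [5, 50, 95]))                     # uncertainty band on concentration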

  11. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  12. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  13. A comparative analysis of accident risks in fossil, hydro, and nuclear energy chains

    SciTech Connect

    Burgherr, P.; Hirschberg, S.

    2008-07-01

    This study presents a comparative assessment of severe accident risks in the energy sector, based on the historical experience of fossil (coal, oil, natural gas, and LPG (Liquefied Petroleum Gas)) and hydro chains contained in the comprehensive Energy-related Severe Accident Database (ENSAD), as well as Probabilistic Safety Assessment (PSA) for the nuclear chain. Full energy chains were considered because accidents can take place at every stage of the chain. Comparative analyses for the years 1969-2000 included a total of 1870 severe ({>=} 5 fatalities) accidents, amounting to 81,258 fatalities. Although 79.1% of all accidents and 88.9% of associated fatalities occurred in less developed, non-OECD countries, industrialized OECD countries dominated insured losses (78.0%), reflecting their substantially higher insurance density and stricter safety regulations. Aggregated indicators and frequency-consequence (F-N) curves showed that energy-related accident risks in non-OECD countries are distinctly higher than in OECD countries. Hydropower in non-OECD countries and upstream stages within fossil energy chains are most accident-prone. Expected fatality rates are lowest for Western hydropower and nuclear power plants; however, the maximum credible consequences can be very large. Total economic damages due to severe accidents are substantial, but small when compared with natural disasters. Similarly, external costs associated with severe accidents are generally much smaller than monetized damages caused by air pollution.
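
    A minimal sketch of how a frequency-consequence (F-N) curve of the kind used in this study can be tabulated from an accident list: for each consequence level N, count the accidents with at least N fatalities and divide by the years of experience. The records and the observation period below are invented, not ENSAD data.

        # Build an F-N curve from a list of per-accident fatality counts.
        import numpy as np

        fatalities = np.array([5, 7, 12, 30, 55, 120, 300])   # per-accident fatalities (invented)
        years_of_experience = 32.0                             # e.g. 1969-2000

        def fn_curve(fatalities, years):
            """Return (N, F) where F(N) is the annual frequency of accidents with >= N fatalities."""
            n_values = np.sort(np.unique(fatalities))
            f_values = np.array([(fatalities >= n).sum() / years for n in n_values])
            return n_values, f_values

        N, F = fn_curve(fatalities, years_of_experience)
        for n, f in zip(N, F):
            print(f"N >= {n:4d}: F = {f:.3f} per year")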

  14. Discrete computer analysis in petroleum geology

    SciTech Connect

    Zakharian, A.Z.

    1995-08-01

    Computer analysis should not simply imitate a geologist's work; it must follow its own approach, because geological information remains uncertain and incomplete even at a mature stage of exploration. Our system of formal discrete computer analysis, implemented in "FoxPro for Windows" with a probabilistic rather than deterministic representation of the geological situation (without drawing the usual maps), was used for picking out the sets of best points for exploration drilling in the southern part of the Dheprovsko-Donetzky oil-gas basin.

  15. Temporal fringe pattern analysis with parallel computing

    SciTech Connect

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-11-20

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution times were reduced by a factor of 1.6 when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis.
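
    A minimal sketch of the single-program multiple-data idea applied to temporal fringe data: every worker runs the same per-pixel analysis on its own block of pixel histories. The FFT-based "analysis" and the synthetic data are stand-ins for the actual fringe-processing algorithm used in the paper.

        # SPMD-style processing: identical analysis code, different data blocks per worker.
        import numpy as np
        from multiprocessing import Pool

        def analyse_block(block):
            """block: (n_pixels, n_frames) intensity histories -> dominant temporal frequency index."""
            spectra = np.abs(np.fft.rfft(block, axis=1))
            return np.argmax(spectra[:, 1:], axis=1) + 1     # skip the DC term

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            frames = rng.random((10_000, 256))               # 10k pixels, 256 time samples
            blocks = np.array_split(frames, 4)               # one block per worker
            with Pool(processes=4) as pool:
                results = pool.map(analyse_block, blocks)
            dominant = np.concatenate(results)
            print(dominant.shape)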

  16. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  17. Analysis of Sodium Fire in the Containment Building of Prototype Fast Breeder Reactor Under the Scenario of Core Disruptive Accident

    SciTech Connect

    Rao, P.M.; Kasinathan, N.; Kannan, S.E.

    2006-07-01

    The potential for sodium release to the reactor containment building from the reactor assembly during a Core Disruptive Accident (CDA) in Fast Breeder Reactors (FBR) is an important safety issue with reference to the structural integrity of the Reactor Containment Building (RCB). For the Prototype Fast Breeder Reactor (PFBR), the estimated sodium release under a CDA of 100 MJ energy release is 350 kg. The ejected sodium reacts readily with air in the RCB and causes temperature and pressure rise in the RCB. For estimating the severe thermal consequences in the RCB, different modes of sodium fires, such as pool and spray fires, were analyzed using the SOFIRE II and NACOM sodium fire computer codes. The effects of important parameters such as the amount of sodium, the area of the pool, the containment air volume and the oxygen concentration have been investigated. A peak pressure rise of 7.32 kPa is predicted by the SOFIRE II code for a 350 kg sodium pool fire in the 86,000 m³ RCB volume. For the mode in which sodium is released as a spray and the unburnt sodium then burns as a pool fire, the estimated pressure rise in the RCB is 5.85 kPa. For instantaneous combustion of the sodium, the estimated peak pressure rise is 13 kPa. (authors)
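
    As a rough plausibility check on the quoted pressure rises, the sketch below bounds the constant-volume pressure rise if all of the combustion energy went into the containment air with no losses to structures or aerosols; the heat of combustion is an assumed round number, and this is not how SOFIRE II or NACOM compute the result.

        # Back-of-envelope, upper-bound pressure rise from adiabatic constant-volume heating.
        m_na = 350.0          # kg of sodium burned
        dh_c = 10.0e6         # J/kg, assumed heat of combustion of sodium in air
        V = 86_000.0          # m^3, containment free volume
        gamma = 1.4           # ratio of specific heats for air

        Q = m_na * dh_c                       # total energy released [J]
        dP = (gamma - 1.0) * Q / V            # ideal-gas constant-volume pressure rise [Pa]
        print(f"Upper-bound pressure rise ~ {dP/1000:.1f} kPa")   # ~16 kPa

    The resulting rough 16 kPa upper bound brackets the 5.85 to 13 kPa values reported above, which account for incomplete combustion and heat losses.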

  18. Analysis of the Uniform Accident And Sickness Policy Provision Law: lessons for social work practice, policy, and research.

    PubMed

    Cochran, Gerald

    2010-01-01

    The Uniform Accident and Sickness Policy Provision Law (UPPL) is a state statute that allows insurance companies in 26 states to deny claims for accidents and injuries incurred by persons under the influence of drugs or alcohol. Serious repercussions can result for patients and health care professionals as states enforce this law. To examine differences within the laws that might facilitate amendments or reduce insurance companies' ability to deny claims, a content analysis was carried out of each state's UPPL law. Results showed no meaningful differences between each state's laws. These results indicate patients and health professionals share similar risk related to the UPPL regardless of state. PMID:20711944

  19. Defense In-Depth Accident Analysis Evaluation of Tritium Facility Bldgs. 232-H, 233-H, and 234-H

    SciTech Connect

    Blanchard, A.

    1999-05-10

    The primary purpose of this report is to document a Defense-in-Depth (DID) accident analysis evaluation for Department of Energy (DOE) Savannah River Site (SRS) Tritium Facility Buildings 232-H, 233-H, and 234-H. The purpose of a DID evaluation is to provide a more realistic view of facility radiological risks to the offsite public than the bounding deterministic analysis documented in the Safety Analysis Report, which credits only Safety Class items in the offsite dose evaluation.

  20. X ray computed tomography for failure analysis

    NASA Astrophysics Data System (ADS)

    Bossi, Richard H.; Crews, Alan R.; Georgeson, Gary E.

    1992-08-01

    Under a preliminary testing task assignment of the Advanced Development of X-Ray Computed Tomography Application program, computed tomography (CT) has been studied for its potential as a tool to assist in failure analysis investigations. CT provides three-dimensional spatial distribution of material that can be used to assess internal configurations and material conditions nondestructively. This capability has been used in failure analysis studies to determine the position of internal components and their operation. CT is particularly advantageous on complex systems, composite failure studies, and testing under operational or environmental conditions. CT plays an important role in reducing the time and effort of a failure analysis investigation. Aircraft manufacturing or logistical facilities perform failure analysis operations routinely and could be expected to reduce schedules, reduce costs and/or improve evaluation on about 10 to 30 percent of the problems they investigate by using CT.

  1. Development of posture-specific computational phantoms using motion capture technology and application to radiation dose-reconstruction for the 1999 Tokai-Mura nuclear criticality accident

    NASA Astrophysics Data System (ADS)

    Vazquez, Justin A.; Caracappa, Peter F.; Xu, X. George

    2014-09-01

    The majority of existing computational phantoms are designed to represent workers in typical standing anatomical postures with fixed arm and leg positions. However, workers found in accident-related scenarios often assume varied postures. This paper describes the development and application of two phantoms with adjusted postures specified by data acquired from a motion capture system to simulate unique human postures found in a 1999 criticality accident that took place at a JCO facility in Tokai-Mura, Japan. In the course of this accident, two workers were fatally exposed to extremely high levels of radiation. Implementation of the emergent techniques discussed produced more accurate and more detailed dose estimates for the two workers than were reported in previous studies. Total-body doses of 6.43 and 26.38 Gy were estimated for the two workers, who assumed a crouching and a standing posture, respectively. Additionally, organ-specific dose estimates were determined, including a 7.93 Gy dose to the thyroid and a 6.11 Gy dose to the stomach for the crouching worker, and a 41.71 Gy dose to the liver and a 37.26 Gy dose to the stomach for the standing worker. Implications for the medical prognosis of the workers are discussed, and the results of this study were found to correlate better with the patient outcome than previous estimates, suggesting potential future applications of such methods for improved epidemiological studies involving next-generation computational phantom tools.

  2. Computer based terrain analysis for operational planning

    SciTech Connect

    Powell, D.R.

    1987-01-01

    Analysis of operational capability is an ongoing task for military commanders. In peacetime, most analysis is conducted via computer based combat simulations, where selected force structures engage in simulated combat to gain insight into specific scenarios. The command and control (C/sup 2/) mechanisms that direct combat forces are often neglected relative to the fidelity of representation of mechanical and physical entities. C/sup 2/ capabilities should include the ability to plan a mission, monitor execution activities, and redirect combat power when appropriate. This paper discusses the development of a computer based approach to mission planning for land warfare. The aspect emphasized is the computation and representation of relevant terrain features in the context of operational planning.

  3. Final Report Computational Analysis of Dynamical Systems

    SciTech Connect

    Guckenheimer, John

    2012-05-08

    This is the final report for DOE Grant DE-FG02-93ER25164, initiated in 1993. This grant supported research of John Guckenheimer on computational analysis of dynamical systems. During that period, seventeen individuals received PhD degrees under the supervision of Guckenheimer and over fifty publications related to the grant were produced. This document contains copies of these publications.

  4. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory



    COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    T Martonen1 and J Schroeter2

    1Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and 2Curriculum in Toxicology, Unive...

  5. The Wheels of Misfortune: A Time Series Analysis of Bicycle Accidents on a College Campus.

    ERIC Educational Resources Information Center

    Johnson, Mark S.; And Others

    1978-01-01

    The effectiveness of engineering and policy interventions in reducing bicycle accidents on the campus of the University of California at Santa Barbara was investigated. None of the bikeway modifications were found to have been effective in reducing bicycle accidents. (Author/GDC)

  6. Computational analysis of a multistage axial compressor

    NASA Astrophysics Data System (ADS)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in the Aerospace, Power Generation, and Oil & Gas industries. Efficiency of these machines is often an important factor and has led to a continuous effort to improve designs to achieve better efficiency. The axial flow compressor is a major component in a gas turbine, with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade passage analysis, quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. A methodology is described for steady-state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, and tip clearance, as well as numerical issues such as turbulence model choice, advection model choice, and parallel processing performance, are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations of the flow features observed in the computational study are given. The total pressure rise versus mass flow rate was computed.

  7. 3W approach to the investigation, analysis, and prevention of human-error aircraft accidents.

    PubMed

    Ricketson, D S; Brown, W R; Graham, K N

    1980-09-01

    Human error is the largest cause of U.S. Army aircraft accidents. An approach to this problem is presented which is based on a model of the human-error accident. This 3W approach identifies what task error (TE) caused or contributed to the accident, what inadequacy (I) in the aviation system caused or allowed the TE to occur, and what remedial measure (R) is required to correct the I. There were 82 human-error accidents analyzed to identify TEIR information. Statistically important inadequacies (Is) were identified which could be remedied based on accident costs. Then, potentially cost-effective remedial actions were ranked on a cost-benefit totem pole. The totem pole was given to the aviation system manager as a management tool to assist in determining priorities for corrective actions. PMID:7417175

  8. Emergency drinking water treatment during source water pollution accidents in China: origin analysis, framework and technologies.

    PubMed

    Zhang, Xiao-Jian; Chen, Chao; Lin, Peng-Fei; Hou, Ai-Xin; Niu, Zhang-Bin; Wang, Jun

    2011-01-01

    China has suffered frequent source water contamination accidents in the past decade, which has resulted in severe consequences to the water supply of millions of residents. The origins of typical cases of contamination are discussed in this paper as well as the emergency response to these accidents. In general, excessive pursuit of rapid industrialization and the unreasonable location of factories are responsible for the increasing frequency of accidental pollution events. Moreover, insufficient attention to environmental protection and rudimentary emergency response capability has exacerbated the consequences of such accidents. These environmental accidents triggered or accelerated the promulgation of stricter environmental protection policy and the shift from economic development mode to a more sustainable direction, which should be regarded as the turning point of environmental protection in China. To guarantee water security, China is trying to establish a rapid and effective emergency response framework, build up the capability of early accident detection, and develop efficient technologies to remove contaminants from water. PMID:21133359

  9. ADAPT (Analysis of Dynamic Accident Progression Trees) Beta Version 0.9

    2010-01-07

    The purpose of the ADAPT code is to generate Dynamic Event Trees (DET) using a user specified simulator. ADAPT can utilize any simulation tool which meets a minimal set of requirements. ADAPT is based on the concept of DET which use explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system evolution along with stochastic modeling. When DET are used to model different aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. The DET branching occurs at user specified times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at different times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination) and user friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g. biasing towards the worse case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.
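
    A minimal sketch of the dynamic event tree idea described above: a scheduler advances each branch with a simulator until a branching condition fires, then spawns child branches carrying the outcome probabilities. The toy simulator, the branching rule, and the class names are stand-ins, not the ADAPT interfaces.

        # Toy dynamic event tree expansion with branching on a setpoint crossing.
        from dataclasses import dataclass, field
        from collections import deque

        @dataclass
        class Branch:
            time: float
            state: dict
            prob: float
            history: list = field(default_factory=list)

        def step_simulator(state, dt):
            """Placeholder physics step: simple heat-up of a temperature variable."""
            return {**state, "T": state["T"] + 5.0 * dt}

        def branching_rule(state):
            """Branch when temperature crosses a setpoint: valve opens (0.9) or fails (0.1)."""
            if state["T"] >= 500.0 and not state.get("valve_demanded"):
                return [("valve_opens", 0.9), ("valve_fails", 0.1)]
            return None

        def expand_det(initial_state, t_end, dt=1.0):
            done, queue = [], deque([Branch(0.0, initial_state, 1.0)])
            while queue:
                b = queue.popleft()
                while b.time < t_end:
                    b.state = step_simulator(b.state, dt)
                    b.time += dt
                    outcomes = branching_rule(b.state)
                    if outcomes:                      # split this branch into children
                        for name, p in outcomes:
                            child_state = {**b.state, "valve_demanded": True, "outcome": name}
                            queue.append(Branch(b.time, child_state, b.prob * p,
                                                b.history + [name]))
                        break
                else:
                    done.append(b)                    # branch ran to the end time
            return done

        for leaf in expand_det({"T": 400.0, "valve_demanded": False}, t_end=60.0):
            print(leaf.history, f"p={leaf.prob:.2f}", f"T_end={leaf.state['T']:.0f}")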

  10. MORECA: A computer code for simulating modular high-temperature gas-cooled reactor core heatup accidents

    SciTech Connect

    Ball, S.J. )

    1991-10-01

    The design features of the modular high-temperature gas-cooled reactor (MHTGR) have the potential to make it essentially invulnerable to damage from postulated core heatup accidents. This report describes the ORNL MORECA code, which was developed for analyzing postulated long-term core heatup scenarios for which the active cooling systems used to remove afterheat following the accidents can be assumed to be unavailable. Simulations of long-term loss-of-forced-convection accidents, both with and without depressurization of the primary coolant, have shown that maximum core temperatures stay below the point at which any significant fuel failures and fission product releases are expected. Sensitivity studies also have been done to determine the effects of errors in the predictions due both to uncertainties in the modeling and to the assumptions about operational parameters. MORECA models the US Department of Energy reference design of a standard MHTGR.

  11. Hazard categorization and accident analysis techniques for compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports

    SciTech Connect

    1992-12-31

    The purpose of this DOE Standard is to establish guidance for facility managers and Program Secretarial Officers (PSOs) and thereby help them to comply consistently and more efficiently with the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports. To this end, this guidance provides the following practical information: (1) The threshold quantities of radiological material inventory below which compliance with DOE Order 5480.23 is not required. (2) The level of effort to develop the program plan and schedule required in Section 9.b. (2) of the Order, and information for making a preliminary assessment of facility hazards. (3) A uniform methodology for hazard categorization under the Order. (4) Insight into the ''graded approach'' for SAR development, especially in hazard assessment and accident analysis techniques. Individual PSOs may develop additional guidance addressing safety requirements for facilities which fall below the threshold quantities specified in this document.

  12. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in a timely manner, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  13. Computer design and analysis of vacuum systems

    SciTech Connect

    Santeler, D.J.

    1987-07-01

    A computer program has been developed for an IBM compatible personal computer to assist in the design and analysis of vacuum systems. The program has a selection of 12 major schematics with several thousand minor variants incorporating diffusion, turbomolecular, cryogenic, ion, mechanical, and sorption pumps as well as circular tubes, bends, valves, traps, and purge gas connections. The gas throughput versus the inlet pressure of the pump is presented on a log-log graphical display. The conductance of each series component is sequentially added to the graph to obtain the net system behavior Q(P). The component conductances may be calculated either from the inlet area and the transmission probability or from the tube length and the diameter. The gas-flow calculations are valid for orifices, short tubes, and long tubes throughout the entire pressure range from molecular through viscous to choked and nonchoked exit flows. The roughing-pump and high-vacuum-pump characteristic curves are numerically integrated to provide a graphical presentation of the system pumpdown. Outgassing data for different materials is then combined to produce a graph of the net system "outgassing pressure." Computer routines are provided for differentiating a real pumpdown curve for system analysis. The computer program is included with the American Vacuum Society course, "Advanced Vacuum System Design and Analysis," or it may be purchased from Process Applications, Inc.
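
    A minimal sketch of the series-conductance bookkeeping behind a throughput-versus-pressure display such as Q(P): conductances in series combine reciprocally and then limit the pump speed seen at the chamber. The component values are examples, and real components have pressure-dependent conductances that this sketch ignores.

        # Combine series conductances with a pump speed to estimate net throughput Q(P).
        def series_conductance(conductances):
            """Net conductance of components in series: 1/C = sum(1/C_i)  [L/s]."""
            return 1.0 / sum(1.0 / c for c in conductances)

        def effective_speed(pump_speed, net_conductance):
            """Effective pumping speed at the chamber: 1/S_eff = 1/S_pump + 1/C."""
            return 1.0 / (1.0 / pump_speed + 1.0 / net_conductance)

        def throughput(pressure, pump_speed, conductances):
            """Q = S_eff * P  [Torr*L/s], assuming a pressure-independent pump speed."""
            return effective_speed(pump_speed, series_conductance(conductances)) * pressure

        # Example: 1000 L/s pump behind a tube (800 L/s), a valve (1500 L/s) and a trap (600 L/s)
        for P in (1e-6, 1e-5, 1e-4):                      # chamber pressures [Torr]
            print(P, throughput(P, 1000.0, [800.0, 1500.0, 600.0]))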

  14. Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

  15. Overview of the Aerothermodynamics Analysis Conducted in Support of the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.

    2004-01-01

    A graphic presentation of the aerothermodynamics analysis conducted in support of the STS-107 accident investigation. Investigation efforts were conducted as part of an integrated AATS team (Aero, Aerothermal, Thermal, Stress) directed by OVEWG. Graphics presented are: STS-107 Entry trajectory and timeline (1st off-nominal event to Post-LOS); Indications from OI telemetry data; Aero/aerothermo/thermal analysis process; Selected STS-107 side fuselage/OMS pod off-nominal temperatures; Leading edge structural subsystem; Relevant forensics evidence; External aerothermal environments; STS-107 Pre-entry EOM3 heating profile; Surface heating and temperatures; Orbiter wing leading edge damage survey; Internal aerothermal environments; Orbiter wing CAD model; Aerodynamic flight reconstruction; Chronology of aerodynamic/aerothermoydynamic contributions; Acreage TPS tile damage; Larger OML perturbations; Missing RCC panel(s); Localized damage to RCC panel/missing T-seal; RCC breach with flow ingestion; and Aero-aerothermal closure. NAIT served as the interface between the CAIB and NASA investigation teams; and CAIB requests for study were addressed.

  16. The role of mitochondrial proteomic analysis in radiological accidents and terrorism.

    PubMed

    Maguire, David; Zhang, Bingrong; Zhang, Amy; Zhang, Lurong; Okunieff, Paul

    2013-01-01

    In the wake of the 9/11 terrorist attacks and the recent Level 7 nuclear event at the Fukushima Daiichi plant, there has been heightened awareness of the possibility of radiological terrorism and accidents and the need for techniques to estimate radiation levels after such events. A number of approaches to monitoring radiation using biological markers have been published, including physical techniques, cytogenetic approaches, and direct, DNA-analysis approaches. Each approach has the potential to provide information that may be applied to the triage of an exposed population, but problems with development and application of devices or lengthy analyses limit their potential for widespread application. We present a post-irradiation observation with the potential for development into a rapid point-of-care device. Using simple mitochondrial proteomic analysis, we investigated irradiated and nonirradiated murine mitochondria and identified a protein mobility shift occurring at 2-3 Gy. We discuss the implications of this finding both in terms of possible mechanisms and potential applications in bio-radiation monitoring. PMID:22879026

  17. Analysis of offsite Emergency Planning Zones (EPZs) for the Rocky Flats Plant. Phase 3, Sitewide spectrum-of-accidents and bounding EPZ analysis

    SciTech Connect

    Petrocchi, A.J.; Zimmerman, G.A.

    1994-03-14

    During Phase 3 of the EPZ project, a sitewide analysis will be performed applying a spectrum-of-accidents approach to both radiological and nonradiological hazardous materials release scenarios. This analysis will include the MCA but will be wider in scope and will produce options for the State of Colorado for establishing a bounding EPZ that is intended to more comprehensively update the interim, preliminary EPZ developed in Phase 2. EG&G will propose use of a hazards assessment methodology that is consistent with the DOE Emergency Management Guide for Hazards Assessments and other methods required by DOE orders. This will include hazards, accident, safety, and risk analyses. Using this methodology, EG&G will develop technical analyses for a spectrum of accidents. The analyses will show the potential effects from the spectrum of accidents on the offsite population together with identification of offsite vulnerable zones and areas of concern. These analyses will incorporate state-of-the-art technology for accident analysis, atmospheric plume dispersion modeling, consequence analysis, and the application of these evaluations to the general public population at risk. The analyses will treat both radiological and nonradiological hazardous materials and mixtures of both released accidentally to the atmosphere. DOE/RFO will submit these results to the State of Colorado for the State's use in determining offsite emergency planning zones for the Rocky Flats Plant. In addition, the results will be used for internal Rocky Flats Plant emergency planning.

  18. Analysis of Surface Water Pollution Accidents in China: Characteristics and Lessons for Risk Management

    NASA Astrophysics Data System (ADS)

    Yao, Hong; Zhang, Tongzhu; Liu, Bo; Lu, Feng; Fang, Shurong; You, Zhen

    2016-04-01

    Understanding historical accidents is important for accident prevention and risk mitigation; however, there are no public databases of pollution accidents in China, and no detailed information regarding such incidents is readily available. Thus, 653 representative cases of surface water pollution accidents in China were identified and described as a function of time, location, materials involved, origin, and causes. The severity and other features of the accidents, the frequency and quantities of chemicals involved, the frequency and number of people poisoned, the frequency and number of people affected, the frequency and duration of pollution, and the frequency and length of the pollution zone were used to estimate the cumulative probabilities. The probabilities of occurrence of the various accident types, classified by origin and cause, were also summarized from these observations. The following conclusions can be drawn from these analyses: (1) There was a high proportion of accidents involving multi-district boundary regions and drinking water crises, indicating that more attention should be paid to environmental risk prevention and the mitigation of such incidents. (2) A high proportion of accidents originated from small-sized chemical plants, indicating that these types of enterprises should be considered during policy making. (3) The most common cause (49.8 % of the total) was intentional acts (illegal discharge); accordingly, efforts to increase environmental consciousness in China should be enhanced.

  19. Analysis of Surface Water Pollution Accidents in China: Characteristics and Lessons for Risk Management.

    PubMed

    Yao, Hong; Zhang, Tongzhu; Liu, Bo; Lu, Feng; Fang, Shurong; You, Zhen

    2016-04-01

    Understanding historical accidents is important for accident prevention and risk mitigation; however, there are no public databases of pollution accidents in China, and no detailed information regarding such incidents is readily available. Thus, 653 representative cases of surface water pollution accidents in China were identified and described as a function of time, location, materials involved, origin, and causes. The severity and other features of the accidents, the frequency and quantities of chemicals involved, the frequency and number of people poisoned, the frequency and number of people affected, the frequency and duration of pollution, and the frequency and length of the pollution zone were used to estimate the cumulative probabilities. The probabilities of occurrence of the various accident types, classified by origin and cause, were also summarized from these observations. The following conclusions can be drawn from these analyses: (1) There was a high proportion of accidents involving multi-district boundary regions and drinking water crises, indicating that more attention should be paid to environmental risk prevention and the mitigation of such incidents. (2) A high proportion of accidents originated from small-sized chemical plants, indicating that these types of enterprises should be considered during policy making. (3) The most common cause (49.8% of the total) was intentional acts (illegal discharge); accordingly, efforts to increase environmental consciousness in China should be enhanced. PMID:26739714

  20. An analysis of civil aviation propeller-to-person accidents: 1965-79.

    PubMed

    Collins, W E; Mastrullo, A R; Kirkham, W R; Taylor, D K; Grape, P M

    1982-05-01

    The interest of manufacturing, governmental, and safety personnel in using paint schemes on propeller and rotor blades is based on improving the visual conspicuity of those blades when they are rotating. While propeller and rotor paint schemes may serve to reduce the number of fatalities and injuries due to contact with a rotating blade, there is little information about the circumstances surrounding such accidents. Brief reports provided by the National Transportation Safety Board of all "propeller-to-person" accidents from 1965-79 were examined and analyzed in terms of airport lighting conditions, actions of pilots, actions of passengers and groundcrew, phase of flight operations, weather conditions, and others. Analyses based on 319 accidents showed a marked drop in the frequency of "propeller-to-person" accidents from 1975 through 1978. Several types of educational efforts directed toward pilots and groundcrew, both prior to and during that 4-year period, were examined as possible factors contributing to the accident rate decline. Accident patterns provide a basis for assessing the probable efficacy of various recommendations, including propeller conspicuity, for further reducing "propeller-to-person" accidents. PMID:7092754

  1. Computational strategies for tire monitoring and analysis

    NASA Technical Reports Server (NTRS)

    Danielson, Kent T.; Noor, Ahmed K.; Green, James S.

    1995-01-01

    Computational strategies are presented for the modeling and analysis of tires in contact with pavement. A procedure is introduced for simple and accurate determination of tire cross-sectional geometric characteristics from a digitally scanned image. Three new strategies for reducing the computational effort in the finite element solution of tire-pavement contact are also presented. These strategies take advantage of the observation that footprint loads do not usually stimulate a significant tire response away from the pavement contact region. The finite element strategies differ in their level of approximation and required amount of computer resources. The effectiveness of the strategies is demonstrated by numerical examples of frictionless and frictional contact of the space shuttle Orbiter nose-gear tire. Both an in-house research code and a commercial finite element code are used in the numerical studies.

  2. The effect of gamma-ray transport on afterheat calculations for accident analysis

    SciTech Connect

    Reyes, S.; Latkowski, J.F.; Sanz, J.

    2000-05-01

    Radioactive afterheat is an important source term for the release of radionuclides in fusion systems under accident conditions. Heat transfer calculations are used to determine time-temperature histories in regions of interest, but the true source term needs to be the effective afterheat, which considers the transport of penetrating gamma rays. Without consideration of photon transport, accident temperatures may be overestimated in some regions and underestimated in others. The importance of this effect is demonstrated for a simple, one-dimensional problem. The significance of this effect depends strongly on the accident scenario being analyzed.

  3. The Role of Materials Degradation and Analysis in the Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

    The efforts following the loss of the Space Shuttle Columbia included debris recovery, reconstruction, and analysis. The debris was subjected to myriad quantitative and semiquantitative chemical analysis techniques, ranging from examination via the scanning electron microscope (SEM) with energy dispersive spectrometer (EDS) to X-Ray diffraction (XRD) and electron probe micro-analysis (EPMA). The results from the work with the debris helped the investigators determine the location where a breach likely occurred in the leading edge of the left wing during lift off of the Orbiter from the Kennedy Space Center. Likewise, the information evidenced by the debris was also crucial in ascertaining the path of impinging plasma flow once it had breached the wing. After the Columbia Accident Investigation Board (CAIB) issued its findings, the major portion of the investigation was concluded. However, additional work remained to be done on many pieces of debris from portions of the Orbiter which were not directly related to the initial impact during ascent. This subsequent work was not only performed in the laboratory, but was also performed with portable equipment, including examination via portable X-Ray fluorescence (XRF) and Fourier transform infrared spectroscopy (FTIR). Likewise, acetate and silicon-rubber replicas of various fracture surfaces were obtained for later macroscopic and fractographic examination. This paper will detail the efforts and findings from the initial investigation, as well as present results obtained by the later examination and analysis of debris from the Orbiter including its windows, bulkhead structures, and other components which had not been examined during the primary investigation.

  4. AXAIR: A Computer Code for SAR Assessment of Plume-Exposure Doses from Potential Process-Accident Releases to Atmosphere

    SciTech Connect

    Pillinger, W.L.

    2001-05-17

    This report describes the AXAIR computer code which is available to terminal users for evaluating the doses to man from exposure to the atmospheric plume from postulated stack or building-vent releases at the Savannah River Plant. The emphasis herein is on documentation of the methodology only. The total-body doses evaluated are those that would be exceeded only 0.5 percent of the time based on worst-sector, worst-case meteorological probability analysis. The associated doses to other body organs are given in the dose breakdowns by radionuclide, body organ and pathway.

  5. Image Analysis Based on Soft Computing and Applied on Space Shuttle During the Liftoff Process

    NASA Technical Reports Server (NTRS)

    Dominquez, Jesus A.; Klinko, Steve J.

    2007-01-01

    Imaging techniques based on Soft Computing (SC) and developed at Kennedy Space Center (KSC) have been implemented on a variety of prototype applications related to the safe operation of the Space Shuttle during the liftoff process. These SC-based prototype applications include detection and tracking of moving Foreign Object Debris (FOD) during the Space Shuttle liftoff, visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad, and visual detection of distant birds approaching the Space Shuttle launch pad. This SC-based image analysis capability developed at KSC was also used to analyze images acquired during the accident of the Space Shuttle Columbia and estimate the trajectory and velocity of the foam that caused the accident.

  6. Computational analysis of forebody tangential slot blowing

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Agosta-Greenman, Roxana M.; Rizk, Yehia M.; Schiff, Lewis B.; Cummings, Russell M.

    1994-01-01

    An overview of the computational effort to analyze forebody tangential slot blowing is presented. Tangential slot blowing generates side force and yawing moment which may be used to control an aircraft flying at high angle of attack. Two different geometries are used in the analysis: (1) The High Alpha Research Vehicle; and (2) a generic chined forebody. Computations using the isolated F/A-18 forebody are obtained at full-scale wind tunnel test conditions for direct comparison with available experimental data. The effects of over- and under-blowing on force and moment production are analyzed. Time-accurate solutions using the isolated forebody are obtained to study the force onset timelag of tangential slot blowing. Computations using the generic chined forebody are obtained at experimental wind tunnel conditions, and the results compared with available experimental data. This computational analysis complements the experimental results and provides a detailed understanding of the effects of tangential slot blowing on the flow field about simple and complex geometries.

  7. Probabilistic structural analysis computer code (NESSUS)

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.

    1988-01-01

    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models especially for high performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structure Under Stress) was developed to serve as a primary computation tool for the characterization of the probabilistic structural response due to the stochastic environments by statistical description. The code consists of three major modules NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI is Fast Probability Integration method by which a cumulative distribution function or a probability density function is calculated.
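
    A minimal sketch of the decomposition idea attributed above to NESSUS/PRE: correlated random variables can be generated from uncorrelated standard normals through an eigen (modal) decomposition of their covariance matrix. The covariance matrix below is illustrative, not a turbopump-blade model.

        # Transform uncorrelated standard normals into correlated samples via eigendecomposition.
        import numpy as np

        cov = np.array([[1.0, 0.8, 0.5],
                        [0.8, 1.0, 0.8],
                        [0.5, 0.8, 1.0]])          # correlated input variables (illustrative)

        eigvals, eigvecs = np.linalg.eigh(cov)      # modal decomposition of the covariance

        rng = np.random.default_rng(0)
        z = rng.standard_normal((100_000, 3))       # independent (uncorrelated) standard normals
        x = z * np.sqrt(eigvals) @ eigvecs.T        # correlated samples with covariance ~ cov

        print(np.round(np.cov(x, rowvar=False), 2)) # should reproduce the target covariance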

  8. Statistical Data Analysis in the Computer Age

    NASA Astrophysics Data System (ADS)

    Efron, Bradley; Tibshirani, Robert

    1991-07-01

    Most of our familiar statistical methods, such as hypothesis testing, linear regression, analysis of variance, and maximum likelihood estimation, were designed to be implemented on mechanical calculators. Modern electronic computation has encouraged a host of new statistical methods that require fewer distributional assumptions than their predecessors and can be applied to more complicated statistical estimators. These methods allow the scientist to explore and describe data and draw valid statistical inferences without the usual concerns for mathematical tractability. This is possible because traditional methods of mathematical analysis are replaced by specially constructed computer algorithms. Mathematics has not disappeared from statistical theory. It is the main method for deciding which algorithms are correct and efficient tools for automating statistical inference.
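
    A minimal example of the kind of computer-intensive method the article discusses: a bootstrap percentile confidence interval for a median, a statistic with no convenient closed-form standard error. The data are synthetic.

        # Bootstrap percentile confidence interval for a sample median.
        import numpy as np

        rng = np.random.default_rng(42)
        data = rng.exponential(scale=2.0, size=50)        # a skewed, awkward sample

        boot_medians = np.array([
            np.median(rng.choice(data, size=data.size, replace=True))
            for _ in range(5000)                           # resample with replacement
        ])
        lo, hi = np.percentile(boot_medians, [2.5, 97.5])  # percentile interval
        print(f"median = {np.median(data):.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")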

  9. Lower head creep rupture failure analysis associated with alternative accident sequences of the Three Mile Island Unit 2

    SciTech Connect

    Sang Lung, Chan

    2004-07-01

    The objective of this lower head creep rupture analysis is to assess the current version of MELCOR 1.8.5-RG against SCDAP/RELAP5 MOD 3.3kz. The purpose of this assessment is to investigate the current MELCOR in-vessel core damage progression phenomena including the model for the formation of a molten pool. The model for stratified molten pool natural heat transfer will be included in the next MELCOR release. Presently, MELCOR excludes the gap heat-transfer model for the cooling associated with the narrow gap between the debris and the lower head vessel wall. All these phenomenological models are already treated in SCDAP/RELAP5 using the COUPLE code to model the heat transfer of the relocated debris with the lower head based on a two-dimensional finite-element method. The assessment should determine if current MELCOR capabilities adequately cover core degradation phenomena appropriate for the consolidated MELCOR code. Inclusion of these features should bring MELCOR much closer to a state of parity with SCDAP/RELAP5 and is a currently underway element in the MELCOR code consolidation effort. This assessment deals with the following analysis of the Three Mile Island Unit 2 (TMI-2) alternative accident sequences. The TMI-2 alternative accident sequence-1 includes the continuation of the base case of the TMI-2 accident with the Reactor Coolant Pumps (RCP) tripped and the High Pressure Injection System (HPIS) throttled after approximately 6000 s accident time, while in the TMI-2 alternative accident sequence-2, the reactor coolant pumps are tripped after 6000 s and the HPIS is activated after 12,012 s. The lower head temperature distributions calculated with SCDAP/RELAP5 are visualized and animated with the open source visualization freeware 'OpenDX'. (author)

  10. An analysis of the consequences of accidents involving shipments of multiple Type A radioactive material (RAM) packages

    SciTech Connect

    Finley, N.C.; McClure, J.D.; Reardon, P.C.; Wangler, M.

    1989-01-01

    When the results of the RADTRAN III calculations are compared with a normalized set of results, both for incident-free transport and vehicular accident cases, the calculated consequences in the current analysis are lower. Even for the High-Activity Shipment, the total expected population dose from either incident-free transport or vehicular accidents is small, and smaller than that estimated in USNRC 1977. The results of the simulation in which parameters were varied randomly and independently indicate that, even with the simultaneous occurrence of the least conservative value for each input parameter, the maximum total population dose from the High-Activity Shipment might be as high as 300 person-rem for a single shipment. The values for either of the other shipments (DOT Exemption or Common Carrier) would be significantly lower. The potential average individual radiation doses from accidents involving multiple Type A package shipments are comparable to the increase in the normal background radiation dose of 0.09 rem/person/year (90 mrem) that an individual would receive by moving from sea level to 5000 ft elevation. The maximum dose to an individual (one very near the accident scene) for the High-Activity Shipment would be approximately 0.3 rem (300 mrem) in a maximum severity accident. This is within the individual dose guidelines outlined by NCRP (0.5 rem). Even at the high levels postulated for multiple package shipments under DOT controlled exemptions, the potential risks to the public in terms of expected population dose in the current analysis are below those already found to be acceptable. 4 refs., 3 tabs.

  11. Analysis of labour accidents in tunnel construction and introduction of prevention measures.

    PubMed

    Kikkawa, Naotaka; Itoh, Kazuya; Hori, Tomohito; Toyosawa, Yasuo; Orense, Rolando P

    2015-01-01

    At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a subsequent decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction when compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents that possess the characteristics of a rock fall event at a work site. We also introduced accident prevention measures against rock fall events. PMID:26027707

  12. Source terms for analysis of accidents at a high level waste repository

    SciTech Connect

    Mubayi, V.; Davis, R.E.; Youngblood, R.

    1989-01-01

    This paper describes an approach to identifying source terms from possible accidents during the preclosure phase of a high-level nuclear waste repository. A review of the literature on repository safety analyses indicated that source term estimation is in a preliminary stage, largely based on judgement-based scoping analyses. The approach developed here was to partition the accident space into domains defined by certain threshold values of temperature and impact energy density which may arise in potential accidents and specify release fractions of various radionuclides, present in the waste form, in each domain. Along with a more quantitative understanding of accident phenomenology, this approach should help in achieving a clearer perspective on scenarios important to preclosure safety assessments of geologic repositories. 18 refs., 3 tabs.
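
    A minimal sketch of the domain-partitioning approach described above: classify a postulated accident by threshold values of temperature and impact energy density, then look up judgement-based release fractions per radionuclide for that domain. All thresholds, fractions, and inventories below are invented placeholders, not values from the repository analyses.

        # Classify an accident into a (thermal, mechanical) domain and build a source term.
        RELEASE_FRACTIONS = {
            # (thermal domain, mechanical domain): {nuclide: release fraction}
            ("low",  "low"):  {"Cs-137": 1e-6, "Sr-90": 1e-7},
            ("low",  "high"): {"Cs-137": 1e-4, "Sr-90": 1e-5},
            ("high", "low"):  {"Cs-137": 1e-3, "Sr-90": 1e-4},
            ("high", "high"): {"Cs-137": 1e-2, "Sr-90": 1e-3},
        }

        def classify(temperature_C, impact_energy_Jm3,
                     t_threshold=600.0, e_threshold=1.0e5):
            thermal = "high" if temperature_C >= t_threshold else "low"
            mechanical = "high" if impact_energy_Jm3 >= e_threshold else "low"
            return thermal, mechanical

        def source_term(inventory_Bq, temperature_C, impact_energy_Jm3):
            fractions = RELEASE_FRACTIONS[classify(temperature_C, impact_energy_Jm3)]
            return {nuc: inventory_Bq[nuc] * fractions.get(nuc, 0.0) for nuc in inventory_Bq}

        print(source_term({"Cs-137": 1e15, "Sr-90": 5e14},
                          temperature_C=750.0, impact_energy_Jm3=2.0e4))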

  13. Traffic Analysis and Road Accidents: A Case Study of Hyderabad using GIS

    NASA Astrophysics Data System (ADS)

    Bhagyaiah, M.; Shrinagesh, B.

    2014-06-01

    Globalization has impacted many developing countries across the world. India is one such country, which benefited the most. Increased economic activity raised the consumption levels of people across the country, creating scope for an increase in travel and transportation. The growth in the number of vehicles over the last 10 years has put a lot of pressure on the existing roads, ultimately resulting in road accidents. It is estimated that since 2001 two-wheeler vehicles have increased by 202 percent and four-wheeler vehicles by 286 percent, with no road expansion. Motor vehicle crashes are a common cause of death, disability and demand for emergency medical care. Globally, more than 1 million people die each year from traffic crashes and about 20-50 million are injured or permanently disabled. There has been an increasing trend in road accidents in Hyderabad over the past few years. GIS helps in locating accident hotspots and in analyzing the trend of road accidents in Hyderabad.

  14. Analysis of labour accidents in tunnel construction and introduction of prevention measures

    PubMed Central

    KIKKAWA, Naotaka; ITOH, Kazuya; HORI, Tomohito; TOYOSAWA, Yasuo; ORENSE, Rolando P.

    2015-01-01

    At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a subsequent decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction when compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents that possess the characteristics of a rock fall event at a work site. We also introduced accident prevention measures against rock fall events. PMID:26027707

  15. Sensitivity analysis of a ship accident at a deep-ocean site in the northwest Atlantic

    SciTech Connect

    Kaplan, M.F.

    1985-04-01

    This report presents the results of a sensitivity analysis for an HLW ship accident occurring in the Nares Abyssal Plain in the northwestern Atlantic. Waste form release rate, canister lifetime and sorption in the water column (partition coefficients) were varied. Also investigated was the relative importance of the dose from the food chain and from seaweed in the diet. Peak individual doses and integrated collective doses for populations were the units of comparison. In accordance with international guidelines on radiological protection, the comparisons of different options were carried out over ''all time''; the study uses a million-year time frame. Partition coefficients have the most pronounced effect on collective dose of the parameters studied. Variations in partition coefficients affect the shape of the collective dose curve over the entire time frame. Peak individual doses decrease markedly when the value for the sorption of americium is increased, but show no increase when less sorption is assumed. Waste form release rates and canister lifetimes affect collective doses only in periods prior to 20,000 years. Hence, comparisons of these options need not be carried out beyond 20,000 years. Waste form release rates below 10{sup {minus}3}/yr (nominal value) affect individual doses in a linear manner, i.e., an order-of-magnitude reduction in release rate leads to an order-of-magnitude reduction in peak individual dose. Little reduction in peak individual doses is seen with canister lifetimes extended beyond the nominal 100 years. 32 refs., 14 figs., 16 tabs.

  16. THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT

    SciTech Connect

    Gupta, N.

    2011-02-14

    Surplus plutonium-bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for long-term storage of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.
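
    The following is a minimal 1D transient-conduction sketch of the kind of comparison involved, using explicit finite differences, invented material properties, and a fixed fire boundary temperature; it is not the methodology or model of the paper:

        import numpy as np

        # Illustrative properties for an insulating overpack layer (not 9975 data)
        k, rho, cp = 0.2, 300.0, 1500.0        # W/m-K, kg/m^3, J/kg-K
        alpha = k / (rho * cp)                 # thermal diffusivity, m^2/s

        L, nx = 0.10, 51                       # layer thickness (m) and grid points
        dx = L / (nx - 1)
        dt = 0.4 * dx**2 / alpha               # satisfies the explicit stability limit (< 0.5)

        T_fire, T_init = 816.0, 38.0           # roughly 1500 F and 100 F, in Celsius

        def inner_surface_temperature(minutes):
            T = np.full(nx, T_init)
            for _ in range(int(minutes * 60 / dt)):
                T[0] = T_fire                              # fire-exposed outer surface
                T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
                T[-1] = T[-2]                              # adiabatic inner surface
            return T[-1]

        for minutes in (30, 86):                           # HAC fire vs facility fire durations
            print(f"{minutes:3d} min fire -> inner surface ~ {inner_surface_temperature(minutes):6.1f} C")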

  17. Identification of Behavior Based Safety by Using Traffic Light Analysis to Reduce Accidents

    NASA Astrophysics Data System (ADS)

    Mansur, A.; Nasution, M. I.

    2016-01-01

    This work presents the safety assessment of a case study and describes an important area within field production in the oil and gas industry, namely behavior based safety (BBS). The company set up a rigorous BBS intervention program that is implemented and deployed continually. In this case, observers were asked to hold discussions with workers during observation and to put to them a number of predetermined questions related to work behavior. Appraisal by Traffic Light Analysis (TLA), as one tool of risk assessment, was used to determine the estimated score of the BBS questionnaire. Standardization of the TLA appraisal in this study is based on Regulation of the Minister of Labor and Occupational Safety and Health No. PER.05/MEN/1996. The results show that some points fall below 84%, which places them in the yellow category and means they should be corrected immediately by the company to prevent existing unsafe behavior of workers. The application of BBS is expected to increase safety performance at work over time and to be effective in reducing accidents.
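
    A small sketch of the traffic-light scoring step; the 60% and 85% thresholds are assumptions for illustration (the abstract only states that scores below 84% fall into the yellow category):

        def traffic_light(score_percent: float) -> str:
            """Classify a BBS questionnaire score into a traffic-light category.
            Threshold values are assumptions used only for illustration."""
            if score_percent >= 85.0:
                return "green"        # acceptable, maintain current practice
            elif score_percent >= 60.0:
                return "yellow"       # needs prompt corrective action
            else:
                return "red"          # unacceptable, stop and fix immediately

        for s in (92.0, 78.5, 41.0):
            print(f"{s:5.1f}% -> {traffic_light(s)}")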

  18. Accident investigation

    NASA Technical Reports Server (NTRS)

    Laynor, William G. Bud

    1987-01-01

    The National Transportation Safety Board (NTSB) has identified wind shear as a cause or contributing factor in 15 accidents involving transport-category airplanes since 1970. Nine of these were nonfatal, but the other six accounted for 440 lives. Five of the fatal accidents and seven of the nonfatal accidents involved encounters with convective downbursts or microbursts. Of the other accidents, two nonfatal ones were encounters with a frontal-system shear, and one fatal accident was the result of a terrain-induced wind shear. These accidents are discussed with reference to helping pilots avoid wind shear or, where avoidance is impossible, to fly through it.

  19. Highway accident severities and the mixed logit model: an exploratory empirical analysis.

    PubMed

    Milton, John C; Shankar, Venky N; Mannering, Fred L

    2008-01-01

    Many transportation agencies use accident frequencies, and statistical models of accident frequencies, as a basis for prioritizing highway safety improvements. However, the use of accident severities in safety programming has often been limited to the locational assessment of accident fatalities, with little or no emphasis being placed on the full severity distribution of accidents (property damage only, possible injury, injury), which is needed to fully assess the benefits of competing safety-improvement projects. In this paper we demonstrate a modeling approach that can be used to better understand the injury-severity distributions of accidents on highway segments, and the effect that traffic, highway and weather characteristics have on these distributions. The approach we use allows for the possibility that estimated model parameters can vary randomly across roadway segments to account for unobserved effects potentially relating to roadway characteristics, environmental factors, and driver behavior. Using highway-injury data from Washington State, a mixed (random parameters) logit model is estimated. Estimation findings indicate that volume-related variables such as average daily traffic per lane, average daily truck traffic, truck percentage, interchanges per mile and weather effects such as snowfall are best modeled as random parameters, while roadway characteristics such as the number of horizontal curves, number of grade breaks per mile and pavement friction are best modeled as fixed parameters. Our results show that the mixed logit model has considerable promise as a methodological tool in highway safety programming. PMID:18215557
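
    The random-parameters idea can be sketched with a toy simulated-maximum-likelihood mixed logit for a binary severity indicator and a single normally distributed coefficient (illustrative only; the paper estimates a full multinomial mixed logit over the injury-severity categories):

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit  # logistic function

        rng = np.random.default_rng(3)
        n, R = 2000, 200                          # observations, simulation draws

        # Simulated data: x stands in for a standardized traffic variable whose
        # coefficient varies across roadway segments.
        x = rng.standard_normal(n)
        beta_i = 0.8 + 0.5 * rng.standard_normal(n)          # true mean 0.8, std 0.5
        y = rng.random(n) < expit(-0.3 + beta_i * x)          # injury (1) vs PDO (0)

        draws = rng.standard_normal(R)                        # shared standard-normal draws

        def neg_simulated_loglik(theta):
            b0, mu, log_sd = theta
            sd = np.exp(log_sd)
            # Choice probability averaged over draws of the random coefficient
            util = b0 + (mu + sd * draws)[None, :] * x[:, None]      # shape (n, R)
            p = np.clip(expit(util).mean(axis=1), 1e-10, 1 - 1e-10)
            return -np.sum(np.where(y, np.log(p), np.log(1 - p)))

        res = minimize(neg_simulated_loglik, x0=[0.0, 0.0, np.log(0.1)], method="Nelder-Mead")
        b0, mu, log_sd = res.x
        print(f"constant {b0:+.2f}, coefficient mean {mu:+.2f}, coefficient std {np.exp(log_sd):.2f}")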

  20. Biomechanical analysis of occupant kinematics in rollover motor vehicle accidents: dynamic spit test.

    PubMed

    Sances, Anthony; Kumaresan, Srirangam; Clarke, Richard; Herbst, Brian; Meyer, Steve

    2005-01-01

    A better understanding of occupant kinematics in rollover accidents helps to advance biomechanical knowledge and to enhance the safety features of motor vehicles. While many rollover accident simulation studies have adopted the static approach to delineate the occupant kinematics in rollover accidents, very few studies have attempted the dynamic approach. The present work was designed to study the biomechanics of restrained occupants during rollover accidents using the steady-state dynamic spit test and to address the importance of keeping the lap belt fastened. Experimental tests were conducted using a 50th-percentile Hybrid III anthropomorphic dummy in a vehicle. The vehicle was rotated at 180 degrees/second and the dummy was restrained using a standard three-point restraint system. The lap belt of the dummy was fastened either by using the cinching latch plate or by locking the retractor. Three configurations of shoulder belt harness were simulated: shoulder belt loose on chest with cinch plate, shoulder belt under the left arm and shoulder belt behind the chest. In all tests, the dummy stayed within the confinement of the vehicle, indicating that the securely fastened lap belt holds the dummy with dynamic movement of 3 1/2" to 4". The results show that occupant movement in rollover accidents is least affected by various shoulder harness positions with a securely fastened lap belt. The present study forms a first step in delineating the biomechanics of occupants in rollover accidents. PMID:15850090

  1. Nasal continuous positive airway pressure (nCPAP) treatment for obstructive sleep apnea, road traffic accidents and driving simulator performance: a meta-analysis.

    PubMed

    Antonopoulos, Constantine N; Sergentanis, Theodoros N; Daskalopoulou, Styliani S; Petridou, Eleni Th

    2011-10-01

    We used meta-analysis to synthesize current evidence regarding the effect of nasal continuous positive airway pressure (nCPAP) on road traffic accidents in patients with obstructive sleep apnea (OSA) as well as on their performance in driving simulator. The primary outcomes were real accidents, near miss accidents, and accident-related events in the driving simulator. Pooled odds ratios (ORs), incidence rate ratios (IRRs) and standardized mean differences (SMDs) were appropriately calculated through fixed or random effects models after assessing between-study heterogeneity. Furthermore, risk differences (RDs) and numbers needed to treat (NNTs) were estimated for real and near miss accidents. Meta-regression analysis was performed to examine the effect of moderator variables and publication bias was also evaluated. Ten studies on real accidents (1221 patients), five studies on near miss accidents (769 patients) and six studies on the performance in driving simulator (110 patients) were included. A statistically significant reduction in real accidents (OR=0.21, 95% CI=0.12-0.35, random effects model; IRR=0.45, 95% CI=0.34-0.59, fixed effects model) and near miss accidents (OR=0.09, 95% CI=0.04-0.21, random effects model; IRR=0.23, 95% CI=0.08-0.67, random effects model) was observed. Likewise, a significant reduction in accident-related events was observed in the driving simulator (SMD=-1.20, 95% CI=-1.75 to -0.64, random effects). The RD for real accidents was -0.22 (95% CI=-0.32 to -0.13, random effects), with NNT equal to five patients (95% CI=3-8), whereas for near miss accidents the RD was -0.47 (95% CI=-0.69 to -0.25, random effects), with NNT equal to two patients (95% CI=1-4). For near miss accidents, meta-regression analysis suggested that nCPAP seemed more effective among patients entering the studies with higher baseline accident rates. In conclusion, all three meta-analyses demonstrated a sizeable protective effect of nCPAP on road traffic accidents, both
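
    The inverse-variance pooling and NNT arithmetic used in such meta-analyses can be sketched as follows; the study-level counts are invented and are not the data of this meta-analysis:

        import numpy as np

        # Invented 2x2 counts per study: (events_treated, n_treated, events_control, n_control)
        studies = [(5, 100, 20, 100),
                   (3,  60, 12,  60),
                   (8, 150, 25, 150)]

        log_ors, weights, risk_diffs = [], [], []
        for a, n1, c, n0 in studies:
            b, d = n1 - a, n0 - c
            log_or = np.log((a * d) / (b * c))
            var = 1/a + 1/b + 1/c + 1/d                 # variance of the log odds ratio
            log_ors.append(log_or)
            weights.append(1 / var)                     # inverse-variance weight
            risk_diffs.append(a / n1 - c / n0)

        pooled_log_or = np.average(log_ors, weights=weights)    # fixed-effects pooling
        se = 1 / np.sqrt(np.sum(weights))
        ci = np.exp(pooled_log_or + np.array([-1.96, 1.96]) * se)

        rd = np.mean(risk_diffs)                        # crude pooled risk difference
        nnt = int(np.ceil(1 / abs(rd)))                 # number needed to treat

        print(f"pooled OR = {np.exp(pooled_log_or):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), NNT ~ {nnt}")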

  2. Preclosure radiological safety analysis for accident conditions of the potential Yucca Mountain Repository: Underground facilities; Yucca Mountain Site Characterization Project

    SciTech Connect

    Ma, C.W.; Sit, R.C.; Zavoshy, S.J.; Jardine, L.J.; Laub, T.W.

    1992-06-01

    This preliminary preclosure radiological safety analysis assesses the scenarios, probabilities, and potential radiological consequences associated with postulated accidents in the underground facility of the potential Yucca Mountain repository. The analysis follows a probabilistic-risk-assessment approach. Twenty-one event trees resulting in 129 accident scenarios are developed. Most of the scenarios have estimated annual probabilities ranging from 10{sup {minus}11}/yr to 10{sup {minus}5}/yr. The study identifies 33 scenarios that could result in offsite doses over 50 mrem and that have annual probabilities greater than 10{sup {minus}9}/yr. The largest offsite dose is calculated to be 220 mrem, which is less than the 500 mrem value used to define items important to safety in 10 CFR 60. The study does not include an estimate of uncertainties; therefore, conclusions or decisions made on the basis of this report should be made with caution.
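
    A one-screen sketch of the probability and dose screening quoted above, using the 10{sup {minus}9}/yr and 50 mrem thresholds from the abstract and invented scenario data:

        # Each scenario: (name, annual probability [1/yr], offsite dose [mrem]); values invented.
        scenarios = [
            ("drop of waste canister in ramp", 2.0e-6,  75.0),
            ("transporter collision and fire", 4.0e-8, 180.0),
            ("ventilation filter bypass",      6.0e-10, 90.0),   # screened out on probability
            ("canister breach, low release",   3.0e-5,  12.0),   # screened out on dose
        ]

        risk_significant = [s for s in scenarios if s[1] > 1.0e-9 and s[2] > 50.0]
        for name, p, dose in risk_significant:
            print(f"{name:32s} p = {p:.1e}/yr, dose = {dose:5.1f} mrem")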

  3. Modeling and analysis of the unprotected loss-of-flow accident in the Clinch River Breeder Reactor

    SciTech Connect

    Morris, E.E.; Dunn, F.E.; Simms, R.; Gruber, E.E.

    1985-01-01

    The influence of fission-gas-driven fuel compaction on the energetics resulting from a loss-of-flow accident was estimated with the aid of the SAS3D accident analysis code. The analysis was carried out as part of the Clinch River Breeder Reactor licensing process. The TREAT tests L6, L7, and R8 were analyzed to assist in the modeling of fuel motion and the effects of plenum fission-gas release on coolant and clad dynamics. Special, conservative modeling was introduced to evaluate the effect of fission-gas pressure on the motion of the upper fuel pin segment following disruption. For the nominal sodium-void worth, fission-gas-driven fuel compaction did not adversely affect the outcome of the transient. When uncertainties in the sodium-void worth were considered, however, it was found that if fuel compaction occurs, loss-of-flow driven transient overpower phenomenology could not be precluded.

  4. Aerodynamic analysis of Pegasus - Computations vs reality

    NASA Technical Reports Server (NTRS)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus was conducted without benefit of wind tunnel tests using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  5. Semiconductor Device Analysis on Personal Computers

    1993-02-08

    PC-1D models the internal operation of bipolar semiconductor devices by solving for the concentrations and quasi-one-dimensional flow of electrons and holes resulting from either electrical or optical excitation. PC-1D uses the same detailed physical models incorporated in mainframe computer programs, yet runs efficiently on personal computers. PC-1D was originally developed with DOE funding to analyze solar cells. That continues to be its primary mode of usage, with registered copies in regular use at more than 100 locations worldwide. The program has been successfully applied to the analysis of silicon, gallium-arsenide, and indium-phosphide solar cells. The program is also suitable for modeling bipolar transistors and diodes, including heterojunction devices. Its easy-to-use graphical interface makes it useful as a teaching tool as well.

  6. Meaningful statistical analysis of large computational clusters.

    SciTech Connect

    Gentile, Ann C.; Marzouk, Youssef M.; Brandt, James M.; Pebay, Philippe Pierre

    2005-07-01

    Effective monitoring of large computational clusters demands the analysis of a vast amount of raw data from a large number of machines. The fundamental interactions of the system are not, however, well-defined, making it difficult to draw meaningful conclusions from this data, even if one were able to efficiently handle and process it. In this paper we show that computational clusters, because they are comprised of a large number of identical machines, behave in a statistically meaningful fashion. We therefore can employ normal statistical methods to derive information about individual systems and their environment and to detect problems sooner than with traditional mechanisms. We discuss design details necessary to use these methods on a large system in a timely and low-impact fashion.

  7. Computational frameworks for discrete Gabor analysis

    NASA Astrophysics Data System (ADS)

    Strohmer, Thomas

    1997-10-01

    The Gabor transform yields a discrete representation of a signal in the phase space. Since the Gabor transform is non-orthogonal, efficient reconstruction of a signal from its phase space samples is not straightforward and involves the computation of the so- called dual Gabor function. We present a unifying approach to the derivation of numerical algorithms for discrete Gabor analysis, based on unitary matrix factorization. The factorization point of view is notably useful for the design of efficient numerical algorithms. This presentation is the first systematic account of its kind. In particular, it is shown that different algorithms for the computation of the dual window correspond to different factorizations of the frame operator. Simple number theoretic conditions on the time-frequency lattice parameters imply additional structural properties of the frame operator.
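
    In finite dimensions the canonical dual window can be computed by direct inversion of the frame operator, as in the sketch below (a brute-force illustration rather than the structured factorizations discussed in the paper; the lattice parameters and Gaussian window are arbitrary choices):

        import numpy as np

        L, a, b = 48, 4, 4                                # signal length and lattice parameters
        t = np.arange(L)
        g = np.exp(-0.5 * ((t - L / 2) / (L / 8)) ** 2)   # Gaussian window
        g /= np.linalg.norm(g)

        def gabor_atoms(window):
            """All time-frequency shifted copies of `window` on the (a, b) lattice,
            returned as the columns of an L x (L/a * L/b) matrix."""
            atoms = []
            for n in range(L // a):                       # time shifts by n*a
                shifted = np.roll(window, n * a)
                for m in range(L // b):                   # modulations by m*b
                    atoms.append(shifted * np.exp(2j * np.pi * m * b * t / L))
            return np.array(atoms).T

        M = gabor_atoms(g)
        S = M @ M.conj().T                                # frame operator
        gamma = np.linalg.solve(S, g.astype(complex))     # canonical dual window

        # Verify perfect reconstruction: x = sum_j <x, dual_j> atom_j
        D = gabor_atoms(gamma)
        x = np.random.default_rng(0).standard_normal(L)
        x_rec = M @ (D.conj().T @ x)
        print("relative reconstruction error:", np.linalg.norm(x - x_rec) / np.linalg.norm(x))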

  8. Retrospection of Chernobyl nuclear accident for decision analysis concerning remedial actions in Ukraine

    SciTech Connect

    Georgievskiy, Vladimir

    2007-07-01

    The efficacy of decisions concerning remedial actions is considered for situations in which off-site radiological monitoring in the early and (or) intermediate phases was absent or not informative. There are examples of such situations in the former Soviet Union where many people have been exposed: releases of radioactive materials from 'Krasnoyarsk-26' into the Enisey River, releases of radioactive materials from 'Chelyabinsk-65' (the Kyshtym accident), nuclear tests at the Semipalatinsk Test Site, the Chernobyl nuclear accident, etc. If monitoring in the early and (or) intermediate phases is absent, decisions concerning remedial actions are usually developed on the basis of permanent monitoring. However, decisions of this kind may be essentially erroneous. For these cases it is proposed to reconstruct the radiological data of the early and intermediate phases of a nuclear accident and to develop decisions concerning remedial actions on the basis of both the retrospective data and the permanent monitoring data. In this report the problem is considered using the example of the Chernobyl accident in Ukraine, where off-site radiological monitoring in the early and intermediate phases was unsatisfactory. In particular, pasture-cow-milk monitoring had not been carried out. All official decisions concerning dose estimations had been made on the basis of measurements of {sup 137}Cs in the body (40 measurements within 135 days and 55 measurements within 229 days after the Chernobyl accident). For the retrospection of radiological data of the Chernobyl accident, a dynamic model has been developed. This model has a structure similar to that of the Pathway model and the Farmland model. Parameters of the developed model have been identified for the agricultural conditions of Russia and Ukraine. By means of this model, the dynamics of 20 radionuclides in pathways and the dynamics of doses have been estimated for the early, intermediate and late phases of the Chernobyl accident. The main results are following
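
    A minimal sketch of the kind of dynamic pathway model described, tracking a single radionuclide from pasture grass into a cow's milk pool with first-order transfers; all rate constants except the I-131 decay constant are invented, not the identified parameters of the study:

        import numpy as np

        lam_decay   = np.log(2) / 8.02   # I-131 decay constant, 1/day
        lam_weather = 0.05               # weathering loss from grass, 1/day (assumed)
        intake      = 10.0               # cow's grass intake, kg/day (assumed)
        f_transfer  = 0.01               # fraction of daily intake reaching the milk pool (assumed)
        lam_cow     = 0.7                # biological removal from the milk pool, 1/day (assumed)

        dt = 0.1                         # time step, days
        steps_per_day = int(round(1.0 / dt))
        grass, milk_pool = 1000.0, 0.0   # Bq/kg on grass, Bq in the cow's milk pool

        for step in range(1, 60 * steps_per_day + 1):
            d_grass = -(lam_decay + lam_weather) * grass
            d_milk  = f_transfer * intake * grass - (lam_decay + lam_cow) * milk_pool
            grass     += d_grass * dt
            milk_pool += d_milk * dt
            if step % steps_per_day == 0 and step // steps_per_day in (1, 5, 10, 30, 60):
                day = step // steps_per_day
                print(f"day {day:2d}: grass {grass:8.1f} Bq/kg, milk pool {milk_pool:7.1f} Bq")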

  9. Testing and analysis of structural integrity of electrosleeved tubes under severe accident transients

    SciTech Connect

    Majumdar, S.

    1999-12-10

    The structural integrity of flawed steam generator tubing with Electrosleeves{trademark} under simulated severe accident transients was analyzed by analytical models that used available material properties data and results from high-temperature tests conducted on Electrosleeved tubes. The Electrosleeve material is almost pure Ni and derives its strength and other useful properties from its nanocrystalline microstructure, which is stable at reactor operating temperatures. However, it undergoes rapid grain growth, at the high temperatures expected during severe accidents, resulting in a loss of strength and a corresponding decrease in flow stress. The magnitude of this decrease depends on the time-temperature history during the accident. Failure tests were conducted at ANL and FTI on internally pressurized Electrosleeved tubes with 80% and 100% throughwall machined axial notches in the parent tubes that were subjected to simulated severe accident temperature transients. The test results, together with the analytical model, were used to estimate the unaged flow stress curve of the Electrosleeved material at high temperatures. Failure temperatures for Electrosleeved tubes with throughwall and part-throughwall axial cracks of various lengths in the parent tubes were calculated for a postulated severe accident transient.

  10. Accident analysis of large-scale technological disasters applied to an anaesthetic complication.

    PubMed

    Eagle, C J; Davies, J M; Reason, J

    1992-02-01

    The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of failures, both of which must be sought. The first group, active failures, consists of mistakes made by practitioners in the provision of care. The second group, latent failures, represents flaws in the administrative and productive system. The model emphasizes the search for latent failures and shows that prevention of active failures alone is insufficient to avoid further accidents if latent failures persist unchanged. These key features and the utility of this model are illustrated by application to a case of aspiration of gastric contents. While four active failures were recognized, an equal number of latent failures also became apparent. The identification of both types of failures permitted the formulation of recommendations to avoid further occurrences. Thus this model of accident causation can provide a useful mechanism to investigate and possibly prevent anaesthetic accidents. PMID:1544192

  11. Analysis of 121 fatal passenger car-adult pedestrian accidents in China.

    PubMed

    Zhao, Hui; Yin, Zhiyong; Yang, Guangyu; Che, Xingping; Xie, Jingru; Huang, Wei; Wang, Zhengguo

    2014-10-01

    To study the characteristics of fatal vehicle-pedestrian accidents in China, a team was established, and passenger car-pedestrian crash cases occurring between 2006 and 2011 in Beijing and Chongqing, China, were collected. A total of 121 fatal passenger car-adult pedestrian collisions were sampled and analyzed. The pedestrian injuries were scored according to the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). The demographic distributions of fatal pedestrian accidents differed from those of other pedestrian accidents. Among the victims, no significant discrepancy in the distribution of ISS and AIS in head, thorax, abdomen, and extremities by pedestrian age was found, while pedestrian behaviors prior to the crashes may affect the ISS. The distributions of AIS in head, thorax, and abdomen among the fatalities did not show any association with impact speeds or vehicle types, whereas there was a strong relationship between the ISS and impact speeds. Whether pedestrians died at the accident scene or not was not associated with the ISS or AIS. The present results may be useful for not only forensic experts but also vehicle safety researchers. More investigations regarding fatal pedestrian accidents need to be conducted in greater detail. PMID:25287805
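
    The ISS arithmetic referred to above follows the standard definition (sum of the squares of the highest AIS score in each of the three most severely injured body regions, with ISS fixed at 75 if any region scores AIS 6); a sketch with an invented injury pattern:

        def injury_severity_score(region_ais: dict) -> int:
            """Compute ISS from the highest AIS score per ISS body region."""
            scores = [s for s in region_ais.values() if s > 0]
            if any(s >= 6 for s in scores):
                return 75                       # maximal (unsurvivable) injury by convention
            top3 = sorted(scores, reverse=True)[:3]
            return sum(s * s for s in top3)

        # Invented example: pedestrian with head, thorax and lower-extremity injuries
        victim = {"head_neck": 4, "face": 1, "thorax": 3, "abdomen": 0,
                  "extremities": 3, "external": 1}
        print("ISS =", injury_severity_score(victim))   # 4^2 + 3^2 + 3^2 = 34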

  12. APT Blanket System Loss-of-Flow Accident (LOFA) Analysis Based on Initial Conceptual Design - Case 1: with Beam Shutdown and Active RHR

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report.

  13. APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

  14. Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models

    SciTech Connect

    S. A. Eide; D. M. Rasmuson; C. L. Atwood

    2008-09-01

    The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways NUREG-1829 information can be used to estimate LOCA frequencies were investigated and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.
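
    A common way initiating-event frequencies are estimated from operating experience is a Bayesian update of a Jeffreys prior with Poisson data, sketched below with invented event counts and exposure; it is not the specific method selection discussed in the paper:

        from scipy.stats import gamma

        # Invented operating experience: observed SLOCA events and reactor-critical-years
        events, exposure_rcy = 1, 2500.0

        # Jeffreys prior for a Poisson rate; the posterior is
        # Gamma(events + 0.5, scale = 1 / exposure)
        post = gamma(a=events + 0.5, scale=1.0 / exposure_rcy)

        mean = post.mean()
        p05, p95 = post.ppf([0.05, 0.95])
        print(f"frequency mean {mean:.2e}/rcy, 90% interval ({p05:.2e}, {p95:.2e})/rcy")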

  15. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
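
    An intentionally simplified hazard-curve calculation of the kind such a program performs, for a single source with a truncated exponential magnitude distribution and a generic attenuation relation (all coefficients are invented):

        import numpy as np
        from scipy.stats import norm

        # One point source: activity rate and truncated exponential (Gutenberg-Richter) magnitudes
        nu, b_val = 0.05, 1.0            # events/yr with M >= m_min, G-R b-value
        m_min, m_max, r_km = 5.0, 7.5, 30.0

        # Generic attenuation relation (invented coefficients):
        # ln(PGA[g]) = c0 + c1*M - c2*ln(R) + eps,  eps ~ N(0, sigma)
        c0, c1, c2, sigma = -3.5, 0.9, 1.2, 0.6

        mags = np.linspace(m_min, m_max, 200)
        dm = mags[1] - mags[0]
        beta = b_val * np.log(10.0)
        pdf = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

        def annual_rate_of_exceedance(pga_g):
            ln_median = c0 + c1 * mags - c2 * np.log(r_km)
            p_exceed = norm.sf((np.log(pga_g) - ln_median) / sigma)   # P(PGA > x | M, R)
            return nu * np.sum(p_exceed * pdf) * dm                   # integrate over magnitude

        for x in (0.05, 0.1, 0.2, 0.4):
            lam = annual_rate_of_exceedance(x)
            print(f"PGA > {x:.2f} g : {lam:.2e} /yr  (return period ~ {1.0 / lam:,.0f} yr)")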

  16. Analysis of hospitalization occurred due to motorcycles accidents in São Paulo city

    PubMed Central

    Gorios, Carlos; Armond, Jane de Eston; Rodrigues, Cintia Leci; Pernambuco, Henrique; Iporre, Ramiro Ortiz; Colombo-Souza, Patrícia

    2015-01-01

    OBJECTIVE: To characterize the motorcycle accidents that occurred in the city of São Paulo, SP, Brazil in 2013, with emphasis on information about hospital admissions from SIH/SUS. METHODS: This is a retrospective cross-sectional study. The study covered 5,597 motorcyclists injured in traffic accidents during 2013 in the city of São Paulo. A survey was conducted using secondary data from the Hospital Information System of the Unified Health System (SIH/SUS). RESULTS: In 2013, in the city of São Paulo there were 5,597 admissions of motorcyclists injured in traffic accidents, of which 89.8% were male. The admission diagnoses were leg fracture, femur fracture, and intracranial injury. CONCLUSION: This study confirms other preliminary studies on several points, among which stands out the higher prevalence of young adult males. Level of Evidence II, Retrospective Study. PMID:26327804

  17. A Comprehensive Analysis of the X-15 Flight 3-65 Accident

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Orr, Jeb S.; Barshi, Immanuel; Statler, Irving C.

    2014-01-01

    The November 15, 1967, loss of X-15 Flight 3-65-97 (hereafter referred to as Flight 3-65) was a unique incident in that it was the first and only aerospace flight accident involving loss of crew on a vehicle with an adaptive flight control system (AFCS). In addition, Flight 3-65 remains the only instance of a single-pilot departure from controlled flight of a manned entry vehicle in a hypersonic flight regime. To mitigate risk to emerging aerospace systems, the NASA Engineering and Safety Center (NESC) proposed a comprehensive review of this accident. The goal of the assessment was to resolve lingering questions regarding the failure modes of the aircraft systems (including the AFCS) and thoroughly analyze the interactions among the human agents and autonomous systems that contributed to the loss of the pilot and aircraft. This document contains the outcome of the accident review.

  18. Comparative analysis of social, demographic, and flight-related attributes between accident and nonaccident general aviation pilots.

    PubMed

    Urban, R F

    1984-04-01

    This investigation represents an exploratory examination of several differentiating social and demographic characteristics for a sample of calendar year 1978 Colorado-resident nonfatal accident-involved pilots and a random sample of nonaccident general aviation (i.e., nonairline) pilots. During 1979-1980, 80 currently active pilots were interviewed by the author, and information concerning the standard demographic variables, in addition to several social, psychological, and flying-related items, was obtained. The sample was generated from commercially available data files derived from U.S. Government records and consisted of 46 accident and 34 nonaccident pilots who resided within a 100-mi radius of Denver, east of the Rocky Mountains. Descriptively, the respondents represented a broad spectrum of general aviation, including corporate pilots, "crop dusters," builders of amateur experimental aircraft, and recreational fliers. Application of stepwise discriminant analysis revealed that the pilots' education, political orientation, birth order, percent of flying for business purposes, participation in nonflying aviation activities, number of years of flying experience, and an index of aviation procedural noncompliance yielded statistically significant results. Furthermore, utilization of the classification capability of discriminant analysis produced a mathematical function which correctly allocated 78.5% of the cases into the appropriate groups, thus contributing to a 56.5% proportionate reduction in error over a random effects model. No relationship was found between accident involvement and several indicators of social attachments, socioeconomic status, and a number of measures of flying exposure. PMID:6732683
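
    The discriminant-analysis step can be sketched with scikit-learn's LinearDiscriminantAnalysis on made-up pilot attributes (a stand-in for the stepwise procedure of the study; the variables, data, and group labels are invented):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(7)
        n = 80

        # Invented predictors: years of flying experience, % business flying,
        # years of education, and a procedural-noncompliance index.
        X = np.column_stack([
            rng.normal(12, 6, n),
            rng.uniform(0, 100, n),
            rng.normal(14, 2, n),
            rng.normal(0, 1, n),
        ])
        # Invented group labels: 1 = accident-involved, 0 = non-accident
        y = (0.4 * X[:, 3] - 0.05 * X[:, 0] + rng.normal(0, 1, n) > 0).astype(int)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("coefficients:", np.round(lda.coef_, 3))
        print("classification accuracy on the sample:", round(lda.score(X, y), 3))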

  19. Environmental risk management for radiological accidents: integrating risk assessment and decision analysis for remediation at different spatial scales.

    PubMed

    Yatsalo, Boris; Sullivan, Terrence; Didenko, Vladimir; Linkov, Igor

    2011-07-01

    The Tohoku earthquake and subsequent tsunami in March 2011 caused a loss of power at the Fukushima Daiichi nuclear power plant in Japan and led to the release of radioactive materials into the environment. Although the full extent of the contamination is not currently known, the highly complex nature of the environmental contamination (radionuclides in water, soil, and agricultural produce) typical of nuclear accidents requires a detailed geospatial analysis of information with the ability to extrapolate across different scales with applications to risk assessment models and decision making support. This article briefly summarizes the approach used to inform risk-based land management and remediation decision making after the Chernobyl, Soviet Ukraine, accident in 1986. PMID:21608109

  20. Computer network environment planning and analysis

    NASA Technical Reports Server (NTRS)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  1. Analysis of loss-of-coolant and loss-of-flow accidents in the first wall cooling system of NET/ITER

    NASA Astrophysics Data System (ADS)

    Komen, E. M. J.; Koning, H.

    1994-03-01

    This paper presents the thermal-hydraulic analysis of potential accidents in the first wall cooling system of the Next European Torus or the International Thermonuclear Experimental Reactor. Three ex-vessel loss-of-coolant accidents, two in-vessel loss-of-coolant accidents, and three loss-of-flow accidents have been analyzed using the thermal-hydraulic system analysis code RELAP5/MOD3. The analyses deal with the transient thermal-hydraulic behavior inside the cooling systems and the temperature development inside the nuclear components during these accidents. The analysis of the different accident scenarios has been performed without operation of emergency cooling systems. The results of the analyses indicate that a loss of forced coolant flow through the first wall rapidly causes dryout in the first wall cooling pipes. Following dryout, melting in the first wall starts within about 130 s in case of ongoing plasma burning. In case of large break LOCAs and ongoing plasma burning, melting in the first wall starts about 90 s after accident initiation.

  2. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.

  3. What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?

    PubMed

    Tivesten, Emma; Wiberg, Henrik

    2013-03-01

    Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source, it should be possible to combine the available information sets so that data from one source compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey and the second was insurance claims documents consisting predominantly of insurance claims completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provides more reliable and detailed pre-crash information than survey variables alone. However, driving-related distraction appears to be more difficult to capture. In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such

  4. Good relationships between computational image analysis and radiological physics

    SciTech Connect

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  5. Good relationships between computational image analysis and radiological physics

    NASA Astrophysics Data System (ADS)

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-01

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  6. Accident analysis and control options in support of the sludge water system safety analysis

    SciTech Connect

    HEY, B.E.

    2003-01-16

    A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.

  7. Severe Accident Sequence Analysis Program: Anticipated transient without scram simulations for Browns Ferry Nuclear Plant Unit 1

    SciTech Connect

    Dallman, R J; Gottula, R C; Holcomb, E E; Jouse, W C; Wagoner, S R; Wheatley, P D

    1987-05-01

    An analysis of five anticipated transients without scram (ATWS) was conducted at the Idaho National Engineering Laboratory (INEL). The five detailed deterministic simulations of postulated ATWS sequences were initiated from a main steamline isolation valve (MSIV) closure. The subject of the analysis was the Browns Ferry Nuclear Plant Unit 1, a boiling water reactor (BWR) of the BWR/4 product line with a Mark I containment. The simulations yielded insights to the possible consequences resulting from a MSIV closure ATWS. An evaluation of the effects of plant safety systems and operator actions on accident progression and mitigation is presented.

  8. Launch Vehicle Fire Accident Preliminary Analysis of a Liquid-Metal Cooled Thermionic Nuclear Reactor: TOPAZ-II

    NASA Astrophysics Data System (ADS)

    Hu, G.; Zhao, S.; Ruan, K.

    2012-01-01

    In this paper, a launch vehicle propellant fire accident analysis of the TOPAZ-II reactor has been performed with a thermionic reactor core analysis code, TATRHG(A), developed by the author. When a rocket explodes on a launch pad, its payload, TOPAZ-II, can be subjected to a severe thermal environment from the resulting fireball. The extreme temperatures associated with propellant fires can create a destructive environment in or near the fireball. Different kinds of propellant, liquid and solid, which lead to different fire temperatures, are considered. Preliminary analysis shows that solid propellant fires can melt the whole toxic beryllium radial reflector.

  9. Analysis of general aviation accidents during operations under instrument flight rules

    NASA Technical Reports Server (NTRS)

    Bennett, C. T.; Schwirzke, Martin; Harm, C.

    1990-01-01

    A report is presented to describe some of the errors that pilots make during flight under IFR. The data indicate that there is less risk during the approach and landing phase of IFR flights, as compared to VFR operations. Single-pilot IFR accident rates continue to be higher than two-pilot IFR incident rates, reflecting the high work load of IFR operations.

  10. Traffic accident in Cuiabá-MT: an analysis through the data mining technology.

    PubMed

    Galvão, Noemi Dreyer; de Fátima Marin, Heimar

    2010-01-01

    Road traffic accidents (ATT) are non-intentional events of considerable magnitude worldwide, mainly in urban centers. This article aims to analyze data related to victims of ATT recorded by the Justice Secretariat and Public Security (SEJUSP) and in hospital morbidity and mortality records in the city of Cuiabá-MT during 2006, using data mining technology. An observational, retrospective and exploratory study of the secondary databases was carried out. The three selected databases were linked using the probabilistic method, through the free software RecLink. One hundred and thirty-nine (139) real pairs of ATT victims were obtained. Data mining technology was then applied to this linked database with the software WEKA using the Apriori algorithm. The process generated the 10 best rules, six of which were retained according to the established parameters, indicating useful and comprehensible knowledge for characterizing the accident victims in Cuiabá. Finally, the association rules revealed peculiarities of the road traffic accident victims in Cuiabá and highlight the need for prevention measures targeting collision accidents involving males. PMID:20841739
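
    The association-rule step can be illustrated with the exhaustive search below, which computes the same frequent itemsets and rules that Apriori would find on this tiny invented example (Apriori adds candidate pruning so that the search scales; the study itself used WEKA, and the attribute names here are placeholders):

        from itertools import combinations

        # Invented accident-victim records as sets of attributes
        records = [
            {"male", "motorcycle", "collision", "hospitalized"},
            {"male", "collision"},
            {"female", "motorcycle", "hospitalized"},
            {"male", "motorcycle", "collision", "hospitalized"},
            {"male", "motorcycle", "collision"},
        ]
        min_support, min_confidence = 0.4, 0.7
        n = len(records)

        def support(itemset):
            return sum(itemset <= r for r in records) / n

        items = sorted(set().union(*records))
        frequent = [frozenset(c) for size in (1, 2, 3)
                    for c in combinations(items, size) if support(frozenset(c)) >= min_support]

        # Generate rules A -> B from each frequent itemset of size >= 2
        for itemset in (s for s in frequent if len(s) >= 2):
            for k in range(1, len(itemset)):
                for antecedent in map(frozenset, combinations(itemset, k)):
                    consequent = itemset - antecedent
                    conf = support(itemset) / support(antecedent)
                    if conf >= min_confidence:
                        print(f"{set(antecedent)} -> {set(consequent)} "
                              f"(support {support(itemset):.2f}, confidence {conf:.2f})")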

  11. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.

    PubMed

    Goldhaber, M K; Staub, S L; Tokuhata, G K

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357
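
    The life-table (actuarial) calculation behind such estimates can be sketched as follows, with invented counts of pregnancies entering each gestational-week interval, losses, and withdrawals; these are not the TMI registry data:

        # Invented weekly intervals from 4 to 16 completed weeks of gestation:
        # (weeks, number entering the interval, losses, withdrawn/censored)
        intervals = [
            ("4-7",   220, 12, 5),
            ("8-11",  203, 10, 6),
            ("12-15", 187,  6, 4),
        ]

        surviving = 1.0
        for label, n_enter, losses, withdrawn in intervals:
            at_risk = n_enter - withdrawn / 2.0          # actuarial adjustment for censoring
            q = losses / at_risk                          # conditional probability of loss
            surviving *= (1.0 - q)
            print(f"weeks {label:6s}: conditional loss {q:.3f}, cumulative survival {surviving:.3f}")

        print(f"estimated incidence of spontaneous abortion by 16 weeks: {1 - surviving:.1%}")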

  12. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis

    SciTech Connect

    Goldhaber, M.K.; Staub, S.L.; Tokuhata, G.K.

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss.

  13. Incorporation of phenomenological uncertainties in probabilistic safety analysis - application to LMFBR core disruptive accident energetics

    SciTech Connect

    Najafi, B; Theofanous, T G; Rumble, E T; Atefi, B

    1984-08-01

    This report describes a method for quantifying the frequency and consequence uncertainty distributions associated with core disruptive accidents (CDAs). The method was developed to estimate the frequency and magnitude of energy impacting the reactor vessel head of the Clinch River Breeder Reactor Plant (CRBRP) given the occurrence of hypothetical CDAs. The methodology is illustrated using the CRBR example.

  14. Risk Analysis for Public Consumption: Media Coverage of the Ginna Nuclear Reactor Accident.

    ERIC Educational Resources Information Center

    Dunwoody, Sharon; And Others

    Researchers have determined that the lay public makes risk judgments in ways that are very different from those advocated by scientists. Noting that these differences have caused considerable concern among those who promote and regulate health and safety, a study examined media coverage of the accident at the Robert E. Ginna nuclear power plant…

  15. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.

    PubMed Central

    Goldhaber, M K; Staub, S L; Tokuhata, G K

    1983-01-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357

  16. A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C, M.

    2007-01-01

    Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings presented over ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure and other high level classifications in longitudinal studies of accident reports. Our results suggest that the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.

  17. Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes

    SciTech Connect

    Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

    1993-12-01

    As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked by the Advanced Neutron Source dynamic model (ANSDM) and PRSDYN models. RELAP5 is a one-dimensional, two-phase transient code, developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations were included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant-accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

  18. Fukushima Daiichi Unit 1 Accident Progression Uncertainty Analysis and Implications for Decommissioning of Fukushima Reactors - Volume I.

    SciTech Connect

    Gauntt, Randall O.; Mattie, Patrick D.

    2016-01-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progressions, as postulated by the limited plant data. This work focused on evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor considering state of knowledge uncertainties associated with MELCOR modeling of core damage progression and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, core damage progression variability inherent in MELCOR modeling numerics is investigated.

  19. Distributed Design and Analysis of Computer Experiments

    SciTech Connect

    Doak, Justin

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria. Or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis of the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation of an
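
    As a rough illustration of this sampling-then-analysis workflow (not of DDACE itself), the sketch below draws a Latin hypercube design over two uncertain inputs, runs a hypothetical stand-in for a user's application code, and fits a simple quadratic response surface. The toy_model function, variable names, and ranges are all assumptions made for the example.

```python
# Sketch of a DDACE-style workflow: Latin hypercube sampling of uncertain
# inputs, running a (here: toy) application code, and analyzing the
# input/output relationships.  toy_model and the parameter ranges are
# hypothetical stand-ins for a user's application code.
import numpy as np
from scipy.stats import qmc

def toy_model(temperature, conductivity):
    """Hypothetical application code: returns a scalar output of interest."""
    return 0.5 * temperature + 30.0 * np.sqrt(conductivity)

# 1. Generate a Latin hypercube design over the suspected input ranges.
sampler = qmc.LatinHypercube(d=2, seed=42)
unit_samples = sampler.random(n=50)
lo, hi = [300.0, 0.1], [600.0, 2.0]          # temperature [K], conductivity [W/m-K]
X = qmc.scale(unit_samples, lo, hi)

# 2. Run the application code at each design point.
y = np.array([toy_model(t, k) for t, k in X])

# 3. Analyze input/output relationships: correlations and a quadratic surface.
print("input/output correlation matrix:\n", np.corrcoef(np.c_[X, y].T))
A = np.c_[np.ones(len(X)), X, X**2, X[:, [0]] * X[:, [1]]]   # quadratic basis
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("response-surface coefficients:", coef)
```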

  1. A system analysis computer model for the High Flux Isotope Reactor (HFIRSYS Version 1)

    SciTech Connect

    Sozer, M.C.

    1992-04-01

    A system transient analysis computer model (HFIRSYS) has been developed for analysis of small break loss of coolant accidents (LOCA) and operational transients. The computer model is based on the Advanced Continuous Simulation Language (ACSL), which produces the FORTRAN code automatically, provides integration routines such as Gear's stiff algorithm, and offers users numerous practical tools for generating eigenvalues, debug outputs, graphics capabilities, etc. The HFIRSYS computer code is structured in the form of the Modular Modeling System (MMS) code. Component modules from MMS and in-house developed modules were both used to configure HFIRSYS. A description of the High Flux Isotope Reactor, the theoretical bases for the modeled components of the system, and the verification and validation efforts are reported. The computer model performs satisfactorily, including cases in which the effect of structural elasticity on the system pressure is significant; however, its capabilities are limited to single-phase flow. Because of the modular structure, new component models from the Modular Modeling System can easily be added to HFIRSYS for analyzing their effects on the system's behavior. The computer model is a versatile tool for studying various system transients. The intent of this report is not to be a user's manual, but to provide the theoretical bases and basic information about the computer model and the reactor.
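
    The Gear-family (BDF) stiff integration mentioned above can be illustrated with a minimal sketch: a hypothetical two-node thermal model with widely separated time constants integrated by SciPy's BDF method as a stand-in for ACSL's Gear integrator. The equations and coefficients below are illustrative assumptions, not the HFIRSYS model.

```python
# Minimal sketch of integrating a stiff two-node thermal model with a
# BDF (Gear-family) method, analogous in spirit to the Gear integrator
# that ACSL provides.  The model and coefficients are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, T):
    """Two coupled temperature nodes with very different time constants."""
    T_fuel, T_coolant = T
    return [
        -1000.0 * (T_fuel - T_coolant) + 50.0,            # fast node (stiff direction)
        0.5 * (T_fuel - T_coolant) - 0.1 * (T_coolant - 300.0),
    ]

sol = solve_ivp(rhs, t_span=(0.0, 100.0), y0=[350.0, 300.0],
                method="BDF", rtol=1e-6, atol=1e-8)
print("final temperatures:", sol.y[:, -1])
```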

  2. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculation carries uncertainties. Typical sources of uncertainty are material properties, production and/or assembly inaccuracies in the geometry, and the environment in which the structure is located. The paper is focused on methods for calculating failure probabilities in structural failure and reliability analysis, with special attention to the newly developed Direct Optimized Probabilistic Calculation (DOProC) method, which is highly efficient in terms of calculation time and solution accuracy. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in the software applications mentioned in the paper, and has been used several times in probabilistic tasks and probabilistic reliability assessments.
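
    The core idea of computing a failure probability by numerical integration rather than by simulation can be sketched for a single resistance/load pair as follows. This is a deliberately simplified illustration with hypothetical distribution parameters, not the optimized discretization that DOProC itself uses.

```python
# Much-simplified illustration of computing P_f = P(R - S < 0) by direct
# numerical integration for an independent resistance R and load effect S.
# Distribution parameters are hypothetical.
import numpy as np
from scipy import stats

R = stats.norm(loc=250.0, scale=20.0)   # resistance [kN]
S = stats.norm(loc=180.0, scale=30.0)   # load effect [kN]

# Discretize S and accumulate P(R < s) * f_S(s) * ds over the load range.
s = np.linspace(S.mean() - 8 * S.std(), S.mean() + 8 * S.std(), 20001)
pf = np.trapz(R.cdf(s) * S.pdf(s), s)

print(f"failure probability  P_f ~ {pf:.3e}")
# For two normals the exact answer is Phi(-(mu_R-mu_S)/sqrt(sig_R^2+sig_S^2)).
print(f"closed-form check    P_f = {stats.norm.cdf(-(250 - 180) / np.hypot(20, 30)):.3e}")
```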

  3. Computed tomographic analysis of meteorite inclusions

    NASA Technical Reports Server (NTRS)

    Arnold, J. R.; Testa, J. P., Jr.; Friedman, P. J.; Kambic, G. X.

    1983-01-01

    The feasibility of obtaining nondestructively a cross-sectional display of very dense heterogeneous rocky specimens, whether lunar, terrestrial or meteoritic, by using a fourth generation computed tomographic (CT) scanner, with modifications to the software only, is discussed. A description of the scanner, and of the experimental and analytical procedures is given. Using this technique, the interior of heterogeneous materials such as Allende can be probed nondestructively. The regions of material with high and low atomic numbers are displayed quickly; the object can then be cut to obtain for analysis just the areas of interest. A comparison of this technique with conventional industrial and medical techniques is made in terms of image resolution and density distribution display precision.

  4. Computational based functional analysis of Bacillus phytases.

    PubMed

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

    Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate, digesting the otherwise indigestible phytate fraction present in seeds and grains and thereby providing digestible phosphorus, calcium, and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals so that the bioavailability of phytic acid-bound phosphate increases, ultimately enhancing the nutritional value of diets. Bacillus phytase is well suited for use in animal feed because of its pH optimum and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens and to explore their physico-chemical properties using various bio-computational tools. All proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917
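
    The kind of physico-chemical characterization described here can be sketched with Biopython's ProtParam module. The short amino-acid sequence below is a hypothetical placeholder rather than an actual Bacillus phytase sequence, so the printed values are purely illustrative.

```python
# Minimal sketch of in silico physico-chemical characterization using
# Biopython's ProtParam module.  The sequence is a short hypothetical
# placeholder, not an actual Bacillus phytase.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

seq = "MKKLLAVATAAALLAGCSSDDKTEEAKPAVVDFHNDLRAHGYEVTAVE"   # hypothetical
prot = ProteinAnalysis(seq)

print(f"molecular weight : {prot.molecular_weight():.1f} Da")
print(f"isoelectric point: {prot.isoelectric_point():.2f}")
print(f"instability index: {prot.instability_index():.2f}  (<40 suggests stable)")
print(f"GRAVY            : {prot.gravy():.3f}")
```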

  5. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

  6. Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

    2014-05-01

    Science and Education on June, 2011. So far, more than 500 samples have been measured and the I-129 deposition amounts determined by AMS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. The measurement error from AMS is less than 5%, typically 3%. The overall uncertainty is estimated to be less than 30%, including the uncertainty of the nominal value of the standard reference material used, that of the I-129/I-131 ratio estimation, that of the "representativeness" of the analyzed sample for its region, etc. The isotopic ratio I-129/I-131 from the reactor was estimated to be 22.3 ± 6.3 as of March 11, 2011 [3], from a series of samples collected by a group from The University of Tokyo on April 20, 2011, for which the I-131 was determined by gamma-ray spectrometry with good precision. Complementarily, we investigated the depth profile in soil of the accident-derived I-129 and its migration speed after deposition, and found that more than 90% of the I-129 was concentrated within the top 5 cm layer and that the downward migration speed was less than 1 cm/yr [4]. From the set of I-129 data, the corresponding I-131 values were calculated, and the distribution map is being constructed. Various fine structures of the distribution have come into view. [1] Y. Nikiforov and D. R. Gnepp, 1994, Cancer, Vol. 47, pp. 748-766. [2] T. Straume, et al., 1996, Health Physics, Vol. 71, pp. 733-740. [3] Y. Miyake, H. Matsuzaki et al., 2012, Geochem. J., Vol. 46, pp. 327-333. [4] M. Honda, H. Matsuzaki et al., under submission.

  7. Loss of DHR sequences at Browns Ferry Unit One - accident-sequence analysis

    SciTech Connect

    Cook, D.H.; Grene, S.R.; Harrington, R.M.; Hodge, S.A.

    1983-05-01

    This study describes the predicted response of Unit One at the Browns Ferry Nuclear Plant to a postulated loss of decay heat removal (DHR) capability following scram from full power with the power conversion system unavailable. In accident sequences without DHR capability, the residual heat removal (RHR) system functions of pressure suppression pool cooling and reactor vessel shutdown cooling are unavailable. Consequently, all decay heat energy is stored in the pressure suppression pool with a concomitant increase in pool temperature and primary containment pressure. With the assumption that DHR capability is not regained during the lengthy course of this accident sequence, the containment ultimately fails by overpressurization. Although unlikely, this catastrophic failure might lead to loss of the ability to inject cooling water into the reactor vessel, causing subsequent core uncovery and meltdown. The timing of these events and the effective mitigating actions that might be taken by the operator are discussed in this report.

  8. 3D analysis of the reactivity insertion accident in VVER-1000

    SciTech Connect

    Abdullayev, A. M.; Zhukov, A. I.; Slyeptsov, S. M.

    2012-07-01

    Fuel parameters such as peak enthalpy and temperature during a rod ejection accident are calculated. The calculations are performed with the 3D neutron kinetics code NESTLE and the 3D thermal-hydraulic code VIPRE-W. Both hot zero power and hot full power cases were studied for an equilibrium cycle with Westinghouse hex fuel in a VVER-1000. It is shown that the use of the 3D methodology can significantly increase safety margins with respect to current criteria and help meet future criteria. (authors)

  9. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  10. SiC MODIFICATIONS TO MELCOR FOR SEVERE ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    Brad J. Merrill; Shannon M Bragg-Sitton

    2013-09-01

    The Department of Energy (DOE) Office of Nuclear Energy (NE) Light Water Reactor (LWR) Sustainability Program encompasses strategic research focused on improving reactor core economics and safety margins through the development of an advanced fuel cladding system. The Fuels Pathway within this program focuses on fuel system components outside of the fuel pellet, allowing for alteration of the existing zirconium-based clad system through coatings, addition of ceramic sleeves, or complete replacement (e.g., fully ceramic cladding). The DOE-NE Fuel Cycle Research & Development (FCRD) Advanced Fuels Campaign (AFC) is also conducting research on materials for advanced, accident tolerant fuels and cladding for application in operating LWRs. To aid in this assessment, a silicon carbide (SiC) version of the MELCOR code was developed by substituting SiC in place of Zircaloy in MELCOR's reactor core oxidation and material property routines. The purpose of this development effort is to provide a numerical capability for estimating the safety advantages of replacing Zr-alloy components in LWRs with SiC components. This modified version of the MELCOR code was applied to the Three Mile Island (TMI-2) plant accident. While the results are considered preliminary, SiC cladding showed a dramatic safety advantage over Zircaloy cladding during this accident.

  11. Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II

    NASA Astrophysics Data System (ADS)

    Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

    2014-03-01

    The present part of the publication (Part II) deals with the long-range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear Test Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131, and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq emitted during the explosions of units 1, 2 and 3. The estimated total source term is compared with a core inventory of about 8 × 10^18 Bq at the time of reactor shutdown. This result suggests that at least 80% of the core inventory has been released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. By neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) could be estimated to be 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10% of the Chernobyl accident releases for I-131 and Cs-137.

  12. An analysis of thermionic space nuclear reactor power system: I. Effect of disassembling radial reflector, following a reactivity initiated accident

    SciTech Connect

    El-Genk, M.S.; Paramonov, D. )

    1993-01-10

    An analysis is performed to determine the effect of disassembling the radial reflector of the TOPAZ-II reactor following a hypothetical severe Reactivity Initiated Accident (RIA). Such an RIA is assumed to occur during system start-up in orbit due to a malfunction of the drive mechanism of the control drums, causing the drums to rotate the full 180° outward at their maximum speed of 1.4°/s. Results indicate that disassembling only three of the twelve radial reflector panels would successfully shut down the reactor, with little overheating of the fuel and the moderator.

  13. Analysis on the security of cloud computing

    NASA Astrophysics Data System (ADS)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it is expected to drive major changes in IT and the wider information field. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy; the resulting security problems are a key obstacle to improving the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  14. Computer Analysis of a Physical Pendulum.

    ERIC Educational Resources Information Center

    Priest, Joseph; Potts, Larry

    1990-01-01

    The interfacing of a physical pendulum to an Apple IIe computer and the physics instruction associated with it are discussed. Laboratory procedures, software commands, and computations used in this lesson are described. (CW)

  15. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
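
    The flavor of such a demand-versus-capacity discrete event simulation can be conveyed by a toy sketch in which exponentially distributed service requests contend for a fixed pool of servers. The arrival rate, service rate, and pool sizes below are hypothetical illustrations, not the authors' model.

```python
# Toy discrete-event simulation of service requests contending for a fixed
# pool of compute resources (an M/M/c-style queue).  All rates and pool
# sizes are hypothetical.
import random

def simulate(n_servers=4, arrival_rate=3.0, service_rate=1.0,
             n_requests=20000, seed=1):
    """Return the mean request wait time for a FCFS pool of identical servers."""
    random.seed(seed)
    clock = 0.0
    free_at = [0.0] * n_servers            # time at which each server is next free
    total_wait = 0.0
    for _ in range(n_requests):
        clock += random.expovariate(arrival_rate)          # next request arrival
        i = min(range(n_servers), key=free_at.__getitem__)  # earliest-free server
        start = max(clock, free_at[i])                      # wait if all servers busy
        total_wait += start - clock
        free_at[i] = start + random.expovariate(service_rate)
    return total_wait / n_requests

for c in (4, 6, 8):
    print(f"{c} servers: mean wait = {simulate(n_servers=c):.3f} time units")
```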

  16. Computing in Qualitative Analysis: A Healthy Development?

    ERIC Educational Resources Information Center

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  17. The role of computed tomography in the diagnosis of arterial gas embolism in fatal diving accidents in Tasmania.

    PubMed

    Oliver, J; Lyons, T J; Harle, R

    1999-02-01

    Four cases of fatal diving accidents in Tasmania are presented, highlighting the role of CT in the investigation of diving fatalities. The CT technique allows rapid diagnosis when arterial gas embolism (AGE) is suspected. The traditional method of investigation, underwater autopsy, is a difficult procedure that requires specialized training and in which the subtle diagnosis of AGE may be completely missed. Facilities for performing underwater autopsies are normally available only in tertiary referral centres, and therefore the diagnosis of AGE may be missed due to lack of facilities. CT was first used for the diagnosis of AGE in divers in the early 1980s but has still not become widely adopted in forensic practice. This radiological technique has the advantage of being sensitive, quick, reliable, and readily available, and it provides a permanent record. For hospitals that do not have a resident forensic pathologist, a CT scan can be easily performed and interpreted to eliminate the possibility of AGE. There are a number of pitfalls in the diagnosis of AGE with CT, particularly intravascular gas production following postmortem fermentation and off-gassing. Awareness of these pitfalls will help the radiologist in making a correct diagnosis of AGE. PMID:10901868

  18. Ferrofluids: Modeling, numerical analysis, and scientific computation

    NASA Astrophysics Data System (ADS)

    Tomas, Ignacio

    This dissertation presents some developments in the Numerical Analysis of Partial Differential Equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the Micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much larger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable, and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a

  19. TAIR- TRANSONIC AIRFOIL ANALYSIS COMPUTER CODE

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.

    1994-01-01

    The Transonic Airfoil analysis computer code, TAIR, was developed to employ a fast, fully implicit algorithm to solve the conservative full-potential equation for the steady transonic flow field about an arbitrary airfoil immersed in a subsonic free stream. The full-potential formulation is considered exact under the assumptions of irrotational, isentropic, and inviscid flow. These assumptions are valid for a wide range of practical transonic flows typical of modern aircraft cruise conditions. The primary features of TAIR include: a new fully implicit iteration scheme which is typically many times faster than classical successive line overrelaxation algorithms; a new, reliable artificial density spatial differencing scheme treating the conservative form of the full-potential equation; and a numerical mapping procedure capable of generating curvilinear, body-fitted finite-difference grids about arbitrary airfoil geometries. Three aspects emphasized during the development of the TAIR code were reliability, simplicity, and speed. The reliability of TAIR comes from two sources: the new algorithm employed and the implementation of effective convergence monitoring logic. TAIR achieves ease of use by employing a "default mode" that greatly simplifies code operation, especially by inexperienced users, and many useful options including several airfoil-geometry input options, flexible user controls over program output, and a multiple solution capability. The speed of the TAIR code is attributed to the new algorithm and the manner in which it has been implemented. Input to the TAIR program consists of airfoil coordinates, aerodynamic and flow-field convergence parameters, and geometric and grid convergence parameters. The airfoil coordinates for many airfoil shapes can be generated in TAIR from just a few input parameters. Most of the other input parameters have default values which allow the user to run an analysis in the default mode by specifying only a few input parameters

  20. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  1. Analysis of Japanese Radionuclide Monitoring Data of Food Before and After the Fukushima Nuclear Accident

    PubMed Central

    2015-01-01

    In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contaminations in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, accumulating mushrooms and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, which was primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima 137Cs and 90Sr levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, whereby meat was typically higher in 137Cs and vegetarian produce was usually higher in 90Sr. The correlation of background radiostrontium and radiocesium indicated that the regulatory assumption after the Fukushima accident of a maximum activity of 90Sr being 10% of the respective 137Cs concentrations may soon be at risk, as the 90Sr/137Cs ratio increases with time. This should be taken into account for the current Japanese food policy as the current regulation will soon underestimate the 90Sr content of Japanese foods. PMID:25621976

  2. Computational method for analysis of polyethylene biodegradation

    NASA Astrophysics Data System (ADS)

    Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

    2003-12-01

    In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique for introducing experimental results into the analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks of cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of polyethylene biodegradation posed in the modeling are indeed appropriate.
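
    The structure of such a linear system over molecular-weight classes can be sketched as follows: each class loses weight to direct consumption and passes weight to the next-lower class through β-oxidation. The class count, rate coefficients, and initial weight distribution below are hypothetical illustrations, not the rates identified in the paper.

```python
# Sketch of a linear ODE system over molecular-weight classes: the weight
# w_i(t) in class i decays by consumption and by beta-oxidation, which
# transfers weight from class i+1 to class i.  All rates are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

n = 30                                       # number of molecular-weight classes
consumption = np.linspace(0.05, 0.005, n)    # faster loss for small molecules
oxidation = np.full(n, 0.02)                 # beta-oxidation transfer rate

def rhs(t, w):
    dw = -(consumption + oxidation) * w
    dw[:-1] += oxidation[1:] * w[1:]         # mass shifted down from class i+1 to i
    return dw

w0 = np.exp(-0.5 * ((np.arange(n) - 20) / 4.0) ** 2)      # initial GPC-like profile
sol = solve_ivp(rhs, (0.0, 5.0), w0, t_eval=[0.0, 5.0])   # e.g. 5 weeks
print("total weight before/after:", sol.y.sum(axis=0))
```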

  3. Environmental studies: Mathematical, computational, and statistical analysis

    SciTech Connect

    Wheeler, M.F.

    1996-12-31

    The Summer Program on Mathematical, Computational, and Statistical Analyses in Environmental Studies, held 6--31 July 1992, was designed to provide a much needed interdisciplinary forum for joint exploration of recent advances in the formulation and application of (A) environmental models, (B) environmental data and data assimilation, (C) stochastic modeling and optimization, and (D) global climate modeling. These four conceptual frameworks provided common themes among a broad spectrum of specific technical topics at this workshop. The program brought together physical concepts and processes such as chemical kinetics, atmospheric dynamics, cloud physics and dynamics, flow in porous media, remote sensing, climate statistics, stochastic processes, parameter identification, model performance evaluation, aerosol physics and chemistry, and data sampling, with mathematical concepts in stiff differential systems, advective-diffusive-reactive PDEs, inverse scattering theory, time series analysis, particle dynamics, stochastic equations, optimal control, and others. Nineteen papers are presented in this volume. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  4. [Computational genome analysis of three marine algoviruses].

    PubMed

    Stepanova, O A; Boĭko, A L; Shcherbatenko, I S

    2013-01-01

    A computational analysis of the genomic sequences of three new marine algoviruses, Tetraselmis viridis virus (strains TvV-S20 and TvV-SI1) and Dunaliella viridis virus (strain DvV-SI2), was conducted. Both considerable similarity and essential distinctions between the studied strains and the most studied marine algoviruses of the Phycodnaviridae family were revealed. Our data show that the tested strains are new viruses with the following features: only they were isolated from the marine eukaryotic microalgae T. viridis and D. viridis; the coding sequences (CDSs) of their genomes are localized mainly on one of the DNA strands and form several clusters with short intergenic spaces; there are considerable variations in genome structure within the viruses and their strains; the viral genomic DNA has a high GC content (55.5 - 67.4%); their genes contain no well-known optimal contexts of translation start codons, nor contexts for terminal codon read-through; and the vast majority of viral genes and proteins have no matches in gene banks. PMID:24479317

  5. Computer-aided petrographic analysis of sandstones

    SciTech Connect

    Thayer, P.A.; Helmold, K.P.

    1987-05-01

    Thin-section point counting, mathematical and statistical analysis of petrographic-petrophysical data, report generation, and graphical presentation of results can be done efficiently by computer. Compositional and textural data are collected with a modified Schares point-counting system. The system uses an MS-DOS microcomputer programmed in BASIC to drive a motorized stage attached to a polarizing microscope. Numeric codes for up to 500 different categories of minerals, cements, pores, etc., are input using a separate keypad. Calculation and printing of constituent percentages, QFR, Folk name, and grain-size distribution are completed in seconds after data entry. Raw data files, compatible with software such as Lotus 1-2-3, SPSS, and SAS, are stored on floppy disk. Petrographic data files are transferred directly to a mainframe, merged with log and petrophysical data, analyzed statistically with SAS, and reports generated. SAS/GRAPH and TELL-A-GRAF routines linked with SAS generate a variety of cross plots, histograms, pie and bar charts, ternary diagrams, and vertical variation diagrams (e.g., depth vs. porosity, permeability, mean size, sorting, and percent grains-matrix-cement).

  6. Analysis of a small break loss-of-coolant accident of pressurized water reactor by APROS

    SciTech Connect

    Al-Falahi, A.; Haennine, M.; Porkholm, K.

    1995-09-01

    The purpose of this paper is to study the capability of APROS (Advanced PROcess Simulator) code to simulate the real plant thermal-hydraulic transient of a Small Break Loss-Of-Coolant Accident (SBLOCA) of Loss-Of-Fluid Test (LOFT) facility. The LOFT is a scaled model of a Pressurized Water Reactor (PWR). This work is a part of a larger validation of the APROS thermal-hydraulic models. The results of SBLOCA transient calculated by APROS showed a reasonable agreement with the measured data.

  7. Analysis of reactivity-insertion accidents in the TREAT Upgrade reactor

    SciTech Connect

    Rudolph, R.R.; Bhattacharyya, S.K.

    1983-01-01

    The expansion of the experimental capabilities of the TREAT Upgrade (TU) reactor also tends to increase the potential risks associated with off-normal reactivity insertion incidents compared to the TREAT reactor. To provide adequate protection for the public and the facility, while meeting experimenters' requirements, a specialized Reactor Trip System (RTS) with energy-dependent scram trips on reactor power and period has been developed. With this protection strategy, the consequences of reactivity insertion accidents in the TU reactor have been analyzed using a general methodology developed earlier. Results of these analyses are presented.

  8. A study of carburetor/induction system icing in general aviation accidents

    NASA Technical Reports Server (NTRS)

    Obermayer, R. W.; Roe, W. T.

    1975-01-01

    An assessment of the frequency and severity of carburetor/induction icing in general-aviation accidents was performed. The available literature and accident data from the National Transportation Safety Board were collected. A computer analysis of the accident data was performed. Between 65 and 90 accidents each year involve carburetor/induction system icing as a probable cause/factor. Under conditions conducive to carburetor/induction icing, between 50 and 70 percent of engine malfunction/failure accidents (exclusive of those due to fuel exhaustion) are due to carburetor/induction system icing. Since the evidence of such icing may not remain long after an accident, it is probable that the frequency of occurrence of such accidents is underestimated; therefore, some extrapolation of the data was conducted. The problem of carburetor/induction system icing is particularly acute for pilots with less than 1000 hours of total flying time. The severity of such accidents is about the same as any accident resulting from a forced landing or precautionary landing. About 144 persons, on the average, are exposed to death and injury each year in accidents involving carburetor/induction icing as a probable cause/factor.

  9. Parallel Analysis and Visualization on Cray Compute Node Linux

    SciTech Connect

    Pugmire, Dave; Ahern, Sean

    2008-01-01

    Capability computer systems are deployed to give researchers the computational power required to investigate and solve key challenges facing the scientific community. As the power of these computer systems increases, the computational problem domain typically increases in size, complexity and scope. These increases strain the ability of commodity analysis and visualization clusters to effectively perform post-processing tasks and provide critical insight and understanding to the computed results. An alternative to purchasing increasingly larger, separate analysis and visualization commodity clusters is to use the computational system itself to perform post-processing tasks. In this paper, the recent successful port of VisIt, a parallel, open source analysis and visualization tool, to compute node linux running on the Cray is detailed. Additionally, the unprecedented ability of this resource for analysis and visualization is discussed and a report on obtained results is presented.

  10. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  11. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes VANEP and VANES were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kWe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAUs under conditions of possible modes of failure which still permit continued system operation.

  12. Analysis of an AP600 intermediate-size loss-of-coolant accident

    SciTech Connect

    Boyack, B.E.; Lime, J.F.

    1995-04-01

    A postulated double-ended guillotine break of an AP600 direct-vessel-injection line has been analyzed. This event is characterized as an intermediate-break loss-of-coolant accident. Most of the insights regarding the response of the AP600 safety systems to the postulated accident are derived from calculations performed with the TRAC-PF1/MOD2 code. However, complementary insights derived from a scaled experiment conducted in the ROSA facility, as well as insights based upon calculations by other codes, are also presented. Based upon the calculated and experimental results, the AP600 will not experience a core heat up and will reach a safe shutdown state using only safety-class equipment. Only the early part of the long-term cooling period initiated by In-containment Refueling Water Storage Tank injection was evaluated. Thus, the observation that the core is continuously cooled should be verified for the later phase of the long-term cooling period when sump injection and containment cooling processes are important.

  13. Analysis of an AP600 intermediate-size loss-of-coolant accident

    SciTech Connect

    Boyack, B.E.; Lime, J.F.

    1995-09-01

    A postulated double-ended guillotine break of an AP600 direct-vessel-injection line has been analyzed. This event is characterized as an intermediate-break loss-of-coolant accident. Most of the insights regarding the response of the AP600 safety systems to the postulated accident are derived from calculations performed with the TRAC-PF1/MOD2 code. However, complementary insights derived from a scaled experiment conducted in the ROSA facility, as well as insights based upon calculations by other codes, are also presented. Based upon the calculated and experimental results, the AP600 will not experience a core heat up and will reach a safe shutdown state using only safety-class equipment. Only the early part of the long-term cooling period initiated by In-containment Refueling Water Storage Tank injection was evaluated. Thus, the observation that the core is continuously cooled should be verified for the later phase of the long-term cooling period when sump injection and containment cooling processes are important.

  14. Analysis of injuries among pilots involved in fatal general aviation airplane accidents.

    PubMed

    Wiegmann, Douglas A; Taneja, Narinder

    2003-07-01

    The purpose of this study was to analyze patterns of injuries sustained by pilots involved in fatal general aviation (GA) airplane accidents. Detailed information on the pattern and nature of injuries was retrieved from the Federal Aviation Administration's autopsy database for pilots involved in fatal GA airplane accidents from 1996 to 1999. A review of 559 autopsies revealed that blunt trauma was the primary cause of death in 86.0% (N=481) of the autopsies. The most commonly occurring bony injuries were fractures of the ribs (72.3%), skull (55.1%), facial bones (49.4%), tibia (37.9%), and pelvis (36.0%). Common organ injuries included laceration of the liver (48.1%), lung (37.6%), heart (35.6%), and spleen (30.1%), and hemorrhage of the brain (33.3%) and lung (32.9%). A fractured larynx was observed in 14.7% of the cases, a finding that has not been reported in the literature until now. It was observed that individuals who sustained brain hemorrhage were also more likely to have fractures of the facial bones rather than skull fractures. PMID:12729820

  15. NASA Applications for Computational Electromagnetic Analysis

    NASA Technical Reports Server (NTRS)

    Lewis, Catherine C.; Trout, Dawn H.; Krome, Mark E.; Perry, Thomas A.

    2011-01-01

    Computational Electromagnetic Software is used by NASA to analyze the compatibility of systems too large or too complex for testing. Recent advances in software packages and computer capabilities have made it possible to determine the effects of a transmitter inside a launch vehicle fairing, better analyze the environment threats, and perform on-orbit replacements with assured electromagnetic compatibility.

  16. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    ERIC Educational Resources Information Center

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  17. Transonic wing analysis using advanced computational methods

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  18. Frequency Analysis Program for a Computer Assisted Laboratory.

    ERIC Educational Resources Information Center

    Aburdene, Maurice F.

    1983-01-01

    Describes a Fortran program used in a computer-assisted-laboratory course. The program utilizes computer-controlled frequency sweeping to measure the response (amplitude/phase) of a series RLC circuit, modeling the circuit and comparing experimental and theoretical results for system gain with the computed gain using least squares analysis. Plots of both gain…
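
    A rough sketch of the theoretical side of such an analysis is shown below: the gain and phase of a series RLC circuit are computed over a frequency sweep and compared against synthetic "measured" data by a least-squares residual. The component values and measurement noise are hypothetical, not taken from the abstract.

```python
# Theoretical gain/phase of a series RLC circuit (output across R) over a
# frequency sweep, compared with noisy synthetic "measurements" via a
# least-squares residual.  Component values are hypothetical.
import numpy as np

R, L, C = 100.0, 10e-3, 1e-6            # ohms, henries, farads (hypothetical)
f = np.logspace(2, 5, 200)              # 100 Hz .. 100 kHz sweep
w = 2 * np.pi * f

H = R / (R + 1j * (w * L - 1.0 / (w * C)))    # transfer function V_R / V_in
gain_theory = np.abs(H)
phase_theory = np.degrees(np.angle(H))

rng = np.random.default_rng(0)
gain_measured = gain_theory * (1 + 0.02 * rng.standard_normal(f.size))  # synthetic data

residual = np.sum((gain_measured - gain_theory) ** 2)   # least-squares comparison
f0 = 1.0 / (2 * np.pi * np.sqrt(L * C))
print(f"resonant frequency ~ {f0:.0f} Hz, sum of squared gain residuals = {residual:.4f}")
```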

  19. Numerical Package in Computer Supported Numeric Analysis Teaching

    ERIC Educational Resources Information Center

    Tezer, Murat

    2007-01-01

    At universities, in the faculties of Engineering, Sciences, Business and Economics, together with higher education in Computing, it is stated that calculators and computers can be used in Numerical Analysis (NA) because of its difficulty. In this study, learning computer-supported NA will be discussed together with important usage of the…

  20. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    SciTech Connect

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  1. Emulation and Sobol' sensitivity analysis of an atmospheric dispersion model applied to the Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Girard, Sylvain; Mallet, Vivien; Korsakissok, Irène; Mathieu, Anne

    2016-04-01

    Simulations of the atmospheric dispersion of radionuclides involve large uncertainties originating from the limited knowledge of meteorological input data, composition, amount and timing of emissions, and some model parameters. The estimation of these uncertainties is an essential complement to modeling for decision making in case of an accidental release. We have studied the relative influence of a set of uncertain inputs on several outputs from the Eulerian model Polyphemus/Polair3D on the Fukushima case. We chose to use the variance-based sensitivity analysis method of Sobol'. This method requires a large number of model evaluations which was not achievable directly due to the high computational cost of Polyphemus/Polair3D. To circumvent this issue, we built a mathematical approximation of the model using Gaussian process emulation. We observed that aggregated outputs are mainly driven by the amount of emitted radionuclides, while local outputs are mostly sensitive to wind perturbations. The release height is notably influential, but only in the vicinity of the source. Finally, averaging either spatially or temporally tends to cancel out interactions between uncertain inputs.
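
    A rough sketch of this emulate-then-analyze approach follows, with a hypothetical toy function standing in for Polyphemus/Polair3D: a Gaussian-process emulator is fitted to a modest number of "expensive" runs, and the Saltelli/Sobol' analysis is then driven entirely by the cheap emulator. The variable names, ranges, and sample sizes are illustrative assumptions, and the sketch uses SALib and scikit-learn rather than the authors' tooling.

```python
# Gaussian-process emulation followed by Sobol' sensitivity analysis on a
# hypothetical toy "dispersion" function.  Requires SALib and scikit-learn.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expensive_model(x):
    """Hypothetical stand-in for the dispersion model (source term, wind, height)."""
    source, wind, height = x
    return source * np.exp(-0.1 * height) / (1.0 + wind**2)

problem = {
    "num_vars": 3,
    "names": ["source_term", "wind_perturbation", "release_height"],
    "bounds": [[0.5, 2.0], [0.0, 5.0], [0.0, 100.0]],
}

# 1. Train the emulator on a small design (the only "expensive" runs).
rng = np.random.default_rng(0)
X_train = rng.uniform([b[0] for b in problem["bounds"]],
                      [b[1] for b in problem["bounds"]], size=(100, 3))
y_train = np.array([expensive_model(x) for x in X_train])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X_train, y_train)

# 2. Sobol' analysis using the emulator in place of the full model.
X_sobol = saltelli.sample(problem, 1024)
Y_sobol = gp.predict(X_sobol)
Si = sobol.analyze(problem, Y_sobol)
print("first-order indices:", dict(zip(problem["names"], Si["S1"].round(3))))
print("total-order indices:", dict(zip(problem["names"], Si["ST"].round(3))))
```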

  2. Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2

    SciTech Connect

    Evans, J.S.; Abrahmson, S.; Bender, M.A.; Boecker, B.B.; Scott, B.R.; Gilbert, E.S.

    1993-10-01

    This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, x-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
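
    To make the two model families concrete, the sketch below evaluates a Weibull dose-response function of the form 1 - exp(-ln2 · (D/D50)^V) for an early effect and a linear-quadratic excess-risk model alpha·D + beta·D² at a few doses. The parameter values are hypothetical placeholders, not the values recommended in NUREG/CR-4214.

```python
# Illustrative evaluation of a Weibull dose-response function (early effect)
# and a linear-quadratic cancer-risk model.  The parameter values (D50,
# shape V, alpha, beta) are hypothetical placeholders.
import numpy as np

def weibull_risk(dose, d50=4.5, shape=6.0):
    """Risk of an early effect: 1 - exp(-ln2 * (D/D50)^V)."""
    return 1.0 - np.exp(-np.log(2.0) * (dose / d50) ** shape)

def linear_quadratic_risk(dose, alpha=5e-3, beta=5e-4):
    """Excess cancer risk per person under a linear-quadratic model."""
    return alpha * dose + beta * dose**2

for d in (0.5, 1.0, 3.0, 5.0):   # dose in Gy
    print(f"D = {d:3.1f} Gy : early-effect risk = {weibull_risk(d):.3f}, "
          f"cancer risk = {linear_quadratic_risk(d):.4f}")
```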

  3. Methods for nuclear air-cleaning-system accident-consequence assessment

    SciTech Connect

    Andrae, R.W.; Bolstad, J.W.; Gregory, W.S.

    1982-01-01

    This paper describes a multilaboratory research program that is directed toward addressing many questions that analysts face when performing air cleaning accident consequence assessments. The program involves developing analytical tools and supportive experimental data that will be useful in making more realistic assessments of accident source terms within, and up to the atmospheric boundaries of, nuclear fuel cycle facilities. The types of accidents considered in this study include fires, explosions, spills, tornadoes, criticalities, and equipment failures. The main focus of the program is developing an accident analysis handbook (AAH). We describe the contents of the AAH, which include descriptions of selected nuclear fuel cycle facilities, process unit operations, source-term development, and accident consequence analyses. Three computer codes designed to predict gas and material propagation through facility air cleaning systems are described. These computer codes address accidents involving fires (FIRAC), explosions (EXPAC), and tornadoes (TORAC). The handbook relies on many illustrative examples to show the analyst how to approach accident consequence assessments. We use the FIRAC code and a hypothetical fire scenario to illustrate the accident analysis capability.

  4. The Gulf of Mexico oil rig accident: analysis by different SAR satellite images

    NASA Astrophysics Data System (ADS)

    Del Frate, Fabio; Giacomini, Andrea; Latini, Daniele; Solimini, Domenico; Emery, William J.

    2011-11-01

    The monitoring of oil spills on the sea surface is an important and pressing task for international environmental agencies, owing to the continuing risks posed by possible accidents involving either rigs or tankers. At the same time, the growing number of remote sensing space missions can significantly improve our capabilities in this kind of activity. In this paper we consider the dramatic Gulf of Mexico oil spill event of 2010 to investigate the types of information that could be provided by the available collection of SAR images, which included different polarizations and bands. With an eye to the implementation of fully automatic processing chains, an assessment of a novel segmentation technique based on PCNN (Pulse Coupled Neural Networks) was also carried out.
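
    For orientation, the sketch below runs a textbook-style simplified PCNN over a synthetic "dark slick on bright sea" image and segments pixels by the iteration at which they first pulse. This illustrates the general mechanism only, is not the authors' implementation, and all parameters and the test image are hypothetical.

```python
# Simplified pulse-coupled neural network (PCNN) applied to a synthetic
# SAR-like scene.  Bright sea pixels pulse early; dark slick pixels pulse
# later, so the first-firing iteration separates the two classes.
import numpy as np
from scipy.ndimage import uniform_filter

def pcnn_first_fire(img, beta=2.0, v_e=20.0, alpha_e=0.3, n_iter=20):
    """Return, for each pixel, the iteration at which it first pulses."""
    F = img.astype(float)                       # feeding input = pixel intensity
    Y = np.zeros_like(F)                        # pulse output
    E = np.full_like(F, F.max())                # dynamic threshold
    first = np.full(F.shape, n_iter + 1, dtype=int)
    for k in range(1, n_iter + 1):
        L = uniform_filter(Y, size=3)           # linking from neighbouring pulses
        U = F * (1.0 + beta * L)                # internal activity
        Y = (U > E).astype(float)               # pulse where activity beats threshold
        E = np.exp(-alpha_e) * E + v_e * Y      # threshold decays, jumps after a pulse
        first = np.where((Y > 0) & (first > n_iter), k, first)
    return first

# Synthetic scene: bright sea with a dark elliptical "slick" (hypothetical).
yy, xx = np.mgrid[0:128, 0:128]
sea = 0.8 + 0.05 * np.random.default_rng(0).standard_normal((128, 128))
slick = ((xx - 64) ** 2 / 900 + (yy - 64) ** 2 / 400) < 1.0
img = np.where(slick, 0.2, sea)

first = pcnn_first_fire(img)
slick_mask = first >= 5          # dark pixels pulse later than the bright sea
print(f"estimated slick area fraction: {slick_mask.mean():.3f}")
```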

  5. TRAC large-break loss-of-coolant accident analysis for the AP600 design

    SciTech Connect

    Lime, J.F.; Boyack, B.E.

    1994-02-01

    This report discusses a TRAC model of the Westinghouse AP600 advanced reactor design which has been developed for analyzing large-break loss-of-coolant accident (LBLOCA) transients. A preliminary LBLOCA calculation of an 80% cold-leg break has been performed with TRAC-PF1/MOD2. The 80% break size was calculated by Westinghouse to be the most severe large-break size. The LBLOCA transient was calculated to 92 s. Peak clad temperatures (PCT) were well below the Appendix K limit of 1478 K (2200°F). Transient event times and PCT for the TRAC calculation were in reasonable agreement with those calculated by Westinghouse using their WCOBRA/TRAC code.

  6. Radiological health effects models for nuclear power plant accident consequence analysis.

    PubMed

    Evans, J S; Moeller, D W

    1989-04-01

    Improved health effects models have been developed for assessing the early effects, late somatic effects and genetic effects that might result from low-LET radiation exposures to populations following a major accident in a nuclear power plant. All the models have been developed in such a way that the dynamics of population risks can be analyzed. Estimates of life years lost and the duration of illnesses were generated, and a framework was recommended for summarizing health impacts. Uncertainty is addressed by providing models for upper, central and lower estimates of most effects. The models are believed to be a significant improvement over the models used in the U.S. Nuclear Regulatory Commission's Reactor Safety Study, and they can easily be modified to reflect advances in scientific understanding of the health effects of ionizing radiation. PMID:2925380

  7. Alternative Computational Approaches for Probabilistic Fatigue Analysis

    NASA Technical Reports Server (NTRS)

    Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Moore, N. R.; Grigoriu, M.

    1995-01-01

    The feasibility of methods alternative to direct Monte Carlo simulation for failure probability computations is discussed. First- and second-order reliability methods are used for fatigue crack growth and low-cycle fatigue structural failure modes to illustrate typical problems.
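
    For context, the sketch below shows the direct Monte Carlo baseline that such alternative reliability methods aim to replace: sampling a simple stress-strength limit state and counting failures. The limit-state function, distributions, and sample size are illustrative assumptions only, not taken from the cited work.

        import random

        def limit_state(strength, stress):
            """g <= 0 defines failure (stress exceeds strength)."""
            return strength - stress

        def direct_monte_carlo(n_samples, seed=0):
            rng = random.Random(seed)
            failures = 0
            for _ in range(n_samples):
                # Hypothetical distributions for the two random variables.
                strength = rng.gauss(mu=10.0, sigma=1.0)
                stress = rng.gauss(mu=6.0, sigma=1.5)
                if limit_state(strength, stress) <= 0.0:
                    failures += 1
            return failures / n_samples

        print("estimated failure probability:", direct_monte_carlo(200_000))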

  8. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of the European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  9. TRANSIENT ACCIDENT ANALYSIS OF THE GLOVEBOX SYSTEM IN A LARGE PROCESS ROOM

    SciTech Connect

    Lee, S

    2008-01-11

    Local transient hydrogen concentrations were evaluated inside a large process room when the hydrogen gas was released by three postulated accident scenarios associated with the process tank leakage and fire leading to a loss of gas confinement. The three cases considered in this work were fire in a room, loss of confinement from a process tank, and loss of confinement coupled with a fire event. Based on these accident scenarios in a large and unventilated process room, the modeling calculations of the hydrogen migration were performed to estimate local transient concentrations of hydrogen due to the sudden leakage and release from a glovebox system associated with the process tank. The modeling domain represented the major features of the process room including the principal release or leakage source of the gas storage system. The model was benchmarked against the literature results for key phenomena such as natural convection, turbulent behavior, gas mixing due to jet entrainment, and radiation cooling because these phenomena are closely related to the gas driving mechanisms within a large air space of the process room. The modeling results showed that at the corner of the process room, the gas concentrations resulting from the Case 2 and Case 3 scenarios reached the set-point value of the high activity alarm in about 13 seconds, while the Case 1 scenario took about 90 seconds to reach that concentration. The modeling results were used to estimate transient radioactive gas migrations in an enclosed process room equipped with a high activity alarm monitor when the postulated leakage scenarios are initiated without room ventilation.

  10. System balance analysis for vector computers

    NASA Technical Reports Server (NTRS)

    Knight, J. C.; Poole, W. G., Jr.; Voight, R. G.

    1975-01-01

    The availability of vector processors capable of sustaining computing rates of 10^8 arithmetic results per second raised the question of whether peripheral storage devices representing current technology can keep such processors supplied with data. By examining the solution of a large banded linear system on these computers, it was found that even under ideal conditions, the processors will frequently be waiting for problem data.

  11. Three parallel computation methods for structural vibration analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf; Bostic, Susan; Patrick, Merrell; Mahajan, Umesh; Ma, Shing

    1988-01-01

    The Lanczos (1950), multisectioning, and subspace iteration sequential methods for vibration analysis, presently used as bases for three parallel algorithms, are shown through three example problems to maintain reasonable accuracy in the computation of vibration frequencies. Significant computation time reductions are obtained as the number of processors increases. An analysis is made of the performance of each method in order to characterize relative strengths and weaknesses as well as to identify those parameters that most strongly affect computation efficiency.
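
    A minimal sketch of the kind of eigenvalue computation underlying such vibration analyses is shown below, assuming a small spring-mass chain as a stand-in for a structural model; it uses SciPy's Lanczos-type sparse eigensolver (eigsh) on the generalized problem K x = omega^2 M x. The matrices and property values are illustrative, not taken from the cited work.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import eigsh

        # Stiffness and mass matrices for a fixed-fixed spring-mass chain
        # (an illustrative stand-in for a structural finite element model).
        n = 50
        k, m = 1.0e4, 2.0
        K = diags([2 * k * np.ones(n), -k * np.ones(n - 1), -k * np.ones(n - 1)],
                  [0, -1, 1], format="csc")
        M = diags([m * np.ones(n)], [0], format="csc")

        # Smallest generalized eigenvalues K x = lam M x via the Lanczos-type
        # shift-invert solver in SciPy; lam = omega^2, so frequencies follow.
        lam, _ = eigsh(K, k=4, M=M, sigma=0.0, which="LM")
        freqs_hz = np.sqrt(np.sort(lam)) / (2 * np.pi)
        print("lowest natural frequencies [Hz]:", np.round(freqs_hz, 3))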

  12. An aftermath analysis of the 2014 coal mine accident in Soma, Turkey: Use of risk performance indicators based on historical experience.

    PubMed

    Spada, Matteo; Burgherr, Peter

    2016-02-01

    On the 13th of May 2014 a fire related incident in the Soma coal mine in Turkey caused 301 fatalities and more than 80 injuries. This has been the largest coal mine accident in Turkey, and in the OECD country group, so far. This study investigated if such a disastrous event should be expected, in a statistical sense, based on historical observations. For this purpose, PSI's ENSAD database is used to extract accident data for the period 1970-2014. Four different cases are analyzed, i.e., OECD, OECD w/o Turkey, Turkey and USA. Analysis of temporal trends for annual numbers of accidents and fatalities indicated a non-significant decreasing tendency for OECD and OECD w/o Turkey and a significant one for USA, whereas for Turkey both measures showed an increase over time. The expectation analysis revealed clearly that an event with the consequences of the Soma accident is rather unlikely for OECD, OECD w/o Turkey and USA. In contrast, such a severe accident has a substantially higher expectation for Turkey, i.e. it cannot be considered an extremely rare event, based on historical experience. This indicates a need for improved safety measures and stricter regulations in the Turkish coal mining sector in order to get closer to the rest of OECD. PMID:26687539

  13. Analysis of containment performance and radiological consequences under severe accident conditions for the Advanced Neutron Source Reactor at the Oak Ridge National Laboratory

    SciTech Connect

    Kim, S.H.; Taleyarkhan, R.P.

    1994-01-01

    A severe accident study was conducted to evaluate conservatively scoped source terms and radiological consequences to support the Advanced Neutron Source (ANS) Conceptual Safety Analysis Report (CSAR). Three different types of severe accident scenarios were postulated with a view to evaluating conservatively scoped source terms. The first scenario evaluates maximum possible steaming loads and associated radionuclide transport, whereas the next scenario is geared towards evaluating conservative containment loads from releases of radionuclide vapors and aerosols with associated generation of combustible gases. The third scenario follows the prescriptions given by the 10 CFR 100 guidelines. It was included in the CSAR for demonstrating site-suitability characteristics of the ANS. Various containment configurations are considered for the study of the thermal-hydraulic and radiological behaviors of the ANS containment. Severe accident mitigative design features such as the use of rupture disks were accounted for. This report describes the postulated severe accident scenarios, the methodology for analysis, modeling assumptions, modeling of several severe accident phenomena, and the evaluation of the resulting source term and radiological consequences.

  14. Computational analysis of LDDMM for brain mapping

    PubMed Central

    Ceritoglu, Can; Tang, Xiaoying; Chow, Margaret; Hadjiabadi, Darian; Shah, Damish; Brown, Timothy; Burhanullah, Muhammad H.; Trinh, Huong; Hsu, John T.; Ament, Katarina A.; Crocetti, Deana; Mori, Susumu; Mostofsky, Stewart H.; Yantis, Steven; Miller, Michael I.; Ratnanather, J. Tilak

    2013-01-01

    One goal of computational anatomy (CA) is to develop tools to accurately segment brain structures in healthy and diseased subjects. In this paper, we examine the performance and complexity of such segmentation in the framework of the large deformation diffeomorphic metric mapping (LDDMM) registration method with reference to atlases and parameters. First we report the application of a multi-atlas segmentation approach to define basal ganglia structures in healthy and diseased kids' brains. The segmentation accuracy of the multi-atlas approach is compared with the single atlas LDDMM implementation and two state-of-the-art segmentation algorithms—Freesurfer and FSL—by computing the overlap errors between automatic and manual segmentations of the six basal ganglia nuclei in healthy subjects as well as subjects with diseases including ADHD and Autism. The high accuracy of multi-atlas segmentation is obtained at the cost of increasing the computational complexity because of the calculations necessary between the atlases and a subject. Second, we examine the effect of parameters on total LDDMM computation time and segmentation accuracy for basal ganglia structures. Single atlas LDDMM method is used to automatically segment the structures in a population of 16 subjects using different sets of parameters. The results show that a cascade approach and using fewer time steps can reduce computational complexity as much as five times while maintaining reliable segmentations. PMID:23986653

  15. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize by using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944

  16. Fault tree analysis of fire and explosion accidents for dual fuel (diesel/natural gas) ship engine rooms

    NASA Astrophysics Data System (ADS)

    Guan, Yifeng; Zhao, Jie; Shi, Tengfei; Zhu, Peipei

    2016-07-01

    In recent years, China's increased interest in environmental protection has led to a promotion of energy-efficient dual fuel (diesel/natural gas) ships on Chinese inland rivers. Natural gas as a ship fuel may pose dangers of fire and explosion if a gas leak occurs. If explosions or fires occur in the engine rooms of a ship, heavy damage and losses will be incurred. In this paper, a fault tree model is presented that considers both fires and explosions in a dual fuel ship; in this model, fire and explosion in the dual fuel engine rooms are the top events. All the basic events along with the minimum cut sets are obtained through the analysis. The primary factors that affect accidents involving fires and explosions are determined by calculating the degree of structure importance of the basic events. According to these results, corresponding measures are proposed to ensure and improve the safety and reliability of Chinese inland dual fuel ships.
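
    A minimal sketch of how minimum cut sets can be turned into a top-event probability and an importance ranking is given below. The cut sets, basic-event names, and probabilities are hypothetical, and the ranking uses a probabilistic (Birnbaum-style) importance measure as a simple stand-in for the structure importance computed in the paper.

        from itertools import combinations

        # Hypothetical minimal cut sets (each a set of basic events) and
        # basic-event probabilities; not taken from the cited fault tree.
        cut_sets = [{"gas_leak", "ignition"}, {"diesel_leak", "hot_surface"},
                    {"gas_leak", "ventilation_failure", "spark"}]
        p = {"gas_leak": 1e-3, "ignition": 5e-2, "diesel_leak": 2e-3,
             "hot_surface": 1e-2, "ventilation_failure": 5e-3, "spark": 2e-2}

        def top_probability(probs):
            """Exact top-event probability by inclusion-exclusion over cut sets,
            assuming independent basic events."""
            total = 0.0
            for r in range(1, len(cut_sets) + 1):
                for combo in combinations(cut_sets, r):
                    events = set().union(*combo)
                    term = 1.0
                    for e in events:
                        term *= probs[e]
                    total += (-1) ** (r + 1) * term
            return total

        def birnbaum_importance(event):
            """P(top | event occurs) - P(top | event does not occur)."""
            return top_probability({**p, event: 1.0}) - top_probability({**p, event: 0.0})

        print("top-event probability:", top_probability(p))
        for e in sorted(p, key=birnbaum_importance, reverse=True):
            print(e, round(birnbaum_importance(e), 6))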

  17. Global detailed geoid computation and model analysis

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Vincent, S.

    1974-01-01

    Comparisons and analyses were carried out through the use of detailed gravimetric geoids which we have computed by combining models with a set of 26,000 1 deg x 1 deg mean free air gravity anomalies. The accuracy of the detailed gravimetric geoid computed using the most recent Goddard earth model (GEM-6) in conjunction with the set of 1 deg x 1 deg mean free air gravity anomalies is assessed at + or - 2 meters on the continents of North America, Europe, and Australia, 2 to 5 meters in the Northeast Pacific and North Atlantic areas, and 5 to 10 meters in other areas where surface gravity data are sparse. The R.M.S. differences between this detailed geoid and the detailed geoids computed using the other satellite gravity fields in conjunction with the same set of surface data range from 3 to 7 meters.

  18. Hypercube-Computer Analysis Of Electromagnetic Scattering

    NASA Technical Reports Server (NTRS)

    Patterson, J. E.; Liewer, P. C.; Calalo, R. H.; Manshadi, F.

    1990-01-01

    Capabilities of hypercube and parallel processing demonstrated. Report describes use of Mark III Hypercube computer to analyze scattering of electromagnetic waves. Purpose of study to assess utility of parallel computing in such computation-intensive problems as large-scale electromagnetic scattering. Two electromagnetic codes based on different algorithms converted to run on Mark III Hypercube. First code implements finite-difference, time-domain solution of Maxwell's curl equations. Second code is Numerical Electromagnetics Code (NEC-2) which embodies frequency-domain method and developed to analyze electromagnetic responses of antennas and other metallic structures. On Mark III Hypercube with 32 active nodes, largest lattice contains about 2,048,000 unit cells.
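
    As a toy illustration of the finite-difference, time-domain solution of Maxwell's curl equations mentioned above, the serial one-dimensional Python sketch below propagates a Gaussian pulse in free space; the grid size, Courant number, and source location are arbitrary choices, and the code is unrelated to the hypercube implementation itself.

        import numpy as np

        # Minimal 1D FDTD update for Maxwell's curl equations in free space,
        # using normalized units so the Courant number is 0.5.
        nz, nt = 400, 600
        ez = np.zeros(nz)
        hy = np.zeros(nz)
        courant = 0.5

        for n in range(nt):
            # Update magnetic field from the spatial difference (curl) of E.
            hy[:-1] += courant * (ez[1:] - ez[:-1])
            # Update electric field from the spatial difference (curl) of H.
            ez[1:] += courant * (hy[1:] - hy[:-1])
            # Soft Gaussian pulse source near the left boundary.
            ez[40] += np.exp(-((n - 60) / 15.0) ** 2)

        print("peak |Ez| after propagation:", float(np.max(np.abs(ez))))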

  19. Computational Analysis of SAXS Data Acquisition.

    PubMed

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals. PMID:26244255
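
    A simplified sketch of the forward model described above is given below: it approximates the pair distribution function of a structure by histogramming weighted pairwise distances between discrete scatterers, rather than integrating a continuous density or using the recursive spherical-Bessel machinery of the paper. The point cloud and weights are hypothetical.

        import numpy as np

        def pair_distribution(points, weights, n_bins=50):
            """Histogram of weighted pairwise distances (a discrete stand-in
            for the pair distribution function of a known density)."""
            diffs = points[:, None, :] - points[None, :, :]
            dists = np.sqrt((diffs ** 2).sum(axis=-1))
            w = weights[:, None] * weights[None, :]
            iu = np.triu_indices(len(points), k=1)
            hist, edges = np.histogram(dists[iu], bins=n_bins, weights=w[iu])
            centers = 0.5 * (edges[:-1] + edges[1:])
            return centers, hist

        # Hypothetical "structure": random scatterers filling a 30 A sphere.
        rng = np.random.default_rng(1)
        pts = rng.normal(size=(500, 3))
        pts = 30.0 * pts / np.linalg.norm(pts, axis=1, keepdims=True) * rng.random((500, 1)) ** (1 / 3)
        r, pr = pair_distribution(pts, np.ones(500))
        print("P(r) peaks near r =", float(r[np.argmax(pr)]), "A")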

  20. Tools for improving safety management in the Norwegian Fishing Fleet occupational accidents analysis period of 1998-2006.

    PubMed

    Aasjord, Halvard L

    2006-01-01

    Reporting of human accidents in the Norwegian Fishing Fleet has always been very difficult because there has been no tradition of reporting all types of working accidents among fishermen if the accident does not seem to be very serious or there is no economic incentive to report. Therefore reports are only written when the accidents are serious or if the fisherman is reported sick. Reports about an accident are sent to the insurance company, but another report should also be sent to the Norwegian Maritime Directorate (NMD). Comparing data from one former insurance company and the NMD shows that the real numbers of injuries or serious accidents among Norwegian fishermen could be up to two times higher than the numbers reported to the NMD. Special analyses of 1690 accidents from the so-called PUS-database (NMD) for the period 1998-2002 show that the calculated risk was 23.6 accidents per 1000 man-years. This is quite a high risk level, and most of the accidents in the fishing fleet were rather serious. The calculated risks are highest for fishermen on board the deep sea fleet of trawlers (28.6 accidents per 1000 man-years) and the deep sea fleet of purse seiners (28.9 accidents per 1000 man-years). Fatal accidents over a longer period of 51.5 years from 1955 to 2006 are also roughly analysed. These data from SINTEF's own database show that the numbers of fatal accidents have been decreasing over this long period, except for the two periods 1980-84 and 1990-94, when there were some casualties involving total losses of larger vessels with the loss of most of the crew, as well as many other typical work accidents on smaller vessels. The total numbers of registered Norwegian fishermen and also the numbers of man-years have been drastically reduced over the 51.5 years from 1955 to 2006. The risks of fatal accidents have been very steady over time at a high level, although there has been a marked risk reduction since 1990-94. For the last 8.5-year period of January 1998

  1. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
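
    For reference, the sketch below evaluates a ground-level, centerline Gaussian plume concentration of the general form used in such consequence codes, C = Q/(pi u sigma_y sigma_z) exp(-H^2/(2 sigma_z^2)); the power-law sigma_y and sigma_z coefficients and the release parameters are illustrative assumptions, not the MACCS or COSYMA parameterizations.

        import math

        def gaussian_plume_centerline(q_bq_s, u_m_s, x_m, h_m,
                                      a_y=0.08, b_y=0.9, a_z=0.06, b_z=0.85):
            """Ground-level, centerline air concentration [Bq/m^3] downwind of a
            point release, with illustrative power-law dispersion coefficients."""
            sigma_y = a_y * x_m ** b_y
            sigma_z = a_z * x_m ** b_z
            return (q_bq_s / (math.pi * u_m_s * sigma_y * sigma_z)
                    * math.exp(-h_m ** 2 / (2.0 * sigma_z ** 2)))

        # Hypothetical release: 1e9 Bq/s, 3 m/s wind, 50 m effective release height.
        for x in (500.0, 1000.0, 5000.0):
            print(x, "m:", gaussian_plume_centerline(1e9, 3.0, x, 50.0), "Bq/m^3")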

  2. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  3. Thermoelectric pump performance analysis computer code

    NASA Technical Reports Server (NTRS)

    Johnson, J. L.

    1973-01-01

    A computer program is presented that was used to analyze and design dual-throat electromagnetic dc conduction pumps for the 5-kwe ZrH reactor thermoelectric system. In addition to a listing of the code and corresponding identification of symbols, the bases for this analytical model are provided.

  4. Computer program performs stiffness matrix structural analysis

    NASA Technical Reports Server (NTRS)

    Bamford, R.; Batchelder, R.; Schmele, L.; Wada, B. K.

    1968-01-01

    Computer program generates the stiffness matrix for a particular type of structure from geometrical data, and performs static and normal mode analyses. It requires the structure to be modeled as a stable framework of uniform, weightless members, and joints at which loads are applied and weights are lumped.
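
    A minimal sketch in the same spirit is shown below: it assembles the global stiffness matrix for a chain of uniform, weightless axial members, applies a fixed-end constraint, and solves a static load case. The element properties, load, and boundary conditions are hypothetical, and the snippet does not reproduce the program's input format.

        import numpy as np

        def assemble_bar_chain(n_elems, ea_over_l):
            """Global stiffness matrix for a chain of identical axial members."""
            n_nodes = n_elems + 1
            K = np.zeros((n_nodes, n_nodes))
            ke = ea_over_l * np.array([[1.0, -1.0], [-1.0, 1.0]])
            for e in range(n_elems):
                K[e:e + 2, e:e + 2] += ke   # scatter the element matrix
            return K

        # Hypothetical chain: 4 members, EA/L = 1e6 N/m, fixed at node 0,
        # 1 kN applied at the free end.
        K = assemble_bar_chain(4, 1.0e6)
        f = np.zeros(5)
        f[-1] = 1000.0
        u = np.zeros(5)
        u[1:] = np.linalg.solve(K[1:, 1:], f[1:])   # drop the fixed degree of freedom
        print("nodal displacements [m]:", u)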

  5. Computational and Physical Analysis of Catalytic Compounds

    NASA Astrophysics Data System (ADS)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important for exploiting their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as Gamess and Chemcraft have been used in an effort to compute the efficiencies of the catalytic compounds and the bonding energy changes during the optimization convergence. The result illustrates how the metal oxides stabilize and the steps involved. The graph of energy computation step (N) versus energy (kcal/mol) shows that the energy of the titania converges faster, at the 7th iteration, whereas the silica converges at the 9th iteration.

  6. Computed Tomography Analysis of NASA BSTRA Balls

    SciTech Connect

    Perry, R L; Schneberk, D J; Thompson, R R

    2004-10-12

    Fifteen 1.25 inch BSTRA balls were scanned with the high energy computed tomography system at LLNL. This system has a resolution limit of approximately 210 microns. A threshold of 238 microns (two voxels) was used, and no anomalies at or greater than this were observed.

  7. Reliability computation using fault tree analysis

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.

    1971-01-01

    A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.

  8. Wing analysis using a transonic potential flow computational method

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    The ability of the method to compute wing transonic performance was determined by comparing computed results with both experimental data and results computed by other theoretical procedures. Both pressure distributions and aerodynamic forces were evaluated. Comparisons indicated that the method is a significant improvement in transonic wing analysis capability. In particular, the computational method generally calculated the correct development of three-dimensional pressure distributions from subcritical to transonic conditions. Complicated, multiple shocked flows observed experimentally were reproduced computationally. The ability to identify the effects of design modifications was demonstrated both in terms of pressure distributions and shock drag characteristics.

  9. Computer-assisted comparison of analysis and test results in transportation experiments

    SciTech Connect

    Knight, R.D.; Ammerman, D.J.; Koski, J.A.

    1998-05-10

    As a part of its ongoing research efforts, Sandia National Laboratories' Transportation Surety Center investigates the integrity of various containment methods for hazardous materials transport, subject to anomalous structural and thermal events such as free-fall impacts, collisions, and fires in both open and confined areas. Since it is not possible to conduct field experiments for every set of possible conditions under which an actual transportation accident might occur, accurate modeling methods must be developed which will yield reliable simulations of the effects of accident events under various scenarios. This requires computer software which is capable of assimilating and processing data from experiments performed as benchmarks, as well as data obtained from numerical models that simulate the experiment. Software tools which can present all of these results in a meaningful and useful way to the analyst are a critical aspect of this process. The purpose of this work is to provide software resources on a long term basis, and to ensure that the data visualization capabilities of the Center keep pace with advancing technology. This will provide leverage for its modeling and analysis abilities in a rapidly evolving hardware/software environment.

  10. Radiological Safety Analysis Computer (RSAC) Program Version 7.0 Users’ Manual

    SciTech Connect

    Dr. Bradley J Schrader

    2009-03-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.0 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.
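
    As an illustration of the decay-and-ingrowth bookkeeping such a code performs during transport, the sketch below evaluates the Bateman solution for a two-member decay chain; the initial inventory and half-lives are hypothetical placeholders rather than RSAC-7 data.

        import math

        def decay_and_ingrowth(n_parent0, half_life_parent, half_life_daughter, t):
            """Bateman solution for a chain parent -> daughter -> (stable).

            Returns (parent atoms, daughter atoms) at time t, assuming the
            daughter inventory starts at zero; t and the half-lives just need
            to be expressed in the same units.
            """
            lp = math.log(2.0) / half_life_parent
            ld = math.log(2.0) / half_life_daughter
            parent = n_parent0 * math.exp(-lp * t)
            daughter = n_parent0 * lp / (ld - lp) * (math.exp(-lp * t) - math.exp(-ld * t))
            return parent, daughter

        # Hypothetical inventory: 1e20 parent atoms, 3.2 d and 0.096 d half-lives.
        for t_days in (0.5, 1.0, 2.0, 5.0):
            p, d = decay_and_ingrowth(1e20, 3.2, 0.096, t_days)
            print(t_days, "d:", f"parent={p:.3e}", f"daughter={d:.3e}")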

  11. Radiological Safety Analysis Computer (RSAC) Program Version 7.2 Users’ Manual

    SciTech Connect

    Dr. Bradley J Schrader

    2010-10-01

    The Radiological Safety Analysis Computer (RSAC) Program Version 7.2 (RSAC-7) is the newest version of the RSAC legacy code. It calculates the consequences of a release of radionuclides to the atmosphere. A user can generate a fission product inventory from either reactor operating history or a nuclear criticality event. RSAC-7 models the effects of high-efficiency particulate air filters or other cleanup systems and calculates the decay and ingrowth during transport through processes, facilities, and the environment. Doses are calculated for inhalation, air immersion, ground surface, ingestion, and cloud gamma pathways. RSAC-7 can be used as a tool to evaluate accident conditions in emergency response scenarios, radiological sabotage events and to evaluate safety basis accident consequences. This users’ manual contains the mathematical models and operating instructions for RSAC-7. Instructions, screens, and examples are provided to guide the user through the functions provided by RSAC-7. This program was designed for users who are familiar with radiological dose assessment methods.

  12. Simulation study of traffic accidents on a three-lane highway

    NASA Astrophysics Data System (ADS)

    Chang, Jau-Yang; Lai, Wun-Cing

    2015-07-01

    Unsuitable driving behaviors often lead to the occurrence of traffic accidents. To reduce accidents and prolong human life, simulation studies are highly desirable for evaluating traffic safety in terms of the number of traffic accidents. In this paper, a three-lane traffic flow model is proposed to analyze the probability of the occurrence of traffic accidents on a highway. We define appropriate driving rules for the forward movement and lane changing of the vehicles. Three types of vehicle accidents are designed to investigate the relationships between different driving behaviors and traffic accidents. We simulate four road driving strategies, and compute the traffic flow, velocity, lane-changing frequency and the probability of the occurrence of traffic accidents for the different road driving strategies. According to the simulation and analysis, it is shown that the probability of the occurrence of traffic accidents can be reduced by using the specified road driving strategies. Additionally, we found that the occurrence of traffic accidents can be avoided when the slow vehicles are suitably constrained to move on a three-lane highway.
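
    A heavily simplified, single-lane sketch of this kind of cellular-automaton traffic simulation is given below; it applies Nagel-Schreckenberg-style acceleration, braking, and random-slowdown rules on a ring road and counts abrupt-braking events as a crude proxy for accident-prone situations. The rules, lane count, and parameters differ from the three-lane model of the paper and are purely illustrative.

        import random

        def nasch_step(road, length=100, v_max=5, p_slow=0.3, rng=random):
            """One parallel update of a single-lane ring road.
            road maps cell index -> current speed of the car in that cell."""
            new_road = {}
            risky = 0
            cells = sorted(road)
            for i, x in enumerate(cells):
                v = road[x]
                gap = (cells[(i + 1) % len(cells)] - x - 1) % length
                v = min(v + 1, v_max)          # acceleration
                if v > gap:
                    risky += 1                 # abrupt braking needed (crude risk proxy)
                v = min(v, gap)                # brake to avoid collision
                if v > 0 and rng.random() < p_slow:
                    v -= 1                     # random slowdown
                new_road[(x + v) % length] = v
            return new_road, risky

        # Hypothetical scenario: 30 cars on a 100-cell ring, 1000 time steps.
        rng = random.Random(0)
        road = {x: 0 for x in rng.sample(range(100), 30)}
        risky_total = 0
        for _ in range(1000):
            road, risky = nasch_step(road, rng=rng)
            risky_total += risky
        print("risky-braking events per car per step:", risky_total / (30 * 1000))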

  13. Episode analysis of deposition of radiocesium from the Fukushima Daiichi nuclear power plant accident.

    PubMed

    Morino, Yu; Ohara, Toshimasa; Watanabe, Mirai; Hayashi, Seiji; Nishizawa, Masato

    2013-03-01

    Chemical transport models played key roles in understanding the atmospheric behaviors and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. However, model results could not be sufficiently evaluated because of limited observational data. We assess the model performance to simulate the deposition patterns of radiocesium ((137)Cs) by making use of airborne monitoring survey data for the first time. We conducted ten sensitivity simulations to evaluate the atmospheric model uncertainties associated with key model settings including emission data and wet deposition modules. We found that simulation using emissions estimated with a regional-scale (∼ 500 km) model better reproduced the observed (137)Cs deposition pattern in eastern Japan than simulation using emissions estimated with local-scale (∼ 50 km) or global-scale models. In addition, simulation using a process-based wet deposition module reproduced the observations well, whereas simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed (137)Cs deposition rates in high-deposition areas (≥ 10 kBq m(-2)) within 1 order of magnitude and showed that deposition of radiocesium over land occurred predominantly during 15-16, 20-23, and 30-31 March 2011. PMID:23391028

  14. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation

    SciTech Connect

    Evans, J.S. . School of Public Health)

    1990-01-01

    This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". The category "other" cancers is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
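
    To make the linear-quadratic form concrete, the sketch below evaluates an excess-risk function alpha*D + beta*D^2, with the linear term reduced at low dose rate by a dose and dose-rate effectiveness factor; the coefficients and the DDREF value are illustrative assumptions, not parameters from this report.

        def excess_cancer_risk(dose_gy, alpha, beta, ddref=2.0, high_rate=False):
            """Linear-quadratic excess lifetime risk: alpha*D + beta*D^2.

            For low dose or low dose rate the quadratic term is dropped and the
            linear term is divided by the dose and dose-rate effectiveness
            factor (DDREF). All parameter values used here are illustrative.
            """
            if high_rate:
                return alpha * dose_gy + beta * dose_gy ** 2
            return (alpha / ddref) * dose_gy

        # Hypothetical coefficients for one cancer type (per Gy and per Gy^2).
        for d in (0.01, 0.1, 1.0):
            acute = excess_cancer_risk(d, alpha=5e-3, beta=5e-3, high_rate=True)
            chronic = excess_cancer_risk(d, alpha=5e-3, beta=5e-3)
            print(f"{d} Gy: acute={acute:.2e}, chronic={chronic:.2e}")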

  15. Computational thermo-fluid analysis of a disk brake

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kuraishi, Takashi; Tabata, Shinichiro; Takagi, Hirokazu

    2016-06-01

    We present computational thermo-fluid analysis of a disk brake, including thermo-fluid analysis of the flow around the brake and heat conduction analysis of the disk. The computational challenges include proper representation of the small-scale thermo-fluid behavior, high-resolution representation of the thermo-fluid boundary layers near the spinning solid surfaces, and bringing the heat transfer coefficient (HTC) calculated in the thermo-fluid analysis of the flow to the heat conduction analysis of the spinning disk. The disk brake model used in the analysis closely represents the actual configuration, and this adds to the computational challenges. The components of the method we have developed for computational analysis of the class of problems with these types of challenges include the Space-Time Variational Multiscale method for coupled incompressible flow and thermal transport, ST Slip Interface method for high-resolution representation of the thermo-fluid boundary layers near spinning solid surfaces, and a set of projection methods for different parts of the disk to bring the HTC calculated in the thermo-fluid analysis. With the HTC coming from the thermo-fluid analysis of the flow around the brake, we do the heat conduction analysis of the disk, from the start of the braking until the disk spinning stops, demonstrating how the method developed works in computational analysis of this complex and challenging problem.

  16. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design which saves time and ultimately spacecraft weight.

  17. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    Electromyographic (EMG) signals are bio-signals collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  18. Computer applications for engineering/structural analysis. Revision 1

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-12-31

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the Superconducting Super Collider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  19. Advances in Computer-Based Autoantibodies Analysis

    NASA Astrophysics Data System (ADS)

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for the clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  20. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.

  1. Scalable Computer Performance and Analysis (Hierarchical INTegration)

    1999-09-02

    HINT is a program to measure the performance of a wide variety of scalable computer systems. It is capable of demonstrating the benefits of using more memory or processing power, and of improving communications within the system. HINT can be used for measurement of an existing system, while the associated program ANALYTIC HINT can be used to explain the measurements or as a design tool for proposed systems.

  2. Spatiotemporal Analysis for Wildlife-Vehicle Collisions Based on Accident Statistics of the County Straubing-Bogen in Lower Bavaria

    NASA Astrophysics Data System (ADS)

    Pagany, R.; Dorner, W.

    2016-06-01

    In recent years the number of wildlife-vehicle collisions (WVC) in Bavaria has increased considerably. Despite the statistical registration of WVC and preventive measures at areas of risk along the roads, the number of such accidents could not be contained. Using geospatial analysis of WVC data from the last five years for the county of Straubing-Bogen, Bavaria, a small-scale methodology was developed to analyse the risk of WVC along the roads in the investigated area. Various indicators that may be related to WVC were examined. The risk depends on the time of day and year, which in turn correlates with traffic density and wildlife population. Additionally, the location of the collision depends on the species and on different environmental parameters. Accidents seem to correlate with the land use on either side of the road. Land use data and current vegetation were derived from remote sensing data, providing information on the general land use while also considering the vegetation period. For this, a number of hot spots were selected to identify potential dependencies between land use, vegetation and season. First results from these hot spots show that WVCs do not depend only on land use, but may also show a correlation with the vegetation period. With regard to agriculture and seasonal as well as annual changes, this indicates that warnings will fail due to their static character in contrast to the dynamic situation of land use and the resulting risk of WVCs. This shows that there is a demand for remote sensing data with a high spatial and temporal resolution as well as a methodology to derive WVC warnings considering land use and vegetation. With remote sensing data, it could become possible to classify land use and calculate risk levels for WVC. Additional parameters derived from remote sensed data that could be considered are relief and crops as well as other parameters such as ponds, natural and infrastructural barriers that could be related to animal behaviour and

  3. [Accidents and acts of violence in Brazil: I--Analysis of mortality data].

    PubMed

    Jorge, M H; Gawryszewski, V P; Latorre, M do R

    1997-08-01

    External causes are an important cause of death in almost all countries. They always rank second or third in mortality, but their distribution according to type varies from country to country. Mortality due to external causes by type, gender and age, for Brazil as a whole and for the state capitals specifically, is analysed. Mortality rates and the proportional mortality from 1977 to 1994 were calculated. The results showed that the number of deaths due to external causes almost doubled from 1977 to 1994, and external causes are now the second cause of death in Brazil. The mortality rate in 1991 was 69.8 per 100,000 inhabitants, and the highest increase was in the male rates. The male rates are almost 4.5 times greater than the female ones. External causes are the first cause of death among people from 5 to 39 years old, and most of these deaths occur between 15 and 19 years of age (65% of the deaths by external causes). Besides the growth in itself, it also seems that a shift of deaths to lower ages is occurring. Both mortality by traffic accidents and that by homicide increased over the period from 1977 to 1994. Suicides have been stable and "other external causes" have increased slowly, especially due to falls and drowning. The mortality rates for external causes in the state capitals are higher than the average for Brazil as a whole, except for some northeastern capitals. The rates for the capitals in the northern region are the highest in Brazil. In the northeastern region, only Recife, Maceió and Salvador have high rates. In the southeast, Vitória, Rio de Janeiro and S. Paulo have the highest rates in the country, but Belo Horizonte's rates are declining. In the southern region all the capitals showed a growth in the rates, as did the capitals of the West-central region. The growth of mortality due to external causes by type of external cause differs among these capitals. Suicide is not a public health problem in Brazil or in the state capitals. Traffic

  4. Modeling and sensitivity analysis of transport and deposition of radionuclides from the Fukushima Daiichi accident

    NASA Astrophysics Data System (ADS)

    Hu, X.; Li, D.; Huang, H.; Shen, S.; Bou-Zeid, E.

    2014-01-01

    The atmospheric transport and ground deposition of radioactive isotopes 131I and 137Cs during and after the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident (March 2011) are investigated using the Weather Research and Forecasting/Chemistry (WRF/Chem) model. The aim is to assess the skill of WRF in simulating these processes and the sensitivity of the model's performance to various parameterizations of unresolved physics. The WRF/Chem model is first upgraded by implementing a radioactive decay term into the advection-diffusion solver and adding three parameterizations for dry deposition and two parameterizations for wet deposition. Different microphysics and horizontal turbulent diffusion schemes are then tested for their ability to reproduce observed meteorological conditions. Subsequently, the influence on the simulated transport and deposition of the characteristics of the emission source, including the emission rate, the gas partitioning of 131I and the size distribution of 137Cs, is examined. The results show that the model can predict the wind fields and rainfall realistically. The ground deposition of the radionuclides can also potentially be captured well but it is very sensitive to the emission characterization. It is found that the total deposition is most influenced by the emission rate for both 131I and 137Cs; while it is less sensitive to the dry deposition parameterizations. Moreover, for 131I, the deposition is also sensitive to the microphysics schemes, the horizontal diffusion schemes, gas partitioning and wet deposition parameterizations; while for 137Cs, the deposition is very sensitive to the microphysics schemes and wet deposition parameterizations, and it is also sensitive to the horizontal diffusion schemes and the size distribution.
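
    A toy illustration of adding a radioactive decay term to an advection-diffusion solver, as described above, is given below for a one-dimensional tracer on a periodic grid; the grid, wind speed, diffusivity, and explicit upwind scheme are arbitrary simplifications and are unrelated to the WRF/Chem implementation itself.

        import numpy as np

        # 1D advection-diffusion-decay of a tracer (upwind advection, explicit
        # diffusion, first-order decay), illustrating the extra -lambda*C term.
        nx, nt = 200, 400
        dx, dt = 1000.0, 50.0            # m, s
        u, kdiff = 5.0, 50.0             # wind speed [m/s], diffusivity [m^2/s]
        half_life = 8.02 * 86400.0       # s (131I)
        lam = np.log(2.0) / half_life

        c = np.zeros(nx)
        c[10] = 1.0                      # unit puff released near the left edge
        for _ in range(nt):
            adv = -u * (c - np.roll(c, 1)) / dx
            dif = kdiff * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx ** 2
            c = c + dt * (adv + dif - lam * c)

        print("tracer mass remaining after transport and decay:", round(float(c.sum()), 4))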

  5. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, and, more recently, in the health-related professions. The objective of the paper is to provide a detailed exploratory discussion of factor analysis (FA) in an image retrieval method and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network such as DDoS attacks and network scanning.

  6. Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maltach, E. G.

    1969-01-01

    The design of the next generation of spaceborne digital computers is described, and a possible multiprocessor computer configuration is analyzed. For the analysis, a set of representative space computing tasks was abstracted from the Lunar Module Guidance Computer programs as executed during the lunar landing in the Apollo program. At the time, this computer performed about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is a critical one for efficiency of the multiprocessor. It is recommended that research into the area of automatic job scheduling be performed.

  7. The Utility of Computer-Assisted Power Analysis Lab Instruction

    ERIC Educational Resources Information Center

    Petrocelli, John V.

    2007-01-01

    Undergraduate students (N = 47), enrolled in 2 separate psychology research methods classes, evaluated a power analysis lab demonstration and homework assignment. Students attended 1 of 2 lectures that included a basic introduction to power analysis and sample size analysis. One lecture included a demonstration of how to use a computer-based power…
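
    A minimal example of the kind of computer-based power and sample-size calculation such a demonstration covers is sketched below using the power module of the statsmodels package for a two-sample t-test; the effect sizes, alpha, and target power are illustrative choices, not values from the study.

        from statsmodels.stats.power import TTestIndPower

        # Sample size needed per group for a two-sample t-test at several
        # hypothetical effect sizes (Cohen's d), alpha = .05, power = .80.
        analysis = TTestIndPower()
        for d in (0.2, 0.5, 0.8):
            n = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80)
            print(f"d = {d}: about {int(round(n))} participants per group")

        # Achieved power for a fixed design of 30 per group and d = 0.5.
        print("power at n = 30/group, d = 0.5:",
              round(analysis.power(effect_size=0.5, nobs1=30, alpha=0.05), 3))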

  8. The Reliability of Content Analysis of Computer Conference Communication

    ERIC Educational Resources Information Center

    Rattleff, Pernille

    2007-01-01

    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  9. SCDAP/RELAP5/MOD 3.1 code manual: MATPRO, A library of materials properties for Light-Water-Reactor accident analysis. Volume 4

    SciTech Connect

    Hagrman, D.T.; Allison, C.M.; Berna, G.A.

    1995-06-01

    The SCDAP/RELAP5 code has been developed for best estimate transient simulation of light-water-reactor coolant systems during a severe accident. The code models the coupled behavior of the reactor coolant system, the core, and fission products released during a severe accident transient, as well as large- and small-break loss of coolant accidents and operational transients such as anticipated transient without SCRAM, loss of offsite power, loss of feedwater, and loss of flow. A generic modeling approach is used that permits as much of a particular system to be modeled as necessary. Control system and secondary system components are included to permit modeling of plant controls, turbines, condensers, and secondary feedwater conditioning systems. This volume, Volume IV, describes the material properties correlations and computer subroutines (MATPRO) used by SCDAP/RELAP5. The formulations of the materials properties are generally semi-empirical in nature. The materials property subroutines contained in this document are for uranium, uranium dioxide, mixed uranium-plutonium dioxide fuel, zircaloy cladding, zirconium dioxide, stainless steel, stainless steel oxide, silver-indium-cadmium alloy, cadmium, boron carbide, Inconel 718, zirconium-uranium-oxygen melts, fill gas mixtures, carbon steel, and tungsten. This document also contains descriptions of the reaction and solution rate models needed to analyze a reactor accident.

  10. Computational fluid dynamics combustion analysis evaluation

    NASA Technical Reports Server (NTRS)

    Kim, Y. M.; Shang, H. M.; Chen, C. P.; Ziebarth, J. P.

    1992-01-01

    This study involves the development of numerical modelling in spray combustion. These modelling efforts are mainly motivated by the need to improve the computational efficiency of the stochastic particle tracking method as well as to incorporate the physical submodels of turbulence, combustion, vaporization, and dense spray effects. The present mathematical formulation and numerical methodologies can be cast in any time-marching pressure correction methodology (PCM), such as the FDNS code and the MAST code. A sequence of validation cases involving steady burning sprays and transient evaporating sprays will be included.

  11. Computational analysis of scramjet dual mode operation

    NASA Technical Reports Server (NTRS)

    1985-01-01

    One critical element in the design of a Scramjet is the detailed understanding of the complex flow field in the engine during various phases of operation. One area of interest is the computation of chemically reacting flows in the vicinity of flame holders. The characteristics of a method for solving the Navier-Stokes equations with chemical reactions are proposed. Also of interest are the flame holding characteristics of simple ramp and rearward facing steps. Both of these configurations are considered candidates for Scramjet flame holders.

  12. Computer-Based Interaction Analysis with DEGREE Revisited

    ERIC Educational Resources Information Center

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  13. Computer aided analysis and optimization of mechanical system dynamics

    NASA Technical Reports Server (NTRS)

    Haug, E. J.

    1984-01-01

    The purpose is to outline a computational approach to spatial dynamics of mechanical systems that substantially enlarges the scope of consideration to include flexible bodies, feedback control, hydraulics, and related interdisciplinary effects. Design sensitivity analysis and optimization is the ultimate goal. The approach to computer generation and solution of the system dynamic equations and graphical methods for creating animations as output is outlined.

  14. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  15. Simple hobby computer-based off-gas analysis system

    SciTech Connect

    Forrest, E.H.; Jansen, N.B.; Flickinger, M.C.; Tsao, G.T.

    1981-02-01

    An Apple II computer has been adapted to monitor fermentation offgas in laboratory and pilot scale fermentors. It can calculate oxygen uptake rates, carbon dioxide evolution rates, respiratory quotient as well as initiating recalibration procedures. In this report the computer-based off-gas analysis system is described.

  16. Potential applications of computational fluid dynamics to biofluid analysis

    NASA Technical Reports Server (NTRS)

    Kwak, D.; Chang, J. L. C.; Rogers, S. E.; Rosenfeld, M.

    1988-01-01

    Computational fluid dynamics was developed to the stage where it has become an indispensable part of aerospace research and design. In view of advances made in aerospace applications, the computational approach can be used for biofluid mechanics research. Several flow simulation methods developed for aerospace problems are briefly discussed for potential applications to biofluids, especially to blood flow analysis.

  17. Calculation notes that support accident scenario and consequence development for the subsurface leak remaining subsurface accident

    SciTech Connect

    Ryan, G.W., Westinghouse Hanford

    1996-07-12

    This document supports the development and presentation of the following accident scenario in the TWRS Final Safety Analysis Report: Subsurface Leak Remaining Subsurface. The calculations needed to quantify the risk associated with this accident scenario are included within.

  18. Calculation notes that support accident scenario and consequence development for the subsurface leak remaining subsurface accident

    SciTech Connect

    Ryan, G.W., Westinghouse Hanford

    1996-09-19

    This document supports the development and presentation of the following accident scenario in the TWRS Final Safety Analysis Report: Subsurface Leak Remaining Subsurface. The calculations needed to quantify the risk associated with this accident scenario are included within.

  19. Accident analysis for transuranic waste management alternatives in the U.S. Department of Energy waste management program

    SciTech Connect

    Nabelssi, B.; Mueller, C.; Roglans-Ribas, J.; Folga, S.; Tompkins, M.; Jackson, R.

    1995-03-01

    Preliminary accident analyses and radiological source term evaluations have been conducted for transuranic waste (TRUW) as part of the US Department of Energy (DOE) effort to manage storage, treatment, and disposal of radioactive wastes at its various sites. The approach to assessing radiological releases from facility accidents was developed in support of the Office of Environmental Management Programmatic Environmental Impact Statement (EM PEIS). The methodology developed in this work is in accordance with the latest DOE guidelines, which consider the spectrum of possible accident scenarios in the implementation of various actions evaluated in an EIS. The radiological releases from potential risk-dominant accidents in storage and treatment facilities considered in the EM PEIS TRUW alternatives are described in this paper. The results show that significant releases can be predicted for only the most severe and extremely improbable accident sequences.

  20. Supplemental analysis of accident sequences and source terms for waste treatment and storage operations and related facilities for the US Department of Energy waste management programmatic environmental impact statement

    SciTech Connect

    Folga, S.; Mueller, C.; Nabelssi, B.; Kohout, E.; Mishima, J.

    1996-12-01

    This report presents supplemental information for the document Analysis of Accident Sequences and Source Terms at Waste Treatment, Storage, and Disposal Facilities for Waste Generated by US Department of Energy Waste Management Operations. Additional technical support information is supplied concerning treatment of transuranic waste by incineration and considering the Alternative Organic Treatment option for low-level mixed waste. The latest respirable airborne release fraction values published by the US Department of Energy for use in accident analysis have been used and are included as Appendix D, where respirable airborne release fraction is defined as the fraction of material exposed to accident stresses that could become airborne as a result of the accident. A set of dominant waste treatment processes and accident scenarios was selected for a screening-process analysis. A subset of results (release source terms) from this analysis is presented.

  1. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    SciTech Connect

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project.

  2. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  3. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to further both the theoretical understanding of, and the development of computer tools (algorithms) for, separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas -- the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  4. Process for computing geometric perturbations for probabilistic analysis

    SciTech Connect

    Fitch, Simeon H. K.; Riha, David S.; Thacker, Ben H.

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
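
    A minimal sketch of the idea, assuming the nominal nodal coordinates and a precomputed unit displacement field are available as NumPy arrays: a random scale factor applied to the displacement vectors generates perturbed geometries for repeated finite-element runs. The array names and the Gaussian perturbation model are illustrative assumptions, not taken from the record.

      # Sketch: perturb nominal nodal coordinates with precomputed displacement
      # vectors scaled by a random variable, producing geometry realizations for
      # repeated finite-element runs. Arrays and the Gaussian model are assumed.
      import numpy as np

      rng = np.random.default_rng(0)

      nominal_nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
      unit_displacement = np.array([[0.0, 0.0], [0.1, 0.0], [0.1, 0.05], [0.0, 0.05]])

      def perturbed_geometry(std_dev=1.0):
          """Return one random realization of the perturbed nodal coordinates."""
          scale = rng.normal(loc=0.0, scale=std_dev)   # random perturbation magnitude
          return nominal_nodes + scale * unit_displacement

      samples = [perturbed_geometry() for _ in range(1000)]   # inputs to FE runs
      print(samples[0])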

  5. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that the distributed-memory architecture is preferable to shared memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
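
    The sketch below illustrates the embarrassingly parallel character of such an analysis, assuming a simple Basquin-type fatigue law and Python's multiprocessing in place of the virtual shared-memory paradigm described above; all distributions and coefficients are made-up placeholders.

      # Sketch: embarrassingly parallel Monte Carlo fatigue-life sampling with a
      # Basquin-type S-N law. Distributions and coefficients are illustrative.
      import numpy as np
      from multiprocessing import Pool

      TARGET_LIFE = 1.0e6   # cycles

      def batch_failure_count(args):
          seed, n_samples = args
          rng = np.random.default_rng(seed)
          stress = rng.lognormal(mean=np.log(200.0), sigma=0.1, size=n_samples)  # MPa
          sigma_f = rng.normal(900.0, 50.0, size=n_samples)  # fatigue strength coefficient
          b = -0.09                                          # fatigue strength exponent
          cycles = 0.5 * (stress / sigma_f) ** (1.0 / b)     # Basquin: S = sigma_f*(2N)^b
          return int(np.sum(cycles < TARGET_LIFE))

      if __name__ == "__main__":
          batches = [(seed, 100_000) for seed in range(8)]   # one batch per worker
          with Pool(processes=8) as pool:
              failures = sum(pool.map(batch_failure_count, batches))
          total = sum(n for _, n in batches)
          print(f"P(failure before {TARGET_LIFE:.0e} cycles) ~ {failures / total:.4f}")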

  6. System Matrix Analysis for Computed Tomography Imaging

    PubMed Central

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
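
    A minimal two-dimensional sketch of Siddon-style ray tracing, the operation used to fill one row of the system matrix: for a single ray it returns the intersection length with every pixel crossed. The grid layout, coordinates, and variable names are illustrative assumptions.

      # Sketch of 2-D Siddon-style ray tracing: intersection length of one ray
      # with every pixel of an nx-by-ny grid whose lower-left corner is at the
      # origin. Geometry and names are illustrative.
      import numpy as np

      def siddon_row(src, det, nx, ny, pixel_size=1.0):
          src, det = np.asarray(src, float), np.asarray(det, float)
          direction = det - src
          ray_len = np.linalg.norm(direction)

          alphas = [0.0, 1.0]                      # parametric positions along the ray
          for axis, n in ((0, nx), (1, ny)):
              if direction[axis] != 0.0:
                  planes = np.arange(n + 1) * pixel_size
                  alphas.extend((planes - src[axis]) / direction[axis])
          alphas = np.unique(np.clip(alphas, 0.0, 1.0))

          row = {}
          for a0, a1 in zip(alphas[:-1], alphas[1:]):
              mid = src + 0.5 * (a0 + a1) * direction      # midpoint locates the pixel
              ix, iy = int(mid[0] // pixel_size), int(mid[1] // pixel_size)
              if 0 <= ix < nx and 0 <= iy < ny:
                  row[(ix, iy)] = row.get((ix, iy), 0.0) + (a1 - a0) * ray_len
          return row

      print(siddon_row(src=(-1.0, 0.5), det=(5.0, 3.5), nx=4, ny=4))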

  7. System Matrix Analysis for Computed Tomography Imaging.

    PubMed

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482

  8. RSAC-6 Radiological Safety Analysis Computer Program

    SciTech Connect

    Schrader, Bradley J; Wenzel, Douglas Rudolph

    2001-06-01

    RSAC-6 is the latest version of the RSAC program. It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from ground surface and plume gamma pathways is calculated. New and exciting updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, updated and enhanced radionuclide inventory and inclusion of the dose-conversion factors from FGR 11 and 12.
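
    The sketch below shows, in highly simplified form, the kind of chain such a code evaluates: a ground-level centerline Gaussian-plume dilution factor followed by an inhalation dose estimate. The dispersion parameters, release quantity, breathing rate, and dose conversion factor are placeholder assumptions, not RSAC-6 models or data.

      # Sketch: ground-level centerline Gaussian-plume dilution factor followed
      # by an inhalation dose estimate. All parameter values are placeholders,
      # not RSAC-6 models or data.
      import math

      def chi_over_q(sigma_y, sigma_z, wind_speed, release_height):
          """Ground-level centerline atmospheric dilution factor, s/m^3."""
          return (math.exp(-release_height**2 / (2.0 * sigma_z**2))
                  / (math.pi * wind_speed * sigma_y * sigma_z))

      activity_released = 3.7e10     # Bq of a single nuclide (hypothetical)
      sigma_y, sigma_z = 35.0, 18.0  # m, plume spread at the receptor distance
      wind_speed = 2.0               # m/s
      release_height = 10.0          # m
      breathing_rate = 3.3e-4        # m^3/s, adult light activity
      dcf_inhalation = 7.3e-9        # Sv/Bq, placeholder dose conversion factor

      time_integrated_conc = activity_released * chi_over_q(
          sigma_y, sigma_z, wind_speed, release_height)      # Bq*s/m^3
      dose_sv = time_integrated_conc * breathing_rate * dcf_inhalation
      print(f"inhalation dose ~ {dose_sv * 1e3:.3f} mSv")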

  9. Combinatorial reliability analysis of multiprocessor computers

    SciTech Connect

    Hwang, K.; Tian-Pong Chang

    1982-12-01

    The authors propose a combinatorial method to evaluate the reliability of multiprocessor computers. Multiprocessor structures are classified as crossbar switch, time-shared buses, and multiport memories. Closed-form reliability expressions are derived via combinatorial path enumeration on the probabilistic-graph representation of a multiprocessor system. The method can analyze the reliability performance of real systems like C.mmp, Tandem 16, and Univac 1100/80. User-oriented performance levels are defined for measuring the performability of degradable multiprocessor systems. For a regularly structured multiprocessor system, it is fast and easy to use this technique for evaluating system reliability with statistically independent component reliabilities. System availability can also be evaluated by this reliability study. 6 references.
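
    As a toy illustration of the combinatorial style of evaluation (not the closed-form expressions derived in the paper), the sketch below treats a degradable multiprocessor as a k-out-of-n group of processors in series with a k-out-of-n bank of memory ports, with statistically independent component reliabilities.

      # Toy combinatorial reliability evaluation: a degradable multiprocessor as
      # a k-out-of-n group of processors in series with a k-out-of-n bank of
      # memory ports, assuming independent component reliabilities.
      from math import comb

      def k_out_of_n(k, n, p):
          """Probability that at least k of n independent components work."""
          return sum(comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(k, n + 1))

      proc_rel = k_out_of_n(k=2, n=4, p=0.95)   # at least 2 of 4 processors up
      mem_rel = k_out_of_n(k=3, n=4, p=0.98)    # at least 3 of 4 memory ports up
      system_rel = proc_rel * mem_rel           # series combination of subsystems
      print(f"system reliability ~ {system_rel:.5f}")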

  10. Weather types and traffic accidents.

    PubMed

    Klaić, Z B

    2001-06-01

    Traffic accident data for the Zagreb area for the 1981-1982 period were analyzed to investigate possible relationships between the daily number of accidents and the weather conditions occurring over 5 consecutive days, starting two days before the day in question. In the statistical analysis of low-accident days, the weather type classification developed by Poje was used. For the high-accident days, detailed analyses of surface and radiosonde data were performed in order to identify possible front passages. A contingency-table test for independence confirmed that the conditional probability of a day with a small number of accidents is highest when "N" or "NW" weather types occur on the following day, and smallest for the "N1" and "Bc" types. For the remaining 4 days of the examined periods, dependence was not statistically confirmed. However, northern ("N", "NE" and "NW") and anticyclonic ("Vc", "V4", "V3", "V2" and "mv") weather types predominated during the 5-day intervals associated with low-accident days. By contrast, the weather types with cyclonic characteristics ("N1", "N2", "N3", "Bc", "Dol1" and "Dol"), which are generally accompanied by fronts, were the rarest. For 85% of the high-accident days that could not be attributed to objective circumstances (such as poor visibility or a damaged or slippery road), at least one front passage was recorded during the 3-day period starting one day before the high-accident day. PMID:11787547

  11. [Safety assessment of a Brazilian company based on analysis of work accidents by the causal tree method].

    PubMed

    Binder, M C; Pham, D; de Almeida, I M

    1998-01-01

    We present here the results of a study of 21 work-related accidents that occurred in a Brazilian manufacturing company. The aim was to assess the safety level of the company to improve its work accident prevention policy. In the last 6 months of 1992 and 1993, all accidents resulting in 15 days' absence from work, reported for social security purposes, were analyzed using the INRS causal tree method (ADC) and a questionnaire completed on site. Potential risk factors for accidents were identified based on the specific factors highlighted by the ADC. More universal trees were also compiled for the safety assessment. Three hundred and thirty specific accident factors were recorded (mean of 15.71 per accident). This is consistent with there being multiple causes of accidents rather than the assertion of Brazilian business safety departments that accidents are due to "dangerous" or "unsafe" behavior. Introducing the idea of culpability into accidents prevents the implementation of an appropriate information feedback process, essential for effective prevention. However, the large number of accidents related to "material" (78%) and "environment" (70%) indicates that working conditions are poor. This shows that the technical risks, mostly due to unsafe machinery and equipment are not being dealt with. Seventy-five potential accident factors were identified. Of these, 35% were "organizational", a high proportion for the company studied. Improvisation occurs at all levels, particularly at the organizational level. This is thus a major determinant for entire series of, if not most, accident situations. The poor condition of equipment also plays a major role in accidents. The effects of poor equipment on safety exacerbate the organizational shortcomings. The company's safety intervention policy should improve the management of human resources (rules designating particular workers for particular workstations; instructions for the safe operation of machines and equipment

  12. Computer program for the transient analysis of radioisotope thermoelectric generators.

    NASA Technical Reports Server (NTRS)

    Eggers, P. E.; Ridihalgh, J. L.

    1972-01-01

    A computer program is described which represents a comprehensive analytical tool providing the capability for predicting the output power and temperature profile of an arbitrary radioisotope thermoelectric generator (RTG) design in the presence of time-dependent operating conditions. The approach taken involves the merging of three existing computer programs - namely, an RTG weight optimization design program, a thermoelectric analysis program, and a nodal heat-transfer computer program. A total of seven transient conditions are included in the computer program as the principal transients affecting long- and short-term performance characteristics of RTGs. This computer program is unique in that it designs an optimum RTG, generates a thermal model or analog and performs heat-transfer analysis of the RTG under user-specified transient conditions.

  13. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    SciTech Connect

    Evans, J.S.; Moeller, D.W.; Cooper, D.W.

    1985-07-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.
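
    The sketch below shows the two dose-response forms mentioned in the abstract, a two-parameter Weibull function for early effects and a linear-quadratic excess-risk model for cancers, with placeholder parameter values rather than those recommended in the report.

      # Sketch of the two dose-response forms: a two-parameter Weibull function
      # for early effects and a linear-quadratic excess-risk model for cancers.
      # Parameter values are placeholders, not the report's recommendations.
      import math

      def weibull_early_effect(dose_gy, d50=3.8, shape=5.0):
          """Probability of an early effect at an acute dose (hypothetical d50, shape)."""
          return 1.0 - math.exp(-math.log(2.0) * (dose_gy / d50) ** shape)

      def linear_quadratic_excess_risk(dose_gy, alpha=5.0e-2, beta=5.0e-3):
          """Lifetime excess cancer risk, alpha*D + beta*D^2 (hypothetical coefficients)."""
          return alpha * dose_gy + beta * dose_gy**2

      for d in (0.1, 1.0, 3.0, 5.0):
          print(f"D = {d:>3} Gy  early effect = {weibull_early_effect(d):.3f}  "
                f"excess cancer risk = {linear_quadratic_excess_risk(d):.4f}")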

  14. Computational analysis of thresholds for magnetophosphenes

    NASA Astrophysics Data System (ADS)

    Laakso, Ilkka; Hirata, Akimasa

    2012-10-01

    In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way for determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m-2 (-20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (-20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of electric

  15. Interactive Spectral Analysis and Computation (ISAAC)

    NASA Technical Reports Server (NTRS)

    Lytle, D. M.

    1992-01-01

    Isaac is a task in the NSO external package for IRAF. A descendant of a FORTRAN program written to analyze data from a Fourier transform spectrometer, the current implementation has been generalized sufficiently to make it useful for general spectral analysis and other one dimensional data analysis tasks. The user interface for Isaac is implemented as an interpreted mini-language containing a powerful, programmable vector calculator. Built-in commands provide much of the functionality needed to produce accurate line lists from input spectra. These built-in functions include automated spectral line finding, least squares fitting of Voigt profiles to spectral lines including equality constraints, various filters including an optimal filter construction tool, continuum fitting, and various I/O functions.
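
    As an illustration of one of the built-in operations described above, the sketch below fits a Voigt profile (expressed through the Faddeeva function) to a synthetic spectral line by least squares; it is a generic reconstruction, not code from ISAAC itself.

      # Generic reconstruction of Voigt-profile fitting: least-squares fit of a
      # Voigt line shape (via the Faddeeva function) to a synthetic spectrum.
      import numpy as np
      from scipy.special import wofz
      from scipy.optimize import curve_fit

      def voigt(x, amplitude, center, sigma, gamma):
          """Voigt profile: Gaussian (sigma) convolved with Lorentzian (gamma)."""
          z = ((x - center) + 1j * gamma) / (sigma * np.sqrt(2.0))
          return amplitude * np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

      x = np.linspace(-5.0, 5.0, 400)
      rng = np.random.default_rng(1)
      observed = voigt(x, 10.0, 0.3, 0.6, 0.4) + rng.normal(0.0, 0.05, x.size)

      popt, _ = curve_fit(voigt, x, observed, p0=(5.0, 0.0, 1.0, 1.0))
      print("fitted amplitude, center, sigma, gamma:", np.round(popt, 3))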

  16. Adaptive computational methods for aerothermal heating analysis

    NASA Technical Reports Server (NTRS)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  17. Acoustic analysis of a computer cooling fan

    NASA Astrophysics Data System (ADS)

    Huang, Lixi; Wang, Jian

    2005-10-01

    Noise radiated by a typical computer cooling fan is investigated experimentally and analyzed within the framework of rotor-stator interaction noise using point source formulation. The fan is 9 cm in rotor casing diameter and its design speed is 3000 rpm. The main noise sources are found and quantified; they are (a) the inlet flow distortion caused by the sharp edges of the incomplete bellmouth due to the square outer framework, (b) the interaction of rotor blades with the downstream struts which hold the motor, and (c) the extra size of one strut carrying electrical wiring. Methods are devised to extract the rotor-strut interaction noise, (b) and (c), radiated by the component forces of drag and thrust at the leading and higher order spinning pressure modes, as well as the leading edge noise generated by (a). By re-installing the original fan rotor in various casings, the noises radiated by the three features of the original fan are separated, and details of the directivity are interpreted. It is found that the inlet flow distortion and the unequal set of four struts make about the same amount of noise. Their corrections show a potential of around 10-dB sound power reduction.

  18. Local spatial frequency analysis for computer vision

    NASA Technical Reports Server (NTRS)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.
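
    A minimal sketch of a combined space/frequency representation, assuming a windowed 2-D FFT over overlapping image patches as the local spectral estimator; the patch size, window, and synthetic test image are arbitrary choices for illustration.

      # Sketch of a combined space/frequency representation: a windowed 2-D FFT
      # per image patch gives the local spatial-frequency content at each point.
      # Patch size, window, and the test image are arbitrary illustrations.
      import numpy as np

      def local_spectra(image, patch=16, step=8):
          """Return |FFT| magnitudes, one (patch x patch) spectrum per sampled location."""
          window = np.outer(np.hanning(patch), np.hanning(patch))
          rows = range(0, image.shape[0] - patch + 1, step)
          cols = range(0, image.shape[1] - patch + 1, step)
          spectra = np.empty((len(rows), len(cols), patch, patch))
          for i, r in enumerate(rows):
              for j, c in enumerate(cols):
                  tile = image[r:r + patch, c:c + patch] * window
                  spectra[i, j] = np.abs(np.fft.fftshift(np.fft.fft2(tile)))
          return spectra

      # Synthetic texture whose spatial frequency increases from left to right.
      y, x = np.mgrid[0:128, 0:128]
      image = np.sin(2.0 * np.pi * (0.05 + 0.15 * x / 128.0) * x)
      print(local_spectra(image).shape)   # (rows, cols, patch, patch)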

  19. Early life history: A computer analysis

    NASA Astrophysics Data System (ADS)

    Bell, Peter M.

    Theoretical computer calculations, based in part on measurements of ‘young’ stars obtained with an orbiting telescope, may require a reexamination of some of the basic ideas about the composition of the earth's early atmosphere and the origin of life. According to Joel S. Levine, atmospheric geophysicist at the Langley Research Center, ‘the overwhelming majority of chemical evolution experiments since the first in 1952 may have been conducted with the wrong atmospheric mixture.’Astronomical measurements indicate that considerably more ultraviolet (UV) radiation may have been emitted by the young sun in comparison to that emitted by the present sun. Therefore, high levels of such radiation from the young sun, potentially harmful to life, would have been striking the earth at the very time life was being formed.Recent photochemical calculations by Levine and others at Langley state that at the time complex organic molecules (the precursors of living systems) were first formed from atmospheric gases the earth's atmosphere was not composed primarily of methane, ammonia, and hydrogen, as was previously supposed; instead, it was composed of carbon dioxide, nitrogen, and water vapor, all resulting from volcanic activity. The calculations indicate that both methane and ammonia were extremely short-lived and that such an atmosphere was photochemically unstable if it existed at all.

  20. Fire accident analysis modeling in support of non-reactor nuclear facilities at Sandia National Laboratories

    SciTech Connect

    Restrepo, L.F.

    1993-06-01

    The Department of Energy (DOE) requires that fire hazard analyses (FHAs) be conducted for all nuclear and new facilities, with results for the latter incorporated into Title I design. For those facilities requiring safety analysis documentation, the FHA shall be documented in the Safety Analysis Reports (SARs). This paper provides an overview of the methodologies and codes being used to support FHAs at Sandia facilities, with emphasis on SARs.

  1. Computational Fluid Dynamics Analysis of Very High Temperature Gas-Cooled Reactor Cavity Cooling System

    SciTech Connect

    Frisani, Angelo; Hassan, Yassin A; Ugaz, Victor M

    2010-11-02

    The design of passive heat removal systems is one of the main concerns for the modular very high temperature gas-cooled reactor (VHTR) vessel cavity. The reactor cavity cooling system (RCCS) is a key heat removal system during normal and off-normal conditions. The design and validation of the RCCS is necessary to demonstrate that VHTRs can survive the postulated accidents. The computational fluid dynamics (CFD) STAR-CCM+/V3.06.006 code was used for three-dimensional system modeling and analysis of the RCCS. A CFD model was developed to analyze heat exchange in the RCCS. The model incorporates a 180-deg section resembling the VHTR RCCS experimentally reproduced in a laboratory-scale test facility at Texas A&M University. All the key features of the experimental facility were taken into account during the numerical simulations. The objective of the present work was to benchmark CFD tools against experimental data addressing the behavior of the RCCS following accident conditions. Two cooling fluids (i.e., water and air) were considered to test the capability of maintaining the RCCS concrete walls' temperature below design limits. Different temperature profiles at the reactor pressure vessel (RPV) wall obtained from the experimental facility were used as boundary conditions in the numerical analyses to simulate VHTR transient evolution during accident scenarios. Mesh convergence was achieved with an intensive parametric study of the two different cooling configurations and selected boundary conditions. To test the effect of turbulence modeling on the RCCS heat exchange, predictions using several different turbulence models and near-wall treatments were evaluated and compared. The comparison among the different turbulence models analyzed showed satisfactory agreement for the temperature distribution inside the RCCS cavity medium and at the standpipe walls. For such a complicated geometry and flow conditions, the tested turbulence models demonstrated that the

  2. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    SciTech Connect

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data is a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe is compared with experimental results.

  3. COMPUTATIONAL FLUID DYNAMICS MODELING ANALYSIS OF COMBUSTORS

    SciTech Connect

    Mathur, M.P.; Freeman, Mark; Gera, Dinesh

    2001-11-06

    In the current fiscal year FY01, several CFD simulations were conducted to investigate the effects of moisture in biomass/coal, particle injection locations, and flow parameters on carbon burnout and NO{sub x} inside a 150 MW GEEZER industrial boiler. Various simulations were designed to predict the suitability of biomass cofiring in coal combustors and to explore the possibility of using biomass as a reburning fuel to reduce NO{sub x}. Some additional CFD simulations were also conducted on the CERF combustor to examine the combustion characteristics of pulverized coal in enriched O{sub 2}/CO{sub 2} environments. Most of the CFD models available in the literature treat particles as point masses with a uniform internal temperature. This isothermal assumption may not be suitable for larger biomass particles. To this end, a stand-alone program was developed from first principles to account for heat conduction from the surface of the particle to its center. It is envisaged that the recently developed non-isothermal stand-alone module will be integrated with the Fluent solver during the next fiscal year to accurately predict carbon burnout for larger biomass particles. Anisotropy in heat transfer will be explored using different conductivities in the radial and axial directions. The above models will be validated and tested on various full-scale industrial boilers. The current NO{sub x} modules, which are based on global chemistry, will be modified to account for local CH, CH{sub 2}, and CH{sub 3} radical chemistry. It may also be worth exploring the effect of an enriched O{sub 2}/CO{sub 2} environment on carbon burnout and NO{sub x} concentration. The research objective of this study is to develop a three-dimensional combustor model for biomass cofiring and reburning applications using the Fluent computational fluid dynamics code.

  4. Automatic analysis of computation in biochemical reactions.

    PubMed

    Egri-Nagy, Attila; Nehaniv, Chrystopher L; Rhodes, John L; Schilstra, Maria J

    2008-01-01

    We propose a modeling and analysis method for biochemical reactions based on finite state automata. This is a completely different approach compared to traditional modeling of reactions by differential equations. Our method aims to explore the algebraic structure behind chemical reactions using automatically generated coordinate systems. In this paper we briefly summarize the underlying mathematical theory (the algebraic hierarchical decomposition theory of finite state automata) and describe how such automata can be derived from the description of chemical reaction networks. We also outline techniques for the flexible manipulation of existing models. As a real-world example we use the Krebs citric acid cycle. PMID:18606208

  5. Radiation accidents.

    PubMed

    Saenger, E L

    1986-09-01

    It is essential that emergency physicians understand ways to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome. Prophylactic antibodies are desirable. For severely exposed patients treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries. The contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome. More than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity. PMID:3526994

  6. Radiation accidents

    SciTech Connect

    Saenger, E.L.

    1986-09-01

    It is essential that emergency physicians understand ways to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome. Prophylactic antibodies are desirable. For severely exposed patients treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries. The contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome. More than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity.

  7. Computational fluid dynamic analysis of liquid rocket combustion instability

    NASA Technical Reports Server (NTRS)

    Venkateswaran, Sankaran; Grenda, Jeffrey; Merkle, Charles L.

    1991-01-01

    The paper presents a computational analysis of liquid rocket combustion instability. Consideration is given to both a fully nonlinear unsteady calculation as well as a new CFD-based linearized stability analysis. An analytical solution for the linear stability problem in a constant area combustion chamber with uniform mean flow is developed to verify the numerical analyses.

  8. SITES-WATER RESOURCE SITE ANALYSIS COMPUTER PROGRAM, VERSION 2005

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The SITES Water Resource Site Analysis Computer program is used by USDA-NRCS and others for design and analysis of dams. The current program evolved from the DAMS2 program of the 1980’s with new features added for both functionality and ease of use. An Integrated Development Environment (IDE) was ...

  9. A Computer Program to Determine Reliability Using Analysis of Variance

    ERIC Educational Resources Information Center

    Burns, Edward

    1976-01-01

    A computer program, written in Fortran IV, is described which assesses reliability by using analysis of variance. It produces a complete analysis of variance table in addition to reliability coefficients for unadjusted and adjusted data as well as the intraclass correlation for m subjects and n items. (Author)

  10. Computer-Aided Communication Satellite System Analysis and Optimization.

    ERIC Educational Resources Information Center

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  11. Computational Aeroelastic Analysis of the Ares Launch Vehicle During Ascent

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.; Chwalowski, Pawel; Massey, Steven J.; Vatsa, Veer N.; Heeg, Jennifer; Wieseman, Carol D.; Mineck, Raymond E.

    2010-01-01

    This paper presents the static and dynamic computational aeroelastic (CAE) analyses of the Ares crew launch vehicle (CLV) during atmospheric ascent. The influence of launch vehicle flexibility on the static aerodynamic loading and integrated aerodynamic force and moment coefficients is discussed. The ultimate purpose of this analysis is to assess the aeroelastic stability of the launch vehicle along the ascent trajectory. A comparison of analysis results for several versions of the Ares CLV will be made. Flexible static and dynamic analyses based on rigid computational fluid dynamic (CFD) data are compared with a fully coupled aeroelastic time marching CFD analysis of the launch vehicle.

  12. Oranges and Peaches: Understanding Communication Accidents in the Reference Interview.

    ERIC Educational Resources Information Center

    Dewdney, Patricia; Michell, Gillian

    1996-01-01

    Librarians often have communication "accidents" with reference questions as initially presented. This article presents linguistic analysis of query categories, including: simple failures of hearing, accidents involving pronunciation or homophones, accidents where users repeat earlier misinterpretations to librarians, and accidents where users…

  13. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    NASA Astrophysics Data System (ADS)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs, able to reach a computing power of 300 gigaflops (300x10{9} floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage in UFS configuration plus 6 TB for the users' area. AVES was designed and built to address the growing problems raised by the analysis of the large data volume accumulated by the INTEGRAL mission (currently about 9 TB), which increases every year. The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload over the various nodes, so that AVES automatically divides the analysis into N jobs sent to N cores. This solution produces a result similar to that obtained with a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of online data storage. The AVES software package consists of about 50 specific programs. The overall computing time, compared with that of a single-processor personal computer, has thus been improved by up to a factor of 70.
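
    The sketch below illustrates the workload-splitting idea in miniature, assuming a Python driver that divides a list of science-window identifiers into N chunks and hands each chunk to a worker process; the identifiers and the per-window analysis function are placeholders, not the actual OSA or AVES tools.

      # Sketch of the workload-splitting idea: divide a list of science-window
      # identifiers into N chunks and analyze each chunk in a separate worker
      # process. The identifiers and per-window function are placeholders.
      from multiprocessing import Pool

      science_windows = [f"00{i:04d}0010" for i in range(1, 121)]   # dummy ScW IDs
      N_CORES = 4

      def analyze_science_window(scw_id):
          """Placeholder for the serial (OSA-style) analysis of one science window."""
          return f"{scw_id}: done"

      def run_chunk(chunk):
          return [analyze_science_window(s) for s in chunk]

      def split(items, n):
          """Split a list into n nearly equal chunks."""
          k, r = divmod(len(items), n)
          return [items[i * k + min(i, r):(i + 1) * k + min(i + 1, r)] for i in range(n)]

      if __name__ == "__main__":
          with Pool(processes=N_CORES) as pool:
              results = pool.map(run_chunk, split(science_windows, N_CORES))
          print(sum(len(r) for r in results), "science windows processed")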

  14. A Computational Discriminability Analysis on Twin Fingerprints

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of twins' fingerprints differs significantly from that between genuine prints of the same finger. Twins' fingerprints are discriminable, with an EER 1.5%-1.7% higher than for non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
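
    For illustration, the sketch below shows how an equal error rate is obtained from genuine and impostor match-score distributions by sweeping a decision threshold; the score samples are synthetic stand-ins, not the twin data set described above.

      # Sketch: equal error rate (EER) from genuine and impostor match-score
      # distributions by sweeping a decision threshold. Scores are synthetic.
      import numpy as np

      rng = np.random.default_rng(7)
      genuine = rng.normal(0.75, 0.10, 2000)    # same-finger match scores
      impostor = rng.normal(0.45, 0.12, 2000)   # different-finger (e.g., twin) scores

      thresholds = np.linspace(0.0, 1.0, 1001)
      far = np.array([(impostor >= t).mean() for t in thresholds])  # false accept rate
      frr = np.array([(genuine < t).mean() for t in thresholds])    # false reject rate

      idx = np.argmin(np.abs(far - frr))
      eer = 0.5 * (far[idx] + frr[idx])
      print(f"EER ~ {eer:.3%} at threshold {thresholds[idx]:.3f}")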

  15. Transportation accident response of a high-capacity truck cask for spent fuel

    SciTech Connect

    O'Connell, W.J.; Glaser, R.E.; Johnson, G.L.; Perfect, S.A.; McGuinn, E.J.; Lake, W.H.

    1995-11-01

    Two of the primary goals of this study were (i) to check the structural and thermal performance of the GA-4 cask in a broad range of accidents and (ii) to carry out a severe-accident analysis as had been addressed in the Modal Study but now using a specific recent cask design and using current-generation computer models and capabilities. At the same time, it was desired to compare the accident performance of the GA-4 cask to that of the generic truck cask analyzed in the Modal Study. The same range of impact and fire accidents developed in the Modal Study was adopted for this study. The accident-description data base of the Modal Study categorizes accidents into types of collisions with mobile or fixed objects, non-collision accidents, and fires. The mechanical modes of damage may be via crushing, impact, or puncture. The fire occurrences in the Modal Study data are based on truck accident statistics. The fire types are taken to be pool fires of petroleum products from fuel tanks and/or cargoes.

  16. Preliminary Accident Analysis for Construction and Operation of the Chornobyl New Safety Confinement

    SciTech Connect

    Batiy, Valeriy; Rubezhansky, Yruiy; Rudko, Vladimir; shcherbin, vladimir; Yegorov, V; Schmieman, Eric A.; Timmins, Douglas C.

    2005-08-08

    An analysis of the potential exposure of personnel and the public during construction and operation of the New Safe Confinement was made. Scenarios of hazardous event development were ranked. It is shown that, as a whole, construction and operation of the NSC comply with the current radiation safety norms of Ukraine.

  17. Temporal uncertainty analysis of human errors based on interrelationships among multiple factors: a case of Minuteman III missile accident.

    PubMed

    Rong, Hao; Tian, Jin; Zhao, Tingdi

    2016-01-01

    In traditional approaches to human reliability assessment (HRA), the definition of the error-producing conditions (EPCs) and the supporting guidance are such that some conditions (especially organizational or managerial ones) can hardly be included; the analysis is therefore incomplete and does not reflect the temporal trend of human reliability. A method based on system dynamics (SD), which highlights interrelationships among the technical and organizational aspects that may contribute to human errors, is presented to facilitate quantitative estimation of the human error probability (HEP) and its related variables as they change over a long period. Taking the 2008 Minuteman III missile accident as a case, the proposed HRA method is applied to assess HEP during missile operations over 50 years by analyzing the interactions among the variables involved in human-related risks; the critical factors are also determined in terms of the impact that the variables have on risks in different time periods. It is indicated that both technical and organizational aspects should be addressed to minimize human errors in the long run. PMID:26360211
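
    A minimal sketch of the system-dynamics flavor of such a model, in which two illustrative stocks (crew experience and maintenance workload) evolve through simple rate equations and feed a time-varying human error probability; the structure and every coefficient are assumptions for illustration, not the model in the paper.

      # Sketch of a system-dynamics style HEP simulation: two illustrative stocks
      # (experience, workload) evolve by Euler integration and feed a time-varying
      # human error probability. Structure and coefficients are assumptions.
      import numpy as np

      YEARS, DT = 50, 0.25
      steps = int(YEARS / DT)

      experience = np.zeros(steps)   # stock: accumulated crew experience
      workload = np.zeros(steps)     # stock: maintenance/organizational pressure
      experience[0], workload[0] = 1.0, 1.0

      for i in range(1, steps):
          d_exp = 0.10 - 0.02 * experience[i - 1]                        # learning minus turnover
          d_work = 0.03 * workload[i - 1] * (1 - workload[i - 1] / 3.0)  # aging systems
          experience[i] = experience[i - 1] + DT * d_exp
          workload[i] = workload[i - 1] + DT * d_work

      hep = 1.0e-3 * workload / experience   # illustrative coupling of factors to HEP
      print(f"HEP at year 0: {hep[0]:.2e}   HEP at year {YEARS}: {hep[-1]:.2e}")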

  18. Kinetics Parameters of VVER-1000 Core with 3 MOX Lead Test Assemblies To Be Used for Accident Analysis Codes

    SciTech Connect

    Pavlovitchev, A.M.

    2000-03-08

    The present work is part of the Joint U.S./Russian Project on Weapons-Grade Plutonium Disposition in VVER Reactors and presents neutronics calculations of the kinetics parameters of a VVER-1000 core with 3 introduced MOX LTAs. The MOX LTA design has been studied in [1] for two options: 100% plutonium and the ''island'' type. As a result, the zoning, i.e., the fissile plutonium enrichments in the different plutonium zones, has been defined. The VVER-1000 core with 3 introduced MOX LTAs of the chosen design has been calculated in [2]. In the present work, the neutronics data for transient analysis codes (RELAP [3]) have been obtained using the code chain of RRC ''Kurchatov Institute'' [5] that is used for operational neutronics calculations of VVERs. The 3D assembly-by-assembly code BIPR-7A and the 2D pin-by-pin code PERMAK-A, both with neutronics constants prepared by the cell code TVS-M, are currently the base elements of this chain. It should be noted that in [6] TVS-M was used only for the constants calculations of MOX FAs; in the current calculations TVS-M has been used for both UOX and MOX fuel constants. In addition, the volume of presented information has been increased and additional explanations have been included. The results for the reference uranium core [4] are presented in Chapter 2. The results for the core with 3 MOX LTAs are presented in Chapter 3. The conservatism that is connected with the neutronics parameters, and that must be taken into account during transient analysis calculations, is discussed in Chapter 4. The conservative parameter values are intended to be used in 1-point core kinetics models of accident analysis codes.

  19. [Possibilities and limits of computer-assisted cardiotocogram analysis].

    PubMed

    Lösche, P

    1997-01-01

    The interpretation of cardiotocograms still relies primarily on visual analysis. This form of monitoring remains labour intensive and, being dependent on the training and experience of the specialist responsible, is also subject to erroneous interpretation. Computer-aided cardiotocogram analysis has, in spite of encouraging successes, still not found wide application in everyday clinical routine. To achieve this, the programming system must be easy to operate, user-friendly and reliable. A program system for fully automatic cardiotocogram analysis is envisioned which runs on standard commercially available personal computers. A clear graphic representation of the traces also permits visual assessment on the computer screen. The system described integrates the main assessment criteria of cardiotocogram analysis, which can then be extended owing to the open system architecture used in the programming. Completely new analysis algorithms have given the evaluating system the capability of fully automatic pattern recognition of fetal heart rate signals and uterine motility. An essential requirement of computer-aided cardiotocogram analysis is thereby fulfilled. Work is now focusing on the exact classification of the various types of deceleration and an extension of the capabilities of tocogram analysis. There should be nothing to hinder integration of the system into everyday clinical routine and its connection to obstetrical databases. PMID:9381837

  20. Computer programs for analysis of geophysical data

    SciTech Connect

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the medium are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches to seismic investigation of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the medium or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record lengths for the best spatial resolution.
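
    The sketch below illustrates the passive "seismic antenna" idea in its simplest form: delay-and-sum beamforming of continuous array records toward trial source points, so that a coherent buried source appears as a peak in stacked power at the correct location. Geometry, velocity, and the synthetic records are illustrative assumptions.

      # Sketch of delay-and-sum beamforming of continuous array noise toward
      # trial source points: a coherent buried source gives a stacked-power peak
      # at the correct location. Geometry, velocity and records are synthetic.
      import numpy as np

      rng = np.random.default_rng(3)
      FS, VEL = 100.0, 2000.0                      # Hz sampling, m/s medium velocity
      sensors = np.array([[x, 0.0, 0.0] for x in range(0, 1000, 100)], float)
      source = np.array([450.0, 0.0, -800.0])      # buried microseismic source
      n = 2048

      tremor = rng.normal(size=n)                  # continuous source signal
      records = np.array([
          np.roll(tremor, int(round(np.linalg.norm(s - source) / VEL * FS)))
          + 0.5 * rng.normal(size=n)               # additive sensor noise
          for s in sensors])

      def stacked_power(trial_point):
          """Align traces on travel times from a trial point and stack."""
          stack = np.zeros(n)
          for trace, s in zip(records, sensors):
              delay = int(round(np.linalg.norm(s - trial_point) / VEL * FS))
              stack += np.roll(trace, -delay)
          return np.mean(stack**2)

      depths = np.arange(-1200.0, -400.0, 100.0)
      powers = [stacked_power(np.array([450.0, 0.0, d])) for d in depths]
      print("best-fit source depth:", depths[int(np.argmax(powers))], "m")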