Science.gov

Sample records for accident analysis techniques

  1. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    SciTech Connect

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extraction complexes belong to the energy-intensive industry. Accidents there can result in catastrophes and great social, environmental, and economic losses. According to official data, several dozen large pipeline accidents occur annually in the USA and Russia. That is why preventing accidents, analyzing the mechanisms of their development, and predicting their possible consequences are pressing and important tasks today. The causes of accidents are usually complicated and can be represented as a complex combination of natural, technical, and human factors. Mathematical and computer simulation is a safe, rather effective, and comparatively inexpensive method of accident analysis. It makes it possible to analyze different mechanisms of failure occurrence and development, to assess the consequences, and to give recommendations for prevention. Besides the investigation of failure cases, numerical simulation techniques play an important role in treating the diagnostic results of the objects and in the further construction of mathematical prognostic simulations of object behavior in the period between two inspections. In solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, the qualitative theory of differential equations, continuum mechanics, chemical macro-kinetics, and optimization are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for the solution of such tasks. Almost all of them are calibrated against calculations of the simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites.
It is worth noting that in the long years of work there has been established a fruitful and effective

  2. Hazard categorization and accident analysis techniques for compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports

    SciTech Connect

    1992-12-31

    The purpose of this DOE Standard is to establish guidance for facility managers and Program Secretarial Officers (PSOs) and thereby help them to comply consistently and more efficiently with the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports. To this end, this guidance provides the following practical information: (1) The threshold quantities of radiological material inventory below which compliance with DOE Order 5480.23 is not required. (2) The level of effort to develop the program plan and schedule required in Section 9.b. (2) of the Order, and information for making a preliminary assessment of facility hazards. (3) A uniform methodology for hazard categorization under the Order. (4) Insight into the ''graded approach'' for SAR development, especially in hazard assessment and accident analysis techniques. Individual PSOs may develop additional guidance addressing safety requirements for facilities which fall below the threshold quantities specified in this document.

  3. Mitigative techniques and analysis of generic site conditions for ground-water contamination associated with severe accidents

    SciTech Connect

    Shafer, J.M.; Oberlander, P.L.; Skaggs, R.L.

    1984-04-01

    The purpose of this study is to evaluate the feasibility of using ground-water contaminant mitigation techniques to control radionuclide migration following a severe commercial nuclear power reactor accident. The two types of severe commercial reactor accidents investigated are: (1) containment basemat penetration of core melt debris which slowly cools and leaches radionuclides to the subsurface environment, and (2) containment basemat penetration of sump water without full penetration of the core mass. Six generic hydrogeologic site classifications are developed from an evaluation of reported data pertaining to the hydrogeologic properties of all existing and proposed commercial reactor sites. One-dimensional radionuclide transport analyses are conducted on each of the individual reactor sites to determine the generic characteristics of a radionuclide discharge to an accessible environment. Ground-water contaminant mitigation techniques that may be suitable, depending on specific site and accident conditions, for severe power plant accidents are identified and evaluated. Feasible mitigative techniques and associated constraints on feasibility are determined for each of the six hydrogeologic site classifications. The first of three case studies is conducted on a site located on the Texas Gulf Coastal Plain. Mitigative strategies are evaluated for their impact on contaminant transport and results show that the techniques evaluated significantly increased ground-water travel times. 31 references, 118 figures, 62 tables.
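The one-dimensional transport analyses mentioned above reduce, at their simplest, to a travel-time estimate for a retarded solute moving with the ground water. The following is a minimal sketch of that kind of screening calculation; the function name and every parameter value are hypothetical illustrations, not figures from the study:

```python
# Hedged sketch: 1-D ground-water travel time with linear retardation,
# the usual first-order screening model for radionuclide migration.
# All parameter values below are hypothetical illustrations.

def travel_time_years(distance_m, conductivity_m_per_yr, gradient,
                      porosity, retardation):
    """Travel time of a sorbing (retarded) solute over distance_m."""
    # Seepage (pore) velocity from Darcy's law divided by porosity.
    seepage_velocity = conductivity_m_per_yr * gradient / porosity
    # Retardation slows the solute relative to the water.
    return distance_m * retardation / seepage_velocity

# Example: 500 m to the accessible environment, K = 100 m/yr,
# hydraulic gradient 0.005, porosity 0.25, retardation factor 50.
t = travel_time_years(500, 100, 0.005, 0.25, 50)
```

Under these assumed values the seepage velocity is 2 m/yr, so the retarded travel time is on the order of millennia, which is why such screening estimates can rule mitigation in or out for a given site class.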

  4. A Look at Aircraft Accident Analysis in the Early Days: Do Early 20th Century Accident Investigation Techniques Have Any Lessons for Today?

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2007-01-01

    In the early years of powered flight, the National Advisory Committee for Aeronautics in the United States produced three reports describing a method of analysis of aircraft accidents. The first report was published in 1928; the second, which was a revision of the first, was published in 1930; and the third, which was a revision and update of the second, was published in 1936. This paper describes the contents of these reports, and compares the method of analysis proposed therein to the methods used today.

  5. Quantification and uncertainty analysis of source terms for severe accidents in light water reactors (QUASAR): Part 2, Sensitivity analysis techniques

    SciTech Connect

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-10-01

    Existing methods for sensitivity analysis are described and new techniques are proposed. These techniques are evaluated with respect to their application in the QUASAR program. The merits and limitations of the various approaches are examined through a detailed application to the Suppression Pool Aerosol Removal Code (SPARC). 17 refs., 7 figs., 12 tabs.

  6. Applying STAMP in Accident Analysis

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen

    2003-01-01

    Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.

  7. Severe accident analysis using dynamic accident progression event trees

    NASA Astrophysics Data System (ADS)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) is performed in a manner that is computationally time-consuming, difficult to reproduce, and potentially phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APET developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. The work also focuses on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a
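The branch-probability bookkeeping and truncation rule described in the abstract can be illustrated with a toy sketch. The event names, probabilities, two-outcome branching model, and threshold below are all illustrative assumptions, not the ADAPT implementation:

```python
# Toy illustration of dynamic-event-tree bookkeeping: each branch carries a
# cumulative probability, and branches whose probability falls below a
# truncation threshold are pruned so the scenario count stays manageable.

TRUNCATION = 1e-2  # illustrative pruning threshold

def expand(path, prob, events):
    """Recursively branch on the remaining events, pruning unlikely paths."""
    if prob < TRUNCATION:
        return []                      # truncation rule: drop this branch
    if not events:
        return [(path, prob)]          # a complete accident scenario
    name, p_occur = events[0]
    rest = events[1:]
    # Two-outcome branching: the event either occurs or it does not.
    return (expand(path + [name], prob * p_occur, rest) +
            expand(path, prob * (1.0 - p_occur), rest))

# Hypothetical branching events with user-specified probabilities.
events = [("sg_tube_rupture", 0.05), ("hydrogen_burn", 0.3),
          ("power_recovery", 0.6)]
scenarios = expand([], 1.0, events)
total = sum(p for _, p in scenarios)   # probability retained after pruning
```

With these assumed numbers, two of the eight possible paths fall below the threshold and are pruned, and the tracked total probability (here 0.985) tells the analyst how much probability mass the truncation rule discarded.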

  8. Accident Tolerant Fuel Analysis

    SciTech Connect

    Curtis Smith; Heather Chichester; Jesse Johns; Melissa Teague; Michael Tonks; Robert Youngblood

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced “RISMC toolkit” that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional “accident-tolerant” (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and

  9. Accident tolerant fuel analysis

    SciTech Connect

    Smith, Curtis; Chichester, Heather; Johns, Jesse; Teague, Melissa; Tonks, Michael (Idaho National Laboratory); Youngblood, Robert

    2014-09-01

    Safety is central to the design, licensing, operation, and economics of Nuclear Power Plants (NPPs). Consequently, the ability to better characterize and quantify safety margin holds the key to improved decision making about light water reactor design, operation, and plant life extension. A systematic approach to characterization of safety margins and the subsequent margins management options represents a vital input to the licensee and regulatory analysis and decision making that will be involved. The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management by improving economics and reliability, and sustaining safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced ''RISMC toolkit'' that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory is performing a series of case studies that will explore methods- and tools-development issues, in addition to being of current interest in their own right. One such study is a comparative analysis of safety margins of plants using different fuel cladding types: specifically, a comparison between current-technology Zircaloy cladding and a notional ''accident-tolerant'' (e.g., SiC-based) cladding. The present report begins the process of applying capabilities that are still under development to the problem of assessing new fuel designs. The approach and lessons learned from this case study will be included in future Technical Basis Guides produced by the RISMC Pathway. These guides will be the mechanism for developing the specifications for RISMC tools and for defining how plant decision makers should propose and

  10. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee, consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee, therefore, was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  11. Techniques and Tools of NASA's Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steve J.

    2005-01-01

    The Space Shuttle Columbia accident investigation was a fusion of many disciplines into a single effort. From the recovery and reconstruction of the debris, Figure 1, to the analysis, both destructive and nondestructive, of chemical and metallurgical samples, Figure 2, a multitude of analytical techniques and tools were employed. Destructive and non-destructive testing were utilized in tandem to determine if a breach in the left wing of the Orbiter had occurred, and if so, the path of the resultant high temperature plasma flow. Nondestructive analysis included topometric scanning, laser mapping, and real-time radiography. These techniques were useful in constructing a three-dimensional virtual representation of the reconstruction project, specifically the left wing leading edge reinforced carbon/carbon heat protectant panels. Similarly, they were beneficial in determining where sampling should be performed on the debris. Analytic testing included such techniques as Energy Dispersive Electron Microprobe Analysis (EMPA), Electron Spectroscopy Chemical Analysis (ESCA), and X-Ray dot mapping; these techniques related the characteristics of intermetallics deposited on the leading edge of the left wing adjacent to the location of a suspected plasma breach during reentry. The methods and results of the various analyses, along with their implications for the accident, are discussed, along with the findings and recommendations of the Columbia Accident Investigation Board. Likewise, NASA's Return To Flight efforts are highlighted.

  12. Nuclear fuel cycle facility accident analysis handbook

    SciTech Connect

    Ayer, J E; Clark, A T; Loysen, P; Ballinger, M Y; Mishima, J; Owczarski, P C; Gregory, W S; Nichols, B D

    1988-05-01

    The Accident Analysis Handbook (AAH) covers four generic facilities: fuel manufacturing, fuel reprocessing, waste storage/solidification, and spent fuel storage; and six accident types: fire, explosion, tornado, criticality, spill, and equipment failure. These are the accident types considered to make major contributions to the radiological risk from accidents in nuclear fuel cycle facility operations. The AAH will enable the user to calculate source term releases from accident scenarios manually or by computer. A major feature of the AAH is development of accident sample problems to provide input to source term analysis methods and transport computer codes. Sample problems and illustrative examples for different accident types are included in the AAH.

  13. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.

    1983-01-01

    The aircraft accident data recorded by the National Transportation Safety Board (NTSB) for 1964-1979 were analyzed to determine what problems exist in the general aviation (GA) single pilot instrument flight rules (SPIFR) environment. A previous study conducted in 1978 for the years 1964-1975 provided a basis for comparison. This effort was generally limited to SPIFR pilot-error landing phase accidents, but includes some SPIFR takeoff and enroute accident analysis as well as some dual pilot IFR accident analysis for comparison. Analysis was performed for 554 accidents, of which 39% (216) occurred during the years 1976-1979.

  14. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: Critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable. Cluster analysis, which was an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses. Pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  15. Aircraft Loss-of-Control Accident Analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Foster, John V.

    2010-01-01

    Loss of control remains one of the largest contributors to fatal aircraft accidents worldwide. Aircraft loss-of-control accidents are complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents. To gain a better understanding into aircraft loss-of-control events and possible intervention strategies, this paper presents a detailed analysis of loss-of-control accident data (predominantly from Part 121), including worst case combinations of causal and contributing factors and their sequencing. Future potential risks are also considered.

  16. The use of flight test techniques in aircraft accident investigations

    NASA Technical Reports Server (NTRS)

    Parks, E. K.; Bach, R. E., Jr.; Wingrove, R. C.

    1986-01-01

    Wind shear is a serious safety hazard to commercial aviation. Low level wind shear (downburst) was the cause of the takeoff accident in New Orleans, July 9, 1982, and the landing accident in Dallas, Aug. 2, 1985. Shear layer instability is a common cause of clear air turbulence (CAT) at cruising altitudes. A number of encounters with severe CAT, in which passengers were injured, have recently occurred (Hannibal, MO, April 1981; Morton, WY, July 1982; etc.). Improved accident investigation techniques can lead to a better understanding of the nature of the wind environment associated with downbursts and CAT and to better detection and avoidance procedures. For the past several years, NASA-Ames has worked closely with the National Transportation Safety Board in the investigation of wind related accidents.

  17. Systemic accident analysis: examining the gap between research and practice.

    PubMed

    Underwood, Peter; Waterson, Patrick

    2013-06-01

    The systems approach is arguably the dominant concept within accident analysis research. Viewing accidents as a result of uncontrolled system interactions, it forms the theoretical basis of various systemic accident analysis (SAA) models and methods. Despite the proposed benefits of SAA, such as an improved description of accident causation, evidence within the scientific literature suggests that these techniques are not being used in practice and that a research-practice gap exists. The aim of this study was to explore the issues stemming from research and practice which could hinder the awareness, adoption and usage of SAA. To achieve this, semi-structured interviews were conducted with 42 safety experts from ten countries and a variety of industries, including rail, aviation and maritime. This study suggests that the research-practice gap should be closed and efforts to bridge the gap should focus on ensuring that systemic methods meet the needs of practitioners and improving the communication of SAA research. PMID:23542136

  18. An analysis of aircraft accidents involving fires

    NASA Technical Reports Server (NTRS)

    Lucha, G. V.; Robertson, M. A.; Schooley, F. A.

    1975-01-01

    All U. S. Air Carrier accidents between 1963 and 1974 were studied to assess the extent of total personnel and aircraft damage which occurred in accidents and in accidents involving fire. Published accident reports and NTSB investigators' factual backup files were the primary sources of data. Although it was frequently not possible to assess the relative extent of fire-caused damage versus impact damage using the available data, the study established upper and lower bounds for deaths and damage due specifically to fire. In 12 years there were 122 accidents which involved airframe fires. Eighty-seven percent of the fires occurred after impact, and fuel leakage from ruptured tanks or severed lines was the most frequently cited cause. A cost analysis was performed for 300 serious accidents, including 92 serious accidents which involved fire. Personal injury costs were outside the scope of the cost analysis, but data on personnel injury judgements as well as settlements received from the CAB are included for reference.

  19. A Technique for Showing Causal Arguments in Accident Reports

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, C. W.

    2005-01-01

    In the prototypical accident report, specific findings, particularly those related to causes and contributing factors, are usually written out explicitly and clearly. Also, the evidence upon which these findings are based is typically explained in detail. Often lacking, however, is any explicit discussion, description, or depiction of the arguments that connect the findings and the evidence. That is, the reports do not make clear why the investigators believe that the specific evidence they found necessarily leads to the particular findings they enumerated. This paper shows how graphical techniques can be used to depict relevant arguments supporting alternate positions on the causes of a complex road-traffic accident.

  20. Safety analysis of surface haulage accidents

    SciTech Connect

    Randolph, R.F.; Boldt, C.M.K.

    1996-12-31

    Research on improving haulage truck safety, started by the U.S. Bureau of Mines, is being continued by its successors. This paper reports the orientation of the renewed research efforts, beginning with an update on accident data analysis, the role of multiple causes in these accidents, and the search for practical methods for addressing the most important causes. Fatal haulage accidents most often involve loss of control or collisions caused by a variety of factors. Lost-time injuries most often involve sprains or strains to the back or multiple body areas, which can often be attributed to rough roads and the shocks of loading and unloading. Research to reduce these accidents includes improved warning systems, shock isolation for drivers, encouraging seatbelt usage, and general improvements to system and task design.

  1. Anthropotechnological analysis of industrial accidents in Brazil.

    PubMed Central

    Binder, M. C.; de Almeida, I. M.; Monteau, M.

    1999-01-01

    The Brazilian Ministry of Labour has been attempting to modify the norms used to analyse industrial accidents in the country. For this purpose, in 1994 it tried to make compulsory use of the causal tree approach to accident analysis, an approach developed in France during the 1970s, without having previously determined whether it is suitable for use under the industrial safety conditions that prevail in most Brazilian firms. In addition, opposition from Brazilian employers has blocked the proposed changes to the norms. The present study employed anthropotechnology to analyse experimental application of the causal tree method to work-related accidents in industrial firms in the region of Botucatu, São Paulo. Three work-related accidents were examined in three industrial firms representative of local, national and multinational companies. On the basis of the accidents analysed in this study, the rationale for the use of the causal tree method in Brazil can be summarized for each type of firm as follows: the method is redundant if there is a predominance of the type of risk whose elimination or neutralization requires adoption of conventional industrial safety measures (firm representative of local enterprises); the method is worthwhile if the company's specific technical risks have already largely been eliminated (firm representative of national enterprises); and the method is particularly appropriate if the firm has a good safety record and the causes of accidents are primarily related to industrial organization and management (multinational enterprise). PMID:10680249

  2. Single pilot IFR accident data analysis

    NASA Technical Reports Server (NTRS)

    Harris, D. F.; Morrisete, J. A.

    1982-01-01

    The aircraft accident data recorded and maintained by the National Transportation Safety Board for 1964 to 1979 were analyzed to determine what problems exist in the general aviation single pilot instrument flight rules environment. A previous study conducted in 1978 for the years 1964 to 1975 provided a basis for comparison. The purpose was to determine what changes, if any, have occurred in trends and cause-effect relationships reported in the earlier study. Accident counts were tied to measures of activity to produce accident rates, which in turn were analyzed in terms of change. Where anomalies or unusually high accident rates were encountered, further analysis was conducted to isolate pertinent patterns of cause factors and/or experience levels of involved pilots. The bulk of the effort addresses accidents in the landing phase of operations. A detailed analysis was performed on controlled/uncontrolled collisions and their unique attributes delineated. Estimates of day vs. night general aviation activity and accident rates were obtained.

  3. Accident analysis for US fast burst reactors

    SciTech Connect

    Paternoster, R.; Flanders, M.; Kazi, H.

    1994-09-01

    In the US fast burst reactor (FBR) community there has been increasing emphasis and scrutiny on safety analysis and understanding of possible accident scenarios. This paper summarizes recent work in these areas that is going on at the different US FBR sites. At this time, all of the FBR facilities have updated, or are in the process of updating and refining, their accident analyses. This effort is driven by two objectives: to obtain a more realistic scenario for emergency response procedures and contingency plans, and to determine compliance with changing regulatory standards.

  4. Time-dependent accident sequence analysis

    SciTech Connect

    Chu, T.L.

    1983-01-01

    One problem of the current event tree methodology is that the transitions between accident sequences are not modeled. The causes of transitions are mostly due to operator actions during an accident. A model for such transitions is presented. A generalized algorithm is used for quantification. In a more realistic accident analysis, the progression of the physical processes, which determines the time available for proper operator response, is modeled. Furthermore, the uncertainty associated with the physical modeling is considered. As an example, the approach is applied to analyze TMI-type accidents. Statistical evidence is collected and used in assessing the frequency of a stuck-open pressure-operated relief valve at B&W plants as well as the frequency of misdiagnosis. Statistical data are also used in modeling the timing of operator actions during the accident. A thermal code (CUT) is developed to determine the time at which core uncovery occurs. A response surface is used to propagate the uncertainty associated with the thermal code.

  5. Cross-analysis of hazmat road accidents using multiple databases.

    PubMed

    Trépanier, Martin; Leroux, Marie-Hélène; de Marcellis-Warin, Nathalie

    2009-11-01

    Road selection for hazardous materials transportation relies heavily on risk analysis. With risk generally expressed as the product of the probability of occurrence and the expected consequence, it is clear that risk analysis is data-intensive. However, various authors have noted the lack of statistical reliability of hazmat accident databases due to systematic underreporting of such events. Also, official accident databases alone do not always provide all the information required (economic impact, road conditions, etc.). In this paper, we attempt to integrate multiple data sources to analyze hazmat accidents in the province of Quebec, Canada. Databases on dangerous goods accidents, road accidents, and work accidents were cross-analyzed. Results show that accidents can hardly be matched and that these databases suffer from underreporting. Police records seem to have better coverage than official records maintained by hazmat authorities. Serious accidents are missing from the government's official databases (some involving deaths or major spills) even though their declaration is mandatory. PMID:19819367
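The risk measure the abstract invokes, route risk as the sum over road segments of accident probability times expected consequence, can be sketched minimally as follows; all segment figures and names are hypothetical:

```python
# Minimal sketch of segment-wise route risk:
# risk = sum over segments of (accident probability * expected consequence).
# All numbers below are hypothetical illustrations.

def route_risk(segments):
    """Expected loss per trip over a route's segments."""
    return sum(p * consequence for p, consequence in segments)

# Two hypothetical routes: (accident probability per trip, consequence in $).
urban = [(1e-6, 5_000_000), (2e-6, 1_000_000)]
rural = [(4e-6, 500_000), (3e-6, 200_000)]

# Note: the underreporting discussed in the abstract biases the probability
# estimates downward, so a computed risk of this kind is a lower bound.
```

Here the urban route carries the higher expected loss despite lower accident probabilities, which is exactly why the probability and consequence data both need to be reliable.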

  6. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  7. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    SciTech Connect

    CROWE, R.D.

    1999-09-09

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  8. Canister storage building design basis accident analysis documentation

    SciTech Connect

    KOPELIC, S.D.

    1999-02-25

    This document provides the detailed accident analysis to support HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  9. Cold Vacuum Drying (CVD) Facility Design Basis Accident Analysis Documentation

    SciTech Connect

    PIEPHO, M.G.

    1999-10-20

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report, ''Cold Vacuum Drying Facility Final Safety Analysis Report (FSAR).'' All assumptions, parameters and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR.

  10. Reactor Safety Gap Evaluation of Accident Tolerant Components and Severe Accident Analysis

    SciTech Connect

    Farmer, Mitchell T.; Bunt, R.; Corradini, M.; Ellison, Paul B.; Francis, M.; Gabor, John D.; Gauntt, R.; Henry, C.; Linthicum, R.; Luangdilok, W.; Lutz, R.; Paik, C.; Plys, M.; Rabiti, Cristian; Rempe, J.; Robb, K.; Wachowiak, R.

    2015-01-31

    The overall objective of this study was to conduct a technology gap evaluation of accident tolerant components and severe accident analysis methodologies, with the goal of identifying any data and/or knowledge gaps that may exist given the current state of light water reactor (LWR) severe accident research, augmented by insights obtained from the Fukushima accident. The ultimate benefit of this activity is that the results can be used to refine the Department of Energy's (DOE) Reactor Safety Technology (RST) research and development (R&D) program plan to address key knowledge gaps in severe accident phenomena and analyses that affect reactor safety and that are not currently being addressed by the industry or the Nuclear Regulatory Commission (NRC).

  11. [An analysis of industrial accidents in the working field with a particular emphasis on repeated accidents].

    PubMed

    Wakisaka, I; Yanagihashi, T; Tomari, T; Sato, M

    1990-03-01

    The present study is based on an analysis of routinely submitted reports of occupational accidents experienced by the workers of industrial enterprises under the jurisdiction of the Kagoshima Labor Standard Office during the 5-year period 1983 to 1987. Officially notified injuries serious enough to keep employees away from work for at least 4 days were utilized in this study. Data were classified so as to give an observed frequency distribution for workers having any specified number of accidents. Also, the accident rate, an indicator of the risk of accident, was compared among occupations, age groups, and the sexes. The results obtained are as follows: 1) For the combined total of 6,324 accident cases across 8 types of occupation (Construction, Transportation, Mining & Quarrying, Forestry, Food manufacture, Lumber & Woodcraft, Manufacturing industry and Other business), the number of workers who had at least one accident was 6,098, of which 5,837 were injured only once, 208 twice, 21 three times and 2 four times. When occupation type was fixed, however, the numbers of workers having one, two, three and four accidents were 5,895, 182, 19 and 2, respectively. This suggests that some workers are likely to have experienced repeated accidents in more than one type of occupation. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:2131982

  12. Accident progression event tree analysis for postulated severe accidents at N Reactor

    SciTech Connect

    Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M. ); Medford, G.T. )

    1990-06-01

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
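    Latin Hypercube sampling, the uncertainty-sampling scheme named here, can be sketched in a few lines; this is a generic implementation of the technique, not the study's code.

    ```python
    import numpy as np

    def latin_hypercube(n_samples, n_dims, rng):
        """Stratify [0, 1) into n_samples equal bins per dimension, draw one
        point per bin, then shuffle bin order independently in each dimension.
        Guarantees every stratum of every input is sampled exactly once."""
        u = (np.arange(n_samples)[:, None]
             + rng.random((n_samples, n_dims))) / n_samples
        for d in range(n_dims):
            rng.shuffle(u[:, d])
        return u

    rng = np.random.default_rng(42)
    samples = latin_hypercube(100, 2, rng)

    # Each dimension has exactly one sample in each of the 100 strata.
    for d in range(2):
        strata = np.floor(samples[:, d] * 100).astype(int)
        assert sorted(strata) == list(range(100))
    ```

    Compared with plain Monte Carlo, this stratification covers the tails of each uncertain input with far fewer samples, which is why it suits studies where each sample is an expensive accident-progression calculation.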

  13. TMI-2 accident: core heat-up analysis

    SciTech Connect

    Ardron, K.H.; Cain, D.G.

    1981-01-01

    This report summarizes the NSAC study of reactor core thermal conditions during the accident at Three Mile Island Unit 2. The study focuses primarily on the time period from core uncovery (approximately 113 minutes after turbine trip) through the initiation of sustained high-pressure injection (after 202 minutes). The transient analysis is based upon established sequences of events, plant data, post-accident measurements, and interpretation or indirect use of instrument responses to accident conditions.

  14. Criticality accident dosimetry by chromosomal analysis.

    PubMed

    Voisin, P; Roy, L; Hone, P A; Edwards, A A; Lloyd, D C; Stephan, G; Romm, H; Groer, P G; Brame, R

    2004-01-01

    The technique of measuring the frequency of dicentric chromosomal aberrations in blood lymphocytes was used to estimate doses in a simulated criticality accident. The simulation consisted of three exposures: approximately 5 Gy with a bare source, and 1 and 2 Gy with a lead-shielded source. Three laboratories made separate estimates of the doses. These were made by the iterative method of apportioning the observed dicentric frequencies between the gamma and neutron components, taking account of a given gamma/neutron dose ratio, and referring the separated dicentric frequencies to dose-response calibration curves. An alternative method, based on Bayesian ideas, was also employed; it was developed for interpreting dicentric frequencies in situations where the gamma/neutron ratio is uncertain. Both methods gave very similar results. One laboratory produced dose estimates close to the eventual exercise reference doses, and the other laboratories estimated slightly higher values. The main reason for the higher values was the calibration relationships used for fission neutrons. PMID:15353688
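    The iterative apportionment method described here can be sketched as follows: split the observed dicentric yield between a linear neutron curve and a linear-quadratic gamma curve, with the gamma/neutron dose ratio held fixed. The calibration coefficients below are illustrative placeholders, not the laboratories' actual curves.

    ```python
    # Illustrative calibration curves (dicentrics per cell):
    A_N = 0.85                           # neutron yield: A_N * D (linear)
    C_G, A_G, B_G = 0.001, 0.015, 0.060  # gamma yield: c + a*D + b*D^2

    def apportion_dose(y_obs, ratio_g_over_n, n_iter=50):
        """Iteratively split observed yield y_obs between neutron and gamma
        components for a known gamma/neutron dose ratio."""
        d_n = y_obs / A_N                 # start: attribute all yield to neutrons
        for _ in range(n_iter):
            d_g = ratio_g_over_n * d_n    # gamma dose fixed by the known ratio
            y_g = C_G + A_G * d_g + B_G * d_g**2
            d_n = max(y_obs - y_g, 0.0) / A_N   # remaining yield -> neutrons
        return d_n, d_g

    d_n, d_g = apportion_dose(y_obs=1.2, ratio_g_over_n=1.0)
    print(f"neutron dose {d_n:.2f} Gy, gamma dose {d_g:.2f} Gy")
    ```

    At convergence the two partial yields reproduce the observed total, which is the consistency condition the iteration enforces.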

  15. Development of Database for Accident Analysis in Indian Mines

    NASA Astrophysics Data System (ADS)

    Tripathy, Debi Prasad; Guru Raghavendra Reddy, K.

    2015-08-01

    Mining is a hazardous industry, and the high accident rates associated with underground mining are a cause of deep concern. Technological developments notwithstanding, the rates of fatal accidents and reportable incidents have not shown corresponding declines. This paper argues that adoption of appropriate safety standards by both mine management and the government may result in an appreciable reduction in accident frequency. This can be achieved by using technology to improve working conditions and by sensitising workers and managers to the causes and prevention of accidents. Inputs required for a detailed analysis of an accident include information on the location, time, type and cost of the accident, the victim, the nature of the injury, personal and environmental factors, etc. Such information can be generated from data available in the standard coded accident report form. This paper presents a web-based application for accident analysis in Indian mines during 2001-2013. An accident database prototype (SafeStat), developed by the authors as an Intranet application based on TCP/IP, is also discussed.

  16. NASA's Accident Precursor Analysis Process and the International Space Station

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Lutomski, Michael

    2010-01-01

    This viewgraph presentation reviews the implementation of Accident Precursor Analysis (APA), as well as the evaluation of In-Flight Investigations (IFI) and Problem Reporting and Corrective Action (PRACA) data for the identification of unrecognized accident potentials on the International Space Station.

  17. Thermohydraulic and Safety Analysis for CARR Under Station Blackout Accident

    SciTech Connect

    Wenxi Tian; Suizheng Qiu; Guanghui Su; Dounan Jia; Xingmin Liu - China Institute of Atomic Energy

    2006-07-01

    A thermohydraulic and safety analysis code (TSACC) has been developed in Fortran 90 to evaluate the transient thermohydraulic behavior and safety characteristics of the China Advanced Research Reactor (CARR) under a Station Blackout Accident (SBA). For the development of TSACC, a series of corresponding mathematical and physical models were considered. A point reactor neutron kinetics model was adopted for solving the reactor power. All possible flow and heat transfer conditions under a station blackout accident were considered and optional models were supplied. The usual Finite Difference Method (FDM) was abandoned and a new model was adopted to evaluate the temperature field of the plate-type fuel elements of the core. A new, simple and convenient equation was proposed for resolving the transient behavior of the main pump instead of the complicated four-quadrant model. The Gear and Adams methods were adopted alternately for a better solution of the stiff differential equations describing the dynamic behavior of the CARR. The computational results of TSACC showed a sufficient safety margin for CARR under an SBA. For the purpose of verification and validation (V and V), the simulated results of TSACC were compared with those of RELAP5/Mod3. The V and V results indicated good agreement between the two codes. Because of the adoption of modular programming techniques, this analysis code is expected to be applicable to other reactors by simply modifying the corresponding function modules. (authors)

  18. An analysis of pileup accidents in highway systems

    NASA Astrophysics Data System (ADS)

    Chang, Jau-Yang; Lai, Wun-Cing

    2016-02-01

    A pileup accident is a multi-vehicle collision produced in a lane by successive following vehicles; it is a special type of collision on highways. The probability of occurrence of a pileup accident is lower than that of other highway accidents; however, pileup accidents often lead to serious injuries and damage. In this paper, we analyze the occurrence of pileup accidents by considering three types of dangerous collisions in highway systems: rear-end collisions, lane-changing collisions, and double lane-changing collisions. We simulate four road driving strategies to investigate the relationships between the different vehicle collisions and pileup accidents. The simulation and analysis show that double lane-changing collisions increase the occurrence of pileup accidents. Additionally, we find that the probability of occurrence of pileup accidents can be reduced when vehicle speeds are suitably constrained in highway systems.
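    The abstract does not give the paper's simulation details; as a hedged illustration, a single-lane cellular-automaton update in the style of Nagel-Schreckenberg shows the kind of headway-limited braking rule such highway models rest on, and how constraining speed to the available gap keeps a simulated lane free of rear-end collisions.

    ```python
    import random

    V_MAX, P_SLOW, ROAD = 5, 0.3, 100   # cells; one cell ~ one car length

    def step(cars, rng):
        """One parallel Nagel-Schreckenberg update. cars maps cell -> speed
        on a circular road; the rule v <= gap forbids rear-end collisions."""
        ordered = sorted(cars)
        new = {}
        for i, x in enumerate(ordered):
            gap = (ordered[(i + 1) % len(ordered)] - x - 1) % ROAD
            v = min(cars[x] + 1, V_MAX)      # accelerate toward V_MAX
            v = min(v, gap)                  # brake to the available headway
            if v > 0 and rng.random() < P_SLOW:
                v -= 1                       # random slowdown (driver noise)
            new[(x + v) % ROAD] = v
        return new

    rng = random.Random(1)
    cars = {x: 0 for x in range(0, 40, 4)}   # 10 cars bunched at one end
    for _ in range(200):
        cars = step(cars, rng)
    print(len(cars), "cars; speeds", sorted(cars.values()))
    ```

    Removing or weakening the `v = min(v, gap)` rule (e.g. during a lane change into a short gap) is what turns one hard-braking event into a chain of rear-end conflicts, the mechanism behind pileups.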

  19. Analysis of Credible Accidents for Argonaut Reactors

    SciTech Connect

    Hawley, S. C.; Kathern, R. L.; Robkin, M. A.

    1981-04-01

    Five areas of potential accidents have been evaluated for the Argonaut-UTR reactors: • insertion of excess reactivity • catastrophic rearrangement of the core • explosive chemical reaction • graphite fire • fuel-handling accident. A nuclear excursion resulting from the rapid insertion of the maximum available excess reactivity would produce only 12 MWs, which is insufficient to cause fuel melting even with conservative assumptions. Although a precise structural rearrangement of the core would create a potential hazard, it is simply not credible that such a rearrangement would result from the forces of an earthquake or other catastrophic event. Even damage to the fuel from falling debris or other objects is unlikely given the normal reactor structure. An explosion from a metal-water reaction could not occur because there is no credible source of sufficient energy to initiate the reaction. A graphite fire could conceivably cause some damage to the reactor but not enough to melt any fuel or initiate a metal-water reaction. The only credible accident involving offsite doses was determined to be a fuel-handling accident which, given highly conservative assumptions, would produce a whole-body dose equivalent of 2 rem from noble gas immersion and a lifetime dose equivalent commitment to the thyroid of 43 rem from radioiodines.

  20. Speciation analysis of I-127,129 in the crop field soil contaminated by the Fukushima Dai-ichi nuclear power plant accident with newly developed chemical separation techniques

    NASA Astrophysics Data System (ADS)

    Honda, Maki; Matsuzaki, Hiroyuki; Saito, Takumi; Nagai, Hisao

    2014-05-01

    In a previous study, we investigated the depth profile of accident-derived I-129 and its downward migration speed in soils of the near field of the Fukushima Dai-ichi Nuclear Power Plant, including crop fields and man-made fields. I-129 in soil was measured by AMS and stable iodine (I-127) by ICP-MS at MALT (Micro Analysis Laboratory, Tandem accelerator), The University of Tokyo. I-129 was found to be concentrated near the surface but distributed deeper than Cs-137, and it appears to move downward more quickly than Cs-137. To investigate the adsorption mechanism and the elemental process of migration of the accident-derived I-129 in soil, it is important to know which soil components the I-129 binds to. Recent studies using X-ray absorption fine structure (XAFS), especially X-ray absorption near-edge structure (XANES), reported that stable iodine (I-127) in soil exists as an organic component. However, it had not yet been proved that the same holds for the accident-derived I-129, because it was incorporated into the soil system only recently and its abundance in soil is more than 8 orders of magnitude smaller than that of the sub-ppm-level stable iodine (I-127). In this study, a progressive sequential extraction method including dialysis and the dynamic headspace method was newly developed to isolate only the iodine bound to the soil organic component. Stable iodine can be quantified by direct analysis of the fraction, and I-129 can be quantified by AMS after carrier addition. The organic-component fractions of I-127 and I-129 can then be evaluated by comparison with the other fractions and/or with the total concentrations obtained by bulk analysis (e.g., by pyrohydrolysis).

  1. OFFSITE RADIOLOGICAL CONSEQUENCE ANALYSIS FOR THE BOUNDING FLAMMABLE GAS ACCIDENT

    SciTech Connect

    KRIPPS, L.J.

    2005-02-18

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank (SST). The calculation applies reasonably conservative input parameters in accordance with guidance in DOE-STD-3009, Appendix A. DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', requires the formal quantification of a limited subset of accidents representing a complete set of bounding conditions. The results of these analyses are then evaluated to determine whether they challenge the DOE-STD-3009-94, Appendix A, ''Evaluation Guideline,'' of 25 rem total effective dose equivalent, in order to identify and evaluate safety-class structures, systems, and components. A detonation rather than a deflagration was selected for analysis because the faster flame speed of a detonation can potentially result in a larger release of respirable material. A detonation in an SST rather than a double-shell tank (DST) was selected as the bounding accident because the estimated respirable release masses are the same and because the doses per unit quantity of waste inhaled are greater for SSTs than for DSTs. Appendix A contains a DST analysis for comparison purposes.
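    DOE-STD-3009 consequence calculations customarily build the source term from the five-factor formula ST = MAR × DR × ARF × RF × LPF and then fold in atmospheric dispersion and inhalation. The sketch below uses purely illustrative input values, not the parameters of this analysis.

    ```python
    # Five-factor source term (DOE-STD-3009 convention); all values illustrative.
    MAR = 1.0e3      # material at risk, g of waste
    DR  = 0.1        # damage ratio (fraction of MAR affected by the event)
    ARF = 1.0e-3     # airborne release fraction
    RF  = 1.0        # respirable fraction of the airborne material
    LPF = 1.0        # leak path factor (1.0 = unmitigated release)

    ST = MAR * DR * ARF * RF * LPF      # respirable release to atmosphere, g

    # Fold in dispersion and inhalation to get a receptor dose.
    CHI_Q = 3.0e-5   # atmospheric dispersion factor chi/Q, s/m^3
    BR    = 3.3e-4   # receptor breathing rate, m^3/s
    DPG   = 5.0e2    # dose per unit mass inhaled, rem/g (waste-specific)

    dose = ST * CHI_Q * BR * DPG        # total effective dose, rem
    print(f"release {ST:.3g} g -> offsite dose {dose:.2e} rem")
    ```

    The computed dose is then compared against the 25 rem Evaluation Guideline; in a real analysis each factor is justified individually, and the leak path factor is where mitigation credit enters.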

  2. Analysis of tritium mission FMEF/FAA fuel handling accidents

    SciTech Connect

    Van Keuren, J.C.

    1997-11-18

    The Fuels Material Examination Facility/Fuel Assembly Area is proposed to be used for fabrication of mixed oxide fuel to support the Fast Flux Test Facility (FFTF) tritium/medical isotope mission. The plutonium isotope mix for the new mission is different from that analyzed in the FMEF safety analysis report. A reanalysis of three representative accidents was performed for the revised plutonium mix to determine the impact on the safety analysis. Current versions of the computer codes and meteorology data files were used for the analysis. The reanalyzed accidents were a criticality, an explosion in a glovebox, and a tornado. The analysis concluded that the risk guidelines were met with the revised plutonium mix.

  3. Corporate cost of occupational accidents: an activity-based analysis.

    PubMed

    Rikhardsson, Pall M; Impgaard, Martin

    2004-03-01

    The systematic accident cost analysis (SACA) project was carried out during 2001 by The Aarhus School of Business and PricewaterhouseCoopers Denmark with financial support from The Danish National Working Environment Authority. It focused on developing and testing a method for evaluating companies' occupational accident costs, for use by occupational health and safety professionals. The method was tested in nine Danish companies within three different industry sectors, and the costs of 27 selected occupational accidents in these companies were calculated. One of the main conclusions is that the SACA method could be used in all of the companies without revisions. The evaluation of accident costs showed that two-thirds of the costs of occupational accidents are visible in the Danish corporate accounting systems reviewed, while one-third is hidden from management view. The highest cost of an occupational accident, for a company with 3,600 employees, was estimated at approximately US$682,000. The paper includes an introduction to accident cost analysis in companies, a presentation of the SACA project methodology and the SACA method itself, a short overview of some of the results of the SACA project, and a conclusion. Further information about the project is available at http://www.asb.dk/saca. PMID:14642872

  4. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  5. Accident Sequence Evaluation Program: Human reliability analysis procedure

    SciTech Connect

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics that are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples) and more detailed definitions of some of the terms. 42 refs.
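    The THERP dependence model from NUREG/CR-1278, on which the ASEP procedure builds, assigns the conditional probability of failing a second task given failure of a preceding one from the basic human error probability (HEP) N and an assessed dependence level. The equations below are transcribed from that handbook; the example basic HEP is arbitrary.

    ```python
    # THERP dependence equations (NUREG/CR-1278): conditional HEP for task B
    # given failure of task A, as a function of the basic HEP n.
    def conditional_hep(n, level):
        """n: basic human error probability; level: assessed dependence."""
        return {
            "zero":     n,                   # tasks fully independent
            "low":      (1 + 19 * n) / 20,
            "moderate": (1 + 6 * n) / 7,
            "high":     (1 + n) / 2,
            "complete": 1.0,                 # failure of A guarantees failure of B
        }[level]

    n = 0.003   # example basic HEP (arbitrary)
    for level in ("zero", "low", "moderate", "high", "complete"):
        print(f"{level:9s} dependence -> "
              f"conditional HEP {conditional_hep(n, level):.4f}")
    ```

    Note how even "low" dependence raises a 0.003 HEP above 0.05: ignoring dependence between successive operator actions is the classic way to underestimate sequence failure probability.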

  6. Accident investigation: Analysis of aircraft motions from ATC radar recordings

    NASA Technical Reports Server (NTRS)

    Wingrove, R. C.

    1976-01-01

    A technique was developed for deriving time histories of an aircraft's motion from air traffic control (ATC) radar records. This technique uses the radar range and azimuth data, along with the downlinked altitude data (from an onboard Mode-C transponder), to derive an expanded set of data which includes airspeed, lift, thrust-drag, attitude angles (pitch, roll, and heading), etc. This method of analyzing aircraft motions was evaluated through flight experiments which used the CV-990 research aircraft and recordings from both the enroute and terminal ATC radar systems. The results indicate that the values derived from the ATC radar records are for the most part in good agreement with the corresponding values obtained from airborne measurements. In an actual accident, this analysis of ATC radar records can complement the flight-data recorders, now onboard airliners, and provide a source of recorded information for other types of aircraft that are equipped with Mode-C transponders but not with onboard recorders.
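    The first processing step the abstract describes, turning radar range/azimuth returns plus Mode-C altitude into velocity, heading, and climb histories, can be sketched with synthetic scan data; the numbers below are invented for illustration.

    ```python
    import math

    # Synthetic ATC radar scans (values invented for illustration):
    scans = [
        # (time s, slant range m, azimuth deg from north, Mode-C altitude m)
        (0.0, 20_000.0, 45.0, 3000.0),
        (4.7, 20_400.0, 45.6, 3030.0),
        (9.4, 20_800.0, 46.2, 3060.0),
    ]

    def to_xyz(rng_m, az_deg, alt_m):
        """Slant range + azimuth + altitude -> Cartesian track point."""
        ground = math.sqrt(max(rng_m**2 - alt_m**2, 0.0))  # slant -> ground range
        a = math.radians(az_deg)
        return ground * math.sin(a), ground * math.cos(a), alt_m  # x east, y north

    pts = [(t, *to_xyz(r, az, h)) for t, r, az, h in scans]
    # Finite-difference successive points to recover the motion history.
    for (t0, x0, y0, z0), (t1, x1, y1, z1) in zip(pts, pts[1:]):
        dt = t1 - t0
        vx, vy, vz = (x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt
        groundspeed = math.hypot(vx, vy)
        heading = math.degrees(math.atan2(vx, vy)) % 360.0
        print(f"t={t0:5.1f}s  groundspeed {groundspeed:6.1f} m/s  "
              f"heading {heading:5.1f} deg  climb {vz:+.1f} m/s")
    ```

    The derived quantities in the actual technique (airspeed, lift, thrust-drag, attitude angles) follow from these velocity histories combined with wind and aircraft-model assumptions; radar noise also makes smoothing of the raw track a practical necessity.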

  7. Probabilistic methods for accident-progression analysis

    SciTech Connect

    Jamali, K. M.

    1981-01-01

    Probabilistic methods that can be used as the basis for deterministic calculations of transients or accidents in nuclear power plants are described. They include obtaining initiator-dependent sequences at the component level and related analyses, propagation of primary event uncertainties in the ranking of sequences, and detailed treatment of dependent failures. Results are shown for protected transients in the short-term forced-circulation phase of decay heat removal in the Clinch River Breeder Reactor. Higher values of unavailability are obtained than in previous works as a result of more detailed common-cause/common-mode failure modeling. The unavailability of decay heat removal by forced circulation for the loss of off-site power and loss of main feedwater system initiators is estimated at 4 × 10^-3/yr and 9 × 10^-3/yr, respectively. 15 refs., 1 fig., 2 tabs.

  8. Accident analysis of the windowless target system

    SciTech Connect

    Bianchi, F.; Ferri, R.

    2006-07-01

    Transmutation systems are able to reduce the radiotoxicity and amount of High-Level Wastes (HLW), which are the main concerns related to the peaceful use of nuclear energy, and should therefore make nuclear energy more acceptable to the population. A transmutation system consists of a sub-critical fast reactor, an accelerator, and a Target System, where the spallation reactions needed to sustain the chain reaction take place. Three options were proposed for the Target System within the European project PDS-XADS (Preliminary Design Studies on an Experimental Accelerator Driven System): window, windowless and solid. This paper describes the constraints taken into account in the design of the windowless Target System for the large Lead-Bismuth-Eutectic cooled XADS and presents the results of the calculations performed to assess the behaviour of the target during some accident sequences related to pump trips. (authors)

  9. MELCOR accident analysis for ARIES-ACT

    SciTech Connect

    Paul W. Humrickhouse; Brad J. Merrill

    2012-08-01

    We model a loss of flow accident (LOFA) in the ARIES-ACT1 tokamak design. ARIES-ACT1 features an advanced SiC blanket with LiPb as coolant and breeder, a helium-cooled steel structural ring and tungsten divertors, a thin-walled, helium-cooled vacuum vessel, and a room-temperature water-cooled shield outside the vacuum vessel. The water heat transfer system is designed to remove heat by natural circulation during a LOFA. The MELCOR model uses time-dependent decay heats for each component, determined by 1-D modeling. The MELCOR results show that, despite periodic boiling of the water coolant, structures are kept adequately cool by the passive safety system.

  10. FSAR fire accident analysis for a plutonium facility

    SciTech Connect

    Lam, K.

    1997-06-01

    The Final Safety Analysis Report (FSAR) for a plutonium facility, as required by DOE Orders 5480.23 and 5480.22, has recently been completed and approved. The facility processes and stores radionuclides such as Pu-238, Pu-239, enriched uranium, and, to a lesser degree, other actinides; it produces heat sources. DOE Order 5480.23 and DOE-STD-3009-94 require analysis of different types of accidents (operational accidents such as fires, explosions, spills, and criticality events, and natural phenomena such as earthquakes). The accidents that were analyzed quantitatively, the Evaluation Basis Accidents (EBAs), were selected through a multi-step screening process that draws extensively on the Hazards Analysis (HA) performed for the facility. In the HA, specific accident scenarios, with estimated frequencies and consequences, were developed for each identified hazard associated with facility operations and activities. Analysis of the EBAs and comparison of their consequences with the evaluation guidelines established the safety envelope for the facility and identified the safety-class structures, systems, and components. This paper discusses the analysis of the fire EBA. This fire accident was analyzed in relatively great detail in the FSAR because its potential off-site consequences are more severe than those of other events. In the following, a description of the scenario is given first, followed by a brief summary of the methodology for calculating the source term. Finally, the author discusses how a key parameter affecting the source term, the leak path factor, was determined; this determination is the focus of the paper.

  11. Human factors review for Severe Accident Sequence Analysis (SASA)

    SciTech Connect

    Krois, P.A.; Haas, P.M.; Manning, J.J.; Bovell, C.R.

    1984-01-01

    The paper discusses work conducted during this human factors review, including: (1) support of the Severe Accident Sequence Analysis (SASA) Program based on an assessment of operator actions, and (2) development of a descriptive model of operator severe accident management. Research by SASA analysts on the Browns Ferry Unit One (BF1) anticipated transient without scram (ATWS) was supported through a concurrent assessment of operator performance to demonstrate the contributions of human factors data and methods to SASA analyses. A descriptive model was developed, called the Function Oriented Accident Management (FOAM) model, which serves as a structure for bridging human factors, operations, and engineering expertise and is useful for identifying needs and deficiencies in the area of accident management. The assessment of human factors issues related to the ATWS required extensive coordination with SASA analysts. The analysis was consolidated primarily around six operator actions identified in the Emergency Procedure Guidelines (EPGs) as being the most critical to the accident sequence. These actions were assessed through simulator exercises, qualitative reviews, and quantitative human reliability analyses. The FOAM descriptive model assumes as a starting point that multiple operator/system failures exceed the scope of procedures and necessitate a knowledge-based emergency response by the operators. The FOAM model provides a functionally oriented structure for assembling human factors, operations, and engineering data and expertise into operator guidance for unconventional emergency responses to mitigate severe accident progression and avoid or minimize core degradation. Operators must also respond to potential radiological releases beyond plant protective barriers. Research needs in accident management and potential uses of the FOAM model are described. 11 references, 1 figure.

  12. Synthesis of quantitative and qualitative evidence for accident analysis in risk-based highway planning.

    PubMed

    Lambert, James H; Peterson, Kenneth D; Joshi, Nilesh N

    2006-09-01

    Accident analysis involves the use of both quantitative and qualitative data in decision-making. The aim of this paper is to demonstrate the synthesis of relevant quantitative and qualitative evidence for accident analysis and for planning a large and diverse portfolio of highway investment projects. The proposed analysis and visualization techniques along with traditional mathematical modeling serve as an aid to planners, engineers, and the public in comparing the benefits of current and proposed improvement projects. The analysis uses data on crash rates, average daily traffic, cost estimates from highway agency databases, and project portfolios for regions and localities. It also utilizes up to two motivations out of seven that are outlined in the Transportation Equity Act for the 21st Century (TEA-21). Three case studies demonstrate the risk-based approach to accident analysis for short- and long-range transportation plans. The approach is adaptable to other topics in accident analysis and prevention that involve the use of quantitative and qualitative evidence, risk analysis, and multi-criteria decision-making for project portfolio selection. PMID:16730627

  13. Code System for Toxic Gas Accident Analysis.

    2001-09-24

Version 00 TOXRISK is an interactive program developed to aid in the evaluation of nuclear power plant control room habitability in the event of a nearby toxic material release. The program uses a model which is consistent with the approach described in NRC Regulatory Guide 1.78. Release of the gas is treated as an initial puff followed by a continuous plume. The relative proportions of these, as well as the plume release rate, are supplied by the user. Transport of the gas is modeled as a Gaussian distribution and occurs through the action of a constant velocity, constant direction wind. Dispersion or diffusion of the gas during transport is described by modified Pasquill-Gifford dispersion coefficients. Great flexibility is afforded the user in specifying the release description, meteorological conditions, relative geometry of the accident and plant, and the plant ventilation system characteristics. Two types of simulation can be performed: multiple case (parametric) studies and probabilistic analyses.
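The transport model the abstract outlines, a Gaussian distribution carried by a constant wind with dispersion set by Pasquill-Gifford-type coefficients, can be sketched as the textbook Gaussian plume formula. This is a minimal sketch of that general model, not TOXRISK's actual code; it assumes total reflection of the plume at the ground.

```python
import math

def plume_concentration(q, u, y, z, h, sigma_y, sigma_z):
    """Gaussian plume concentration (mass per m^3) at a receptor.

    q: continuous release rate (mass/s), u: wind speed (m/s),
    y: crosswind offset (m), z: receptor height (m), h: release height (m),
    sigma_y, sigma_z: dispersion coefficients (m) evaluated at the
    receptor's downwind distance (e.g. from Pasquill-Gifford curves).
    """
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    # image-source term models total reflection of the plume at the ground
    vertical = (math.exp(-(z - h)**2 / (2.0 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2.0 * sigma_z**2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical
```

For a ground-level release observed at ground level on the plume centerline, this reduces to q / (pi * u * sigma_y * sigma_z), and concentration falls off with crosswind offset as expected.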

  14. Shipping container response to severe highway and railway accident conditions: Appendices

    SciTech Connect

    Fischer, L.E.; Chou, C.K.; Gerhard, M.A.; Kimura, C.Y.; Martin, R.W.; Mensing, R.W.; Mount, M.E.; Witte, M.C.

    1987-02-01

    Volume 2 contains the following appendices: Severe accident data; truck accident data; railroad accident data; highway survey data and bridge column properties; structural analysis; thermal analysis; probability estimation techniques; and benchmarking for computer codes used in impact analysis. (LN)

  15. Severe Accident Analysis Code SAMPSON Improvement for IMPACT Project

    NASA Astrophysics Data System (ADS)

    Ujita, Hiroshi; Ikeda, Takashi; Naitoh, Masanori

SAMPSON is the integral code for detailed severe accident analysis with a modular structure, developed in the IMPACT project. Each module can run independently, and communication among multiple analysis modules supervised by the analysis control module makes an integral analysis possible. At the end of Phase 1 (1994-1997), demonstration simulations combining up to 11 analysis modules had been performed, and the physical models in the code had been verified by separate-effect tests and validated by integral tests. Multi-dimensional mechanistic models and theoretically based conservation equations were applied during Phase 2 (1998-2000). New models for accident management evaluation have also been developed. Verification and validation have been performed by analysing separate-effect tests and integral tests, while actual plant analyses are also in progress.

  16. Final safety analysis report for the Galileo Mission: Volume 2, Book 2: Accident model document: Appendices

    SciTech Connect

    Not Available

    1988-12-15

    This section of the Accident Model Document (AMD) presents the appendices which describe the various analyses that have been conducted for use in the Galileo Final Safety Analysis Report II, Volume II. Included in these appendices are the approaches, techniques, conditions and assumptions used in the development of the analytical models plus the detailed results of the analyses. Also included in these appendices are summaries of the accidents and their associated probabilities and environment models taken from the Shuttle Data Book (NSTS-08116), plus summaries of the several segments of the recent GPHS safety test program. The information presented in these appendices is used in Section 3.0 of the AMD to develop the Failure/Abort Sequence Trees (FASTs) and to determine the fuel releases (source terms) resulting from the potential Space Shuttle/IUS accidents throughout the missions.

  17. INDUSTRIAL/MILITARY ACTIVITY-INITIATED ACCIDENT SCREENING ANALYSIS

    SciTech Connect

    D.A. Kalinich

    1999-09-27

    Impacts due to nearby installations and operations were determined in the Preliminary MGDS Hazards Analysis (CRWMS M&O 1996) to be potentially applicable to the proposed repository at Yucca Mountain. This determination was conservatively based on limited knowledge of the potential activities ongoing on or off the Nevada Test Site (NTS). It is intended that the Industrial/Military Activity-Initiated Accident Screening Analysis provided herein will meet the requirements of the ''Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants'' (NRC 1987) in establishing whether this external event can be screened from further consideration or must be included as a design basis event (DBE) in the development of accident scenarios for the Monitored Geologic Repository (MGR). This analysis only considers issues related to preclosure radiological safety. Issues important to waste isolation as related to impact from nearby installations will be covered in the MGR performance assessment.

  18. Cold Vacuum Drying facility design basis accident analysis documentation

    SciTech Connect

    CROWE, R.D.

    2000-08-08

    This document provides the detailed accident analysis to support HNF-3553, Annex B, Spent Nuclear Fuel Project Final Safety Analysis Report (FSAR), ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' All assumptions, parameters, and models used to provide the analysis of the design basis accidents are documented to support the conclusions in the FSAR. The calculations in this document address the design basis accidents (DBAs) selected for analysis in HNF-3553, ''Spent Nuclear Fuel Project Final Safety Analysis Report'', Annex B, ''Cold Vacuum Drying Facility Final Safety Analysis Report.'' The objective is to determine the quantity of radioactive particulate available for release at any point during processing at the Cold Vacuum Drying Facility (CVDF) and to use that quantity to determine the amount of radioactive material released during the DBAs. The radioactive material released is used to determine dose consequences to receptors at four locations, and the dose consequences are compared with the appropriate evaluation guidelines and release limits to ascertain the need for preventive and mitigative controls.

  19. Analysis of Crew Fatigue in AIA Guantanamo Bay Aviation Accident

    NASA Technical Reports Server (NTRS)

    Rosekind, Mark R.; Gregory, Kevin B.; Miller, Donna L.; Co, Elizabeth L.; Lebacqz, J. Victor; Statler, Irving C. (Technical Monitor)

    1994-01-01

    Flight operations can engender fatigue, which can affect flight crew performance, vigilance, and mood. The National Transportation Safety Board (NTSB) requested the NASA Fatigue Countermeasures Program to analyze crew fatigue factors in an aviation accident that occurred at Guantanamo Bay, Cuba. There are specific fatigue factors that can be considered in such investigations: cumulative sleep loss, continuous hours of wakefulness prior to the incident or accident, and the time of day at which the accident occurred. Data from the NTSB Human Performance Investigator's Factual Report, the Operations Group Chairman's Factual Report, and the Flight 808 Crew Statements were analyzed, using conservative estimates and averages to reconcile discrepancies among the sources. Analysis of these data determined the following: the entire crew displayed cumulative sleep loss, operated during an extended period of continuous wakefulness, and obtained sleep at times in opposition to the circadian disposition for sleep, and that the accident occurred in the afternoon window of physiological sleepiness. In addition to these findings, evidence that fatigue affected performance was suggested by the cockpit voice recorder (CVR) transcript as well as in the captain's testimony. Examples from the CVR showed degraded decision-making skills, fixation, and slowed responses, all of which can be affected by fatigue; also, the captain testified to feeling "lethargic and indifferent" just prior to the accident. Therefore, the sleep/wake history data supports the hypothesis that fatigue was a factor that affected crewmembers' performance. Furthermore, the examples from the CVR and the captain's testimony support the hypothesis that the fatigue had an impact on specific actions involved in the occurrence of the accident.

  20. GASFLOW analysis of a tritium leak accident

    SciTech Connect

    Farman, R.F.; Fujita, R.K.; Travis, J.R.

    1994-09-01

    The consequences of an earthquake-induced fire involving a tritium leak were analyzed using the GASFLOW computer code. Modeling features required by the analysis include ventilation boundary conditions, flow of a gas mixture in an enclosure containing obstacles, thermally induced buoyancy, and combustion phenomena.

1. Luminescence techniques for dose reconstruction in accident situations: possibilities, limitations and uncertainties

    SciTech Connect

    Haskell, E.H.

    1996-12-31

    In a nuclear accident of even moderate size, locations will inevitably be exposed which do not have adequate monitoring. In these situations nontraditional dosimeters such as brick, tiles or other environmental materials have historically provided measurements against which models of transport and exposure could be tested. Given sufficient speed and accuracy, the utility of TL techniques applied to natural materials can extend well beyond model verification to a variety of dosimetric applications.

  2. Accident analysis of heavy water cooled thorium breeder reactor

    NASA Astrophysics Data System (ADS)

    Yulianti, Yanti; Su'ud, Zaki; Takaki, Naoyuki

    2015-04-01

power reactor has a peak value before the reactor reaches a new equilibrium condition. The analysis showed that the fuel and cladding temperatures during the accident remain below their limits, i.e., within the safe range.

  3. Accident analysis of heavy water cooled thorium breeder reactor

    SciTech Connect

    Yulianti, Yanti; Su’ud, Zaki; Takaki, Naoyuki

    2015-04-16

power reactor has a peak value before the reactor reaches a new equilibrium condition. The analysis showed that the fuel and cladding temperatures during the accident remain below their limits, i.e., within the safe range.

  4. Analysis of PWR RCS Injection Strategy During Severe Accident

    SciTech Connect

    Wang, S.-J.; Chiang, K.-S.; Chiang, S.-C.

    2004-05-15

Reactor coolant system (RCS) injection is an important strategy for severe accident management of a pressurized water reactor (PWR) system. Maanshan is a typical Westinghouse PWR nuclear power plant (NPP) with a large, dry containment. The severe accident management guideline (SAMG) of Maanshan NPP was developed based on the Westinghouse Owners Group (WOG) SAMG. The purpose of this work is to analyze the RCS injection strategy of a PWR system in an overheated core condition. Power is assumed to be recovered as the vessel water level drops to the bottom of active fuel. The Modular Accident Analysis Program version 4.0.4 (MAAP4) code is chosen as the tool for analysis. A postulated station blackout sequence for Maanshan NPP is cited as a reference case for this analysis. Hot leg creep rupture occurs during the mitigation action with immediate injection after power recovery according to the WOG SAMG, which is not desired. This phenomenon was not considered while developing the WOG SAMG. Two other RCS injection methods are analyzed using MAAP4. The RCS injection strategy is modified accordingly in the Maanshan SAMG. These results can be applied to typical PWR NPPs.

  5. Calculation Notes for Subsurface Leak Resulting in Pool, TWRS FSAR Accident Analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Subsurface Leaks Resulting in Pool.

  6. Calculation notes for surface leak resulting in pool, TWRS FSAR accident analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document includes the calculations performed to quantify the risk associated with the unmitigated and mitigated accident scenarios described in the TWRS FSAR for the accident analysis titled: Surface Leaks Resulting in Pool.

  7. NASA Accident Precursor Analysis Handbook, Version 1.0

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Everett, Chris; Hall, Anthony; Insley, Scott

    2011-01-01

Catastrophic accidents are usually preceded by precursory events that, although observable, are not recognized as harbingers of a tragedy until after the fact. In the nuclear industry, the Three Mile Island accident was preceded by at least two events portending the potential for severe consequences from an underappreciated causal mechanism. Anomalies whose failure mechanisms were integral to the losses of Space Transportation Systems (STS) Challenger and Columbia had been occurring within the STS fleet prior to those accidents. Both the Rogers Commission Report and the Columbia Accident Investigation Board report found that processes in place at the time did not respond to the prior anomalies in a way that shed light on their true risk implications. This includes the concern that, in the words of the NASA Aerospace Safety Advisory Panel (ASAP), "no process addresses the need to update a hazard analysis when anomalies occur." At a broader level, the ASAP noted in 2007 that NASA "could better gauge the likelihood of losses by developing leading indicators, rather than continue to depend on lagging indicators." These observations suggest a need to revalidate prior assumptions and conclusions of existing safety (and reliability) analyses, as well as to consider the potential for previously unrecognized accident scenarios, when unexpected or otherwise undesired behaviors of the system are observed. This need is also discussed in NASA's system safety handbook, which advocates a view of safety assurance as driving a program to take steps that are necessary to establish and maintain a valid and credible argument for the safety of its missions. It is the premise of this handbook that making cases for safety more experience-based allows NASA to be better informed about the safety performance of its systems, and will ultimately help it to manage safety in a more effective manner. The APA process described in this handbook provides a systematic means of analyzing candidate

  8. Civil helicopter wire strike assessment study. Volume 2: Accident analysis briefs

    NASA Technical Reports Server (NTRS)

    Tuomela, C. H.; Brennan, M. F.

    1980-01-01

    A description and analysis of each of the 208 civil helicopter wire strike accidents reported to the National Transportation Safety Board (NTSB) for the ten year period 1970-1979 is given. The accident analysis briefs were based on pilot reports, FAA investigation reports, and such accident photographs as were made available. Briefs were grouped by year and, within year, by NTSB accident report number.

  9. Road Traffic Accident Analysis of Ajmer City Using Remote Sensing and GIS Technology

    NASA Astrophysics Data System (ADS)

    Bhalla, P.; Tripathi, S.; Palria, S.

    2014-12-01

With advancement in technology, new and sophisticated models of vehicle are available and their numbers are increasing day by day. A traffic accident has multi-facet characteristics associated with it. In India, 93% of crashes occur due to human-induced factors (wholly or partly). For proper traffic accident analysis, GIS technology has become an inevitable tool. The traditional accident database is a summary spreadsheet format using codes and mileposts to denote the location, type, and severity of accidents. A geo-referenced accident database is location-referenced: it incorporates a GIS graphical interface with the accident information to allow query searches on various accident attributes. Ajmer city, headquarters of Ajmer district, Rajasthan, has been selected as the study area. According to police records, 1531 accidents occurred during 2009-2013, with the most accidents in 2009 and the most deaths in 2013. Cars, jeeps, autos, pickups, and tempos are mostly responsible for accidents, and the occurrence of accidents is mostly concentrated between 4 PM and 10 PM. GIS has proved to be a good tool for analyzing the multifaceted nature of accidents. While road safety is a critical issue, it is still handled in an ad hoc manner. This study demonstrates the application of GIS for developing an efficient database on road accidents, taking Ajmer city as a case study. If this type of database were developed for other cities, a proper analysis of accidents could be undertaken and suitable management strategies for traffic regulation successfully proposed.

  10. Enhanced Accident Tolerant Fuels for LWRS - A Preliminary Systems Analysis

    SciTech Connect

    Gilles Youinou; R. Sonat Sen

    2013-09-01

The severe accident at the Fukushima Daiichi nuclear plants illustrates the need for continuous improvements through developing and implementing technologies that contribute to safe, reliable and cost-effective operation of the nuclear fleet. Development of enhanced accident tolerant fuel contributes to this effort. These fuels, in comparison with the standard zircaloy-UO2 system currently used by the LWR industry, should be designed such that they tolerate loss of active cooling in the core for a longer time period (depending on the LWR system and accident scenario) while maintaining or improving fuel performance during normal operations, operational transients, and design-basis events. This report presents a preliminary systems analysis of most of these concepts. The potential impacts of these innovative LWR fuels on the front end of the fuel cycle, on reactor operation, and on the back end of the fuel cycle are described succinctly, without claiming to be exhaustive. Since the design of these various concepts is still a work in progress, this analysis can only be preliminary and could be updated as the designs converge on their respective final versions.

  11. Comprehensive Analysis of Two Downburst-Related Aircraft Accidents

    NASA Technical Reports Server (NTRS)

    Shen, J.; Parks, E. K.; Bach, R. E.

    1996-01-01

    Although downbursts have been identified as the major cause of a number of aircraft takeoff and landing accidents, only the 1985 Dallas/Fort Worth (DFW) and the more recent (July 1994) Charlotte, North Carolina, landing accidents provided sufficient onboard recorded data to perform a comprehensive analysis of the downburst phenomenon. The first step in the present analysis was the determination of the downburst wind components. Once the wind components and their gradients were determined, the degrading effect of the wind environment on the airplane's performance was calculated. This wind-shear-induced aircraft performance degradation, sometimes called the F-factor, was broken down into two components F(sub 1) and F(sub 2), representing the effect of the horizontal wind gradient and the vertical wind velocity, respectively. In both the DFW and Charlotte cases, F(sub 1) was found to be the dominant causal factor of the accident. Next, the aircraft in the two cases were mathematically modeled using the longitudinal equations of motion and the appropriate aerodynamic parameters. Based on the aircraft model and the determined winds, the aircraft response to the recorded pilot inputs showed good agreement with the onboard recordings. Finally, various landing abort strategies were studied. It was concluded that the most acceptable landing abort strategy from both an analytical and pilot's standpoint was to hold constant nose-up pitch attitude while operating at maximum engine thrust.
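The performance-degradation index the abstract describes, F broken into a horizontal-wind-gradient term F(sub 1) and a vertical-wind term F(sub 2), can be sketched with the conventional wind-shear F-factor definitions (rate of change of tailwind over gravity, plus downdraft over airspeed). The function below is a hedged illustration of that standard formulation, not the authors' exact computation.

```python
def f_factor(tailwind_rate, w_vertical, airspeed, g=9.81):
    """Wind-shear performance-loss index F = F1 + F2.

    tailwind_rate: rate of change of the horizontal (tail)wind along the
                   flight path (m/s^2); an increasing tailwind degrades climb.
    w_vertical:    vertical wind (m/s, positive up); a downdraft is negative.
    airspeed:      true airspeed (m/s).
    Positive F means the wind field is draining the airplane's climb capability.
    """
    f1 = tailwind_rate / g        # horizontal wind gradient term
    f2 = -w_vertical / airspeed   # vertical wind term
    return f1 + f2
```

With illustrative downburst numbers (tailwind growing at 2 m/s^2, a 5 m/s downdraft, 75 m/s airspeed), F comes out near 0.27, a large fraction of a transport airplane's typical excess-thrust capability.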

  12. Analysis of Three Mile Island-Unit 2 accident

    SciTech Connect

    Not Available

    1980-03-01

    The Nuclear Safety Analysis Center (NSAC) of the Electric Power Research Institute has analyzed the Three Mile Island-2 accident. Early results of this analysis were a brief narrative summary, issued in mid-May 1979 and an initial version of this report issued later in 1979 as noted in the Foreword. The present report is a revised version of the 1979 report, containing summaries, a highly detailed sequence of events, a comparison of that sequence of events with those from other sources, 25 appendices, references and a list of abbreviations and acronyms. A matrix of equipment and system actions is included as a folded insert.

  13. Application of Electron Microscopy Techniques to the Investigation of Space Shuttle Columbia Accident

    NASA Technical Reports Server (NTRS)

    Shah, Sandeep

    2005-01-01

    This viewgraph presentation gives an overview of the investigation into the breakup of the Space Shuttle Columbia, and addresses the importance of a failure analysis strategy for the investigation of the Columbia accident. The main focus of the presentation is on the usefulness of electron microscopy for analyzing slag deposits from the tiles and reinforced carbon-carbon (RCC) wing panels of the Columbia orbiter.

  14. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
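The sampling machinery named in the abstract, Latin hypercube sampling followed by rank-based sensitivity measures, can be sketched in a few lines of NumPy. This is a generic illustration of the technique, not the MACCS study's configuration: the stratified sampler draws one value per equal-probability stratum in each dimension, and the rank correlation flags which inputs drive the output.

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, rng):
    """Uniform [0,1) Latin hypercube: one sample per stratum per dimension."""
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])  # decouple strata across dimensions
    return u

def rank_correlations(x, y):
    """Spearman-style rank correlation of each input column with the output."""
    def ranks(a):
        r = np.empty_like(a)
        r[np.argsort(a)] = np.arange(len(a))
        return r
    ry = ranks(y)
    return np.array([np.corrcoef(ranks(x[:, j]), ry)[0, 1]
                     for j in range(x.shape[1])])
```

In practice the uniform samples would be pushed through each variable's inverse CDF, and stepwise regression on the ranks would follow; the correlation screen above is the first step of that pipeline.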

  15. Traffic accident analysis using GIS: a case study of Kyrenia City

    NASA Astrophysics Data System (ADS)

    Kara, Can; Akçit, Nuhcan

    2015-06-01

Traffic accidents cause major deaths in urban environments, so analyzing the locations of traffic accidents and their causes is crucial. In this study, patterns of accidents and their hotspot distribution are analyzed using geographic information technology. Locations of the traffic accidents in the years 2011, 2012 and 2013 are combined to generate the kernel density distribution map of Kyrenia City. This analysis aims to find highly dense intersections and segments within the city. Additionally, the spatial autocorrelation methods local Moran's I and Getis-Ord Gi are employed. The results are discussed in detail for further analysis. Finally, changes to numerous intersections are suggested to decrease the potential risks of highly dense accident locations.
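The kernel density surface behind a hotspot map like the one described can be sketched with a plain Gaussian kernel evaluated over a grid. The coordinates and bandwidth below are illustrative; real work would use a GIS package, but the underlying computation is this:

```python
import numpy as np

def kernel_density(points, grid_x, grid_y, bandwidth):
    """2-D Gaussian kernel density surface over a grid of cell centres.

    points: iterable of (x, y) accident locations (projected coordinates).
    Returns an array of shape (len(grid_y), len(grid_x)).
    """
    gx, gy = np.meshgrid(grid_x, grid_y)
    density = np.zeros_like(gx, dtype=float)
    inv2h2 = 1.0 / (2.0 * bandwidth**2)
    for px, py in points:
        # each accident contributes a Gaussian bump centred on its location
        density += np.exp(-((gx - px)**2 + (gy - py)**2) * inv2h2)
    return density / (2.0 * np.pi * bandwidth**2 * len(points))
```

The grid cell with the maximum density is the leading hotspot candidate; spatial autocorrelation statistics such as local Moran's I are then computed on aggregated counts to test whether clusters are significant.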

  16. Predicting System Accidents with Model Analysis During Hybrid Simulation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land D.; Throop, David R.

    2002-01-01

Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.

  17. DATA ANALYSIS TECHNIQUES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Food scientists use standards and calibrations to relate the concentration of a compound of interest to the instrumental response. The techniques used include classical, single point, and inverse calibrations, as well as standard addition and internal standards. Several fundamental criteria -- sel...

  18. Extension of ship accident analysis to multiple-package shipments

    SciTech Connect

    Mills, G.S.; Neuhauser, K.S.

    1997-11-01

Severe ship accidents and the probability of radioactive material release from spent reactor fuel casks were investigated previously. Other forms of RAM, e.g., plutonium oxide powder, may be shipped in large numbers of packagings rather than in one to a few casks. These smaller, more numerous packagings are typically placed in ISO containers for ease of handling, and several ISO containers may be placed in one of several holds of a cargo ship. In such cases, the size of a radioactive release resulting from a severe collision with another ship is determined not by the likelihood of compromising a single, robust package but by the probability that a certain fraction of 10s or 100s of individual packagings is compromised. The previous analysis involved a statistical estimation of the frequency of accidents which would result in damage to a cask located in one of seven cargo holds in a collision with another ship. The results were obtained in the form of probabilities (frequencies) of accidents of increasing severity and of release fractions for each level of severity. This paper describes an extension of the same general method in which the multiple packages are assumed to be compacted by an intruding ship's bow until there is no free space in the hold. At such a point, the remaining energy of the colliding ship is assumed to be dissipated by progressively crushing the RAM packagings and the probability of a particular fraction of package failures is estimated by adaptation of the statistical method used previously. The parameters of a common, well characterized packaging, the 6M with 2R inner containment vessel, were employed as an illustrative example of this analysis method. However, the method is readily applicable to other packagings for which crush strengths have been measured or can be estimated with satisfactory confidence.
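A toy version of the crush-and-failure reasoning above can be written down under loudly labeled assumptions (independent package failures; collision energy shared uniformly across packagings), neither of which is claimed by the paper. The fraction crushed follows the energy left after the hold's free space is consumed, and the chance that at least k of n packagings fail is a binomial tail.

```python
import math

def prob_at_least_k_failures(n, k, p):
    """P(at least k of n packagings fail), assuming independent failures
    with a common per-package failure probability p (a simplification)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

def crushed_fraction(impact_energy, free_space_energy,
                     crush_energy_per_package, n_packages):
    """Fraction of packagings crushed once the hold's free space is consumed.

    Assumes the residual collision energy is dissipated uniformly at a fixed
    crush energy per packaging (illustrative, not the paper's model).
    """
    residual = max(0.0, impact_energy - free_space_energy)
    return min(1.0, residual / (crush_energy_per_package * n_packages))
```

All quantities (energies, per-package crush strength) are placeholders; the paper's statistical method estimates the failed fraction from measured crush strengths instead.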

  19. Speech analysis as an index of alcohol intoxication--the Exxon Valdez accident.

    PubMed

    Brenner, M; Cash, J R

    1991-09-01

As part of its investigation of the EXXON VALDEZ tankship accident and oil spill, the National Transportation Safety Board (NTSB) examined the master's speech for alcohol-related effects. Recorded speech samples were obtained from marine radio communications tapes. The samples were tested for four effects associated with alcohol consumption in the available scientific literature: slowed speech, speech errors, misarticulation of difficult sounds ("slurring"), and audible changes in speech quality. It was found that speech immediately before and after the accident displayed large changes of the sort associated with alcohol consumption. These changes were not readily explained by fatigue, psychological stress, drug effects, or medical problems. Speech analysis appears to be a useful technique to provide secondary evidence of alcohol impairment. PMID:1930083
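The first measure listed, slowed speech, reduces to simple arithmetic over time-stamped transcript segments; the segment data below is made up for illustration and has nothing to do with the actual tapes.

```python
def words_per_second(segments):
    """Speech rate over a set of (start_s, end_s, text) transcript segments."""
    total_words = sum(len(text.split()) for _, _, text in segments)
    total_time = sum(end - start for start, end, _ in segments)
    return total_words / total_time

def percent_slowing(baseline_rate, sample_rate):
    """Slowing of a speech sample relative to a baseline recording, in percent."""
    return 100.0 * (baseline_rate - sample_rate) / baseline_rate
```

Comparing a sample's rate against a baseline recording of the same speaker, as the investigation did with pre- and post-accident transmissions, turns the raw rate into a relative measure robust to individual speaking style.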

  20. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.
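The partial correlation measure named in the abstract isolates one input's influence by regressing the linear effect of the other sampled inputs out of both that input and the consequence, then correlating the residuals. A generic NumPy sketch of that idea, not the MACCS implementation:

```python
import numpy as np

def partial_correlation(x, y, j):
    """Correlation between x[:, j] and y after removing the linear
    effects of the other input columns from both."""
    others = np.delete(x, j, axis=1)
    a = np.column_stack([others, np.ones(len(y))])  # regressors + intercept
    res_xj = x[:, j] - a @ np.linalg.lstsq(a, x[:, j], rcond=None)[0]
    res_y = y - a @ np.linalg.lstsq(a, y, rcond=None)[0]
    return np.corrcoef(res_xj, res_y)[0, 1]
```

For inputs that truly enter the consequence model, the partial correlation stays high even when another input dominates the raw variance, which is exactly why the technique is paired with Latin hypercube sampling in studies like this one.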

  1. A general approach to critical infrastructure accident consequences analysis

    NASA Astrophysics Data System (ADS)

    Bogalecka, Magda; Kołowrocki, Krzysztof; Soszyńska-Budny, Joanna

    2016-06-01

A probabilistic general model of critical infrastructure accident consequences is presented, comprising the process of initiating events generated by an accident, the process of environment threats, and the process of environment degradation.

  2. PERSPECTIVES ON A DOE CONSEQUENCE INPUTS FOR ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

, K; Jonathan Lowrie; David Thoman; Austin Keller

    2008-07-30

    Department of Energy (DOE) accident analysis for establishing the required control sets for nuclear facility safety applies a series of simplifying, reasonably conservative assumptions regarding inputs and methodologies for quantifying dose consequences. Most of the analytical practices are conservative, have a technical basis, and are based on regulatory precedent. However, others are judgmental and based on older understanding of phenomenology. The latter type of practices can be found in modeling hypothetical releases into the atmosphere and the subsequent exposure. Often the judgments applied are not based on current technical understanding but on work that has been superseded. The objective of this paper is to review the technical basis for the major inputs and assumptions in the quantification of consequence estimates supporting DOE accident analysis, and to identify those that could be reassessed in light of current understanding of atmospheric dispersion and radiological exposure. Inputs and assumptions of interest include: Meteorological data basis; Breathing rate; and Inhalation dose conversion factor. A simple dose calculation is provided to show the relative difference achieved by improving the technical bases.
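The consequence quantification the paper reviews chains exactly the inputs it lists: an atmospheric dilution factor from the meteorological data, a breathing rate, and an inhalation dose conversion factor. A minimal sketch of that chain; the numbers in the usage note are placeholders, not DOE-sanctioned values.

```python
def inhalation_dose_rem(source_term_ci, chi_over_q, breathing_rate,
                        dcf_rem_per_ci):
    """Committed dose to a receptor from an inhaled airborne release.

    source_term_ci: activity released to the atmosphere (Ci)
    chi_over_q:     atmospheric dilution factor at the receptor (s/m^3)
    breathing_rate: receptor breathing rate (m^3/s)
    dcf_rem_per_ci: inhalation dose conversion factor (rem per Ci inhaled)
    """
    inhaled_ci = source_term_ci * chi_over_q * breathing_rate
    return inhaled_ci * dcf_rem_per_ci
```

Because the result is a straight product, a change in any one input (say, a less conservative breathing rate or an updated dose conversion factor) scales the estimated dose proportionally, which is why the paper's relative-difference comparison is a simple ratio of input values.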

  3. Offsite radiological consequence analysis for the bounding aircraft crash accident

    SciTech Connect

    OBERG, B.D.

    2003-03-22

    The purpose of this calculation note is to quantitatively analyze a bounding aircraft crash accident for comparison to the DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'', Appendix A, Evaluation Guideline of 25 rem. The potential of an aircraft impacting a facility was evaluated using the approach given in DOE-STD-3014-96, ''Accident Analysis for Aircraft Crash into Hazardous Facilities''. The following aircraft crash frequencies were determined for the Tank Farms in RPP-11736, ''Assessment Of Aircraft Crash Frequency For The Hanford Site 200 Area Tank Farms'': (1) The total aircraft crash frequency is ''extremely unlikely.'' (2) The general aviation crash frequency is ''extremely unlikely.'' (3) The helicopter crash frequency is ''beyond extremely unlikely.'' (4) For the Hanford Site 200 Areas, the crash frequency of any other aircraft type, commercial or military, into each aboveground facility or any underground facility is ''beyond extremely unlikely.'' Because the potential of an aircraft crash into the 200 Area tank farms is more frequent than ''beyond extremely unlikely,'' consequence analysis of the aircraft crash is required.
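
The likelihood categories quoted above follow the standard DOE annual-frequency binning. A minimal classifier is shown below; the cutoff values are the commonly cited ones and should be verified against the governing standard before use.

```python
def doe_likelihood(freq_per_year):
    """Map an annual event frequency onto the DOE likelihood bins (assumed cutoffs)."""
    if freq_per_year >= 1e-2:
        return "anticipated"
    if freq_per_year >= 1e-4:
        return "unlikely"
    if freq_per_year >= 1e-6:
        return "extremely unlikely"
    return "beyond extremely unlikely"

print(doe_likelihood(5e-5))  # extremely unlikely
print(doe_likelihood(1e-7))  # beyond extremely unlikely
```

Under this scheme, any crash frequency above 1e-6 per year is frequent enough that consequence analysis is required, which is the conclusion the calculation note reaches.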

  4. Statistical analysis of sudden chemical leakage accidents reported in China between 2006 and 2011.

    PubMed

    Li, Yang; Ping, Hua; Ma, Zhi-Hong; Pan, Li-Gang

    2014-04-01

    According to data from authoritative sources, 1,400 sudden leakage accidents that occurred in China between 2006 and 2011 were investigated; of these, 666 accidents with no or little damage were used to abstract statistical characteristics. The research results were as follows: (1) Time fluctuation: the yearly number of sudden leakage accidents decreased from 2006 to 2010, with a slight increase in 2011. Sudden leakage accidents occur mainly in summer, and more than half of the accidents occur from May to September. (2) Regional distribution: the accidents are highly concentrated in the coastal area, where accidents arise more often from small and medium-sized enterprises than from larger ones. (3) Pollutants: hazardous chemicals are involved in up to 95% of sudden leakage accidents. (4) Steps: transportation accounts for almost half of the accidents, followed by production, usage, storage, and discard. (5) Pollution and casualties: these accidents readily cause environmental pollution and casualties. (6) Causes: more than half of the cases were caused by human factors, followed by management failures and equipment failure. However, sudden chemical leakage may also be caused by high temperature, rain, wet roads, and terrain. (7) The results of principal component analysis: five factors were extracted by the principal component analysis, including pollution, casualties, regional distribution, steps, and month. From this analysis of the accidents, their characteristics, causes, and damages can be investigated, and advice for prevention and rescue can be derived. PMID:24407779
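
The principal component extraction in item (7) amounts to finding dominant eigenvectors of the accident attributes' covariance matrix. A bare-bones sketch via power iteration is shown below; the data are invented, and a real analysis would use a linear-algebra library.

```python
import math

def first_principal_component(rows, iters=200):
    """Dominant eigenvector of the sample covariance matrix via power iteration."""
    n, k = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(k)]
    centered = [[r[j] - means[j] for j in range(k)] for r in rows]
    cov = [[sum(c[i] * c[j] for c in centered) / (n - 1)
            for j in range(k)] for i in range(k)]
    v = [1.0] * k
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(k)) for i in range(k)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Invented data in which the first two attributes vary together.
data = [[1, 2, 0.1], [2, 4, -0.1], [3, 6, 0.05], [4, 8, 0.0], [5, 10, -0.05]]
pc1 = first_principal_component(data)
print(pc1)  # loads almost entirely on the first two columns
```

Attributes that load together on one component, as pollution and casualties do in the study, vary together across accidents and can be summarized by a single factor.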

  5. Analysis of surface powered haulage accidents, January 1990--July 1996

    SciTech Connect

    Fesak, G.M.; Breland, R.M.; Spadaro, J.

    1996-12-31

    This report addresses surface haulage accidents that occurred between January 1990 and July 1996 involving haulage trucks (including over-the-road trucks), front-end-loaders, scrapers, utility trucks, water trucks, and other mobile haulage equipment. The study includes quarries, open pits and surface coal mines utilizing self-propelled mobile equipment to transport personnel, supplies, rock, overburden material, ore, mine waste, or coal for processing. A total of 4,397 accidents were considered. This report summarizes the major factors that led to the accidents and recommends accident prevention methods to reduce the frequency of these accidents.

  6. Decontamination analysis of the NUWAX-83 accident site using DECON

    SciTech Connect

    Tawil, J.J.

    1983-11-01

    This report presents an analysis of the site restoration options for the NUWAX-83 site, at which an exercise was conducted involving a simulated nuclear weapons accident. This analysis was performed using a computer program developed by Pacific Northwest Laboratory. The computer program, called DECON, was designed to assist personnel engaged in the planning of decontamination activities. The many features of DECON that are used in this report demonstrate its potential usefulness as a site restoration planning tool. Strategies that are analyzed with DECON include: (1) employing a Quick-Vac option, under which selected surfaces are vacuumed before they can be rained on; (2) protecting surfaces against precipitation; (3) prohibiting specific operations on selected surfaces; (4) requiring specific methods to be used on selected surfaces; (5) evaluating the trade-off between cleanup standards and decontamination costs; and (6) varying the cleanup standards according to expected exposure to each surface.

  7. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    NASA Technical Reports Server (NTRS)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  8. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  9. Summary of the SRS Severe Accident Analysis Program, 1987--1992

    SciTech Connect

    Long, T.A.; Hyder, M.L.; Britt, T.E.; Allison, D.K.; Chow, S.; Graves, R.D.; DeWald, A.B. Jr.; Monson, P.R. Jr.; Wooten, L.A.

    1992-11-01

    The Severe Accident Analysis Program (SAAP) is a program of experimental and analytical studies aimed at characterizing severe accidents that might occur in the Savannah River Site Production Reactors. The goals of the Severe Accident Analysis Program are: To develop an understanding of severe accidents in SRS reactors that is adequate to support safety documentation for these reactors, including the Safety Analysis Report (SAR), the Probabilistic Risk Assessment (PRA), and other studies evaluating the safety of reactor operation; To provide tools and bases for the evaluation of existing or proposed safety related equipment in the SRS reactors; To provide bases for the development of accident management procedures for the SRS reactors; To develop and maintain on the site a sufficient body of knowledge, including documents, computer codes, and cognizant engineers and scientists, that can be used to authoritatively resolve questions or issues related to reactor accidents. The Severe Accident Analysis Program was instituted in 1987 and has already produced a substantial amount of information, and specialized calculational tools. Products of the Severe Accident Analysis Program (listed in Section 9 of this report) have been used in the development of the Safety Analysis Report (SAR) and the Probabilistic Risk Assessment (PRA), and in the development of technical specifications for the SRS reactors. A staff of about seven people is currently involved directly in the program and in providing input on severe accidents to other SRS activities.

  10. Systemic analysis of so-called 'accidents on the level' in a multi trade company.

    PubMed

    Leclercq, S; Thouy, S

    2004-10-10

    Slips, trips and falls on the level are considered commonplace and are rarely subjected to in-depth analysis. They occur in highly varied circumstances in an occupational situation. In-depth analysis of these accidents was conducted within a company with the aim of understanding them better, to be able to discuss prevention field possibilities and priorities for the company concerned. Firstly, available data on 'accidents on the level' occurring over the last 4 years were analysed and a typology for these accidents was derived, based on individual activity at the time of the accident and accident location. The three most serious accident-causing situations were analysed in-depth from interviews with injured persons, as well as from activity observation and activity-related verbal information obtained from operatives. These most serious situations involved accidents occurring when climbing down from trucks or when walking either in surroundings outside company premises or from (to) a vehicle to (from) a work location. In-depth accident analysis and characterization of accident-causing situations as a whole enhance our understanding of the accident process and allow us to envisage priorities for action in the prevention field, in operational terms. Each accident-causing situation reveals environmental factors that in fact constitute accident factors (obstacle, stone, etc.), when the individual walks or climbs down from a truck. Analysis shows that other events are necessary for accident occurrence. For example, the individual may be subjected to a time constraint or may be preoccupied. Results obtained here, in a company integrating different trades, are discussed and compared with those referred to in the literature. Generalization of some of these results is also considered. PMID:15370848

  11. An Analysis of U.S. Civil Rotorcraft Accidents by Cost and Injury (1990-1996)

    NASA Technical Reports Server (NTRS)

    Iseler, Laura; DeMaio, Joe; Rutkowski, Michael (Technical Monitor)

    2002-01-01

    A study of rotorcraft accidents was conducted to identify safety issues and research areas that might lead to a reduction in rotorcraft accidents and fatalities. The primary source of data was summaries of National Transportation Safety Board (NTSB) accident reports. From 1990 to 1996, the NTSB documented 1396 civil rotorcraft accidents in the United States in which 491 people were killed. The rotorcraft data were compared to airline and general aviation data to determine the relative safety of rotorcraft compared to other segments of the aviation industry. In-depth analysis of the rotorcraft data addressed demographics, mission, and operational factors. Rotorcraft were found to have an accident rate about ten times that of commercial airliners and about the same as that of general aviation. The likelihood that an accident would be fatal was about equal for all three classes of operation. The most dramatic division in rotorcraft accidents is between flights flown by private pilots versus professional pilots. Private pilots, flying low-cost aircraft in benign environments, have accidents that are due, in large part, to their own errors. Professional pilots, in contrast, are more likely to have accidents that are a result of exacting missions or use of specialized equipment. For both groups, judgment errors are more likely to lead to a fatal accident than other types of causes. Several approaches to improving the rotorcraft accident rate are recommended. These mostly address improving the training of new pilots and the safety awareness of private pilots.

  12. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.

  13. Linguistic methodology for the analysis of aviation accidents

    NASA Technical Reports Server (NTRS)

    Goguen, J. A.; Linde, C.

    1983-01-01

    A linguistic method for the analysis of small-group discourse was developed, and its use on transcripts of commercial air transport accidents is demonstrated. The method identifies the discourse types that occur and determines their linguistic structure; it identifies significant linguistic variables based upon these structures or other linguistic concepts such as speech act and topic; it tests hypotheses that support the significance and reliability of these variables; and it indicates the implications of the validated hypotheses. These implications fall into three categories: (1) to train crews to use more nearly optimal communication patterns; (2) to use linguistic variables as indices for aspects of crew performance such as attention; and (3) to provide guidelines for the design of aviation procedures and equipment, especially those that involve speech.

  14. Aircraft Accident Prevention: Loss-of-Control Analysis

    NASA Technical Reports Server (NTRS)

    Kwatny, Harry G.; Dongmo, Jean-Etienne T.; Chang, Bor-Chin; Bajpai, Guarav; Yasar, Murat; Belcastro, Christine M.

    2009-01-01

    The majority of fatal aircraft accidents are associated with loss-of-control. Yet the notion of loss-of-control is not well-defined in terms suitable for rigorous control systems analysis. Loss-of-control is generally associated with flight outside of the normal flight envelope, with nonlinear influences, and with an inability of the pilot to control the aircraft. The two primary sources of nonlinearity are the intrinsic nonlinear dynamics of the aircraft and the state and control constraints within which the aircraft must operate. In this paper we examine how these nonlinearities affect the ability to control the aircraft and how they may contribute to loss-of-control. Examples are provided using NASA's Generic Transport Model.

  15. Radionuclide Analysis on Bamboos following the Fukushima Nuclear Accident

    PubMed Central

    Higaki, Takumi; Higaki, Shogo; Hirota, Masahiro; Akita, Kae; Hasezawa, Seiichiro

    2012-01-01

    In response to contamination from the recent Fukushima nuclear accident, we conducted radionuclide analysis on bamboos sampled from six sites within a 25 to 980 km radius of the Fukushima Daiichi nuclear power plant. Maximum activity concentrations of radiocesium 134Cs and 137Cs in samples from Fukushima city, 65 km away from the Fukushima Daiichi plant, were in excess of 71 and 79 kBq/kg, dry weight (DW), respectively. In Kashiwa city, 195 km away from the Fukushima Daiichi, the sample concentrations were in excess of 3.4 and 4.3 kBq/kg DW, respectively. In Toyohashi city, 440 km away from the Fukushima Daiichi, the concentrations were below the measurable limits of up to 4.5 Bq/kg DW. In the radiocesium contaminated samples, the radiocesium activity was higher in mature and fallen leaves than in young leaves, branches and culms. PMID:22496858
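
Comparing such activity concentrations across sampling dates requires decay correction. A minimal sketch is shown below; the 134Cs half-life used (about 2.06 years) is an assumed literature value.

```python
import math

def decay_corrected(activity_bq, half_life_y, elapsed_y):
    """Activity remaining after elapsed_y years of radioactive decay."""
    return activity_bq * math.exp(-math.log(2.0) * elapsed_y / half_life_y)

# 71 kBq/kg of 134Cs measured; what remains one year after sampling
# (half-life taken as 2.065 y, an assumed value).
print(decay_corrected(71e3, 2.065, 1.0))
```

Because 137Cs has a much longer half-life (about 30 years), the 134Cs/137Cs activity ratio falls steadily with time, which is also how a deposit can be dated back to the accident.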

  16. Data Analysis Techniques at LHC

    SciTech Connect

    Boccali, Tommaso

    2005-10-12

    A review of the recent developments on data analysis techniques for the upcoming LHC experiments is presented, with the description of early tests ('Data Challenges'), which are being performed before the start-up, to validate the overall design.

  17. Advanced accident sequence precursor analysis level 1 models

    SciTech Connect

    Sattison, M.B.; Thatcher, T.A.; Knudsen, J.K.; Schroeder, J.A.; Siu, N.O.

    1996-03-01

    INEL has been involved in the development of plant-specific Accident Sequence Precursor (ASP) models for the past two years. These models were developed for use with the SAPHIRE suite of PRA computer codes. They contained event tree/linked fault tree Level 1 risk models for the following initiating events: general transient, loss-of-offsite-power, steam generator tube rupture, small loss-of-coolant-accident, and anticipated transient without scram. Early in 1995 the ASP models were revised based on review comments from the NRC and an independent peer review. These models were released as Revision 1. The Office of Nuclear Regulatory Research has sponsored several projects at the INEL this fiscal year to further enhance the capabilities of the ASP models. The Revision 2 models incorporate more detailed plant information into the models concerning plant response to station blackout conditions, information on battery life, and other unique features gleaned from an Office of Nuclear Reactor Regulation quick review of the Individual Plant Examination submittals. These models are currently being delivered to the NRC as they are completed. A related project is a feasibility study and model development of low power/shutdown (LP/SD) and external event extensions to the ASP models. This project will establish criteria for selection of LP/SD and external initiator operational events for analysis within the ASP program. Prototype models for each pertinent initiating event (loss of shutdown cooling, loss of inventory control, fire, flood, seismic, etc.) will be developed. A third project concerns development of enhancements to SAPHIRE. In relation to the ASP program, a new SAPHIRE module, GEM, was developed as a specific user interface for performing ASP evaluations. This module greatly simplifies the analysis process for determining the conditional core damage probability for a given combination of initiating events and equipment failures or degradations.
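
The conditional core damage probability evaluation that GEM automates can be caricatured with minimal cut sets: set the observed failures' probabilities to 1.0 and recombine. The cut sets, event names, and probabilities below are invented for illustration.

```python
def ccdp(cut_sets, base_prob, observed_failures):
    """Conditional core damage probability via the min-cut upper bound."""
    prob = dict(base_prob)
    for event in observed_failures:
        prob[event] = 1.0  # the precursor event is known to have occurred
    survive = 1.0
    for cs in cut_sets:
        p_cs = 1.0
        for event in cs:
            p_cs *= prob[event]
        survive *= (1.0 - p_cs)
    return 1.0 - survive

# Invented cut sets: core damage if both diesels fail, or diesel A and the battery.
cut_sets = [("DG-A", "DG-B"), ("DG-A", "BATT")]
base = {"DG-A": 1e-2, "DG-B": 1e-2, "BATT": 1e-3}
print(ccdp(cut_sets, base, []))        # baseline core damage probability
print(ccdp(cut_sets, base, ["DG-A"]))  # conditional on diesel A having failed
```

The gap between the baseline and conditional values is what makes an operational event a precursor worth screening.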

  18. Rehabilitation of soils and surface after a nuclear accident: Some techniques tried in the Chernobyl zone

    SciTech Connect

    Jouve, A.; Maubert, H.; Kutlakhamedov, Y.

    1993-12-31

    Six years after the Chernobyl accident, the major part of the deposited radionuclides remains in the top 3 or 4 cm of the soil of abandoned fields in the Chernobyl zone. The Decontaminating Vegetal Network allows a layer of a few centimeters of the topsoil to be removed with a turf harvester. The efficiency observed at Chernobyl was 97% for cesium-137 and strontium-90. After scraping the soil with the turf harvester, the bare soil must be covered and re-grown in order to prevent wind erosion of the sandy soil. A trial spraying of polyacrylamide on the soil was carried out. This technique seems promising. Trials of bio-decontamination of the removed turf using anaerobic degradation were also carried out. This experiment provided an opportunity to measure under real conditions the transfer of radionuclides in the Chernobyl zone.

  19. Analysis of Waste Leak and Toxic Chemical Release Accidents from Waste Feed Delivery (WFD) Diluent System

    SciTech Connect

    WILLIAMS, J.C.

    2000-09-15

    Radiological and toxicological consequences are calculated for 4 postulated accidents involving the Waste Feed Delivery (WFD) diluent addition systems. Consequences for the onsite and offsite receptor are calculated. This analysis contains technical information used to determine the accident consequences for the River Protection Project (RPP) Final Safety Analysis Report (FSAR).

  20. The accident analysis of mobile mine machinery in Indian opencast coal mines.

    PubMed

    Kumar, R; Ghosh, A K

    2014-01-01

    This paper presents an analysis of accidents related to large mining machinery in Indian opencast coal mines. The trends of coal production, share of mining methods in production, machinery deployment in opencast mines, size and population of machinery, accidents due to machinery, and types and causes of accidents have been analysed for the years 1995 to 2008. The scrutiny of accidents during this period reveals that the most common contributing factors are machine reversal, haul road design, human fault, operator's fault, machine fault, visibility and dump design. Considering the types of machines, namely dumpers, excavators, dozers and loaders together, the maximum number of fatal accidents was caused by operator's faults and human faults jointly during the period from 1995 to 2008. The novel finding of this analysis is that large machines with state-of-the-art safety systems did not reduce fatal accidents in Indian opencast coal mines. PMID:23324038

  1. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    SciTech Connect

    Su'ud, Zaki; Anshari, Rio

    2012-06-06

    The Loss-of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), specifically in the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling, at a much smaller level than in normal operation, is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA occurs in this condition, it causes the reactor fuel and other core temperatures to rise and can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulation was performed to calculate the pressure, water level, and temperature distribution in the reactor during this accident. Two coolant-regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average steam mass flow to the IC system in this event was 10 kg/s, which kept the reactor core covered for about 3.2 hours; the core was fully uncovered 4.7 hours later. Two coolant-regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow in this event was 20 kg/s, which kept the core covered for about 73 hours; the core was fully uncovered 75 hours later. Three coolant-regulating systems were operational on reactor unit 3: the Reactor Core Isolation Cooling (RCIC) system, the High Pressure Coolant Injection (HPCI) system, and the Safety Relief Valves (SRV). The average water mass flow in this event was 15 kg/s, which kept the core covered for about 37 hours; the core was fully uncovered 40 hours later.
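
The uncovery timings quoted above come from detailed simulation, but the governing balance is simple: decay heat boils off the inventory above the core faster than makeup replaces it. A back-of-envelope sketch, with every number an illustrative assumption:

```python
H_FG = 1.5e6  # latent heat of vaporization near BWR operating pressure, J/kg (assumed)

def hours_to_uncover(water_mass_kg, decay_heat_w, makeup_kg_per_s=0.0):
    """Hours until the inventory above the core boils away, given net boil-off."""
    boiloff_kg_per_s = decay_heat_w / H_FG
    net = boiloff_kg_per_s - makeup_kg_per_s
    if net <= 0:
        return float("inf")  # makeup keeps the core covered indefinitely
    return water_mass_kg / net / 3600.0

# Assumed figures: ~14 MW decay heat (roughly 1% of rated thermal power
# a few hours after scram) and 100 t of water above the core.
print(hours_to_uncover(1.0e5, 14.0e6))
```

This is why even a modest injection flow, such as the RCIC flows in units 2 and 3, extends the covered period from hours to days.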

  2. Preliminary analysis of loss-of-coolant accident in Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Su'ud, Zaki; Anshari, Rio

    2012-06-01

    The Loss-of-Coolant Accident (LOCA) in a Boiling Water Reactor (BWR), specifically in the Fukushima nuclear accident, is discussed in this paper. The Tohoku earthquake triggered the shutdown of the nuclear power reactors at the Fukushima Nuclear Power Station. Although the shutdown was completed, cooling, at a much smaller level than in normal operation, is still needed to remove decay heat from the reactor core until the reactor reaches the cold-shutdown condition. If a LOCA occurs in this condition, it causes the reactor fuel and other core temperatures to rise and can lead to core meltdown and release of radioactive material to the environment, as in the Fukushima Dai-ichi case. In this study, numerical simulation was performed to calculate the pressure, water level, and temperature distribution in the reactor during this accident. Two coolant-regulating systems were operational on reactor unit 1 during this accident: the Isolation Condenser (IC) system and the Safety Relief Valves (SRV). The average steam mass flow to the IC system in this event was 10 kg/s, which kept the reactor core covered for about 3.2 hours; the core was fully uncovered 4.7 hours later. Two coolant-regulating systems were operational on reactor unit 2: the Reactor Core Isolation Cooling (RCIC) system and the Safety Relief Valves (SRV). The average coolant mass flow in this event was 20 kg/s, which kept the core covered for about 73 hours; the core was fully uncovered 75 hours later. Three coolant-regulating systems were operational on reactor unit 3: the Reactor Core Isolation Cooling (RCIC) system, the High Pressure Coolant Injection (HPCI) system, and the Safety Relief Valves (SRV). The average water mass flow in this event was 15 kg/s, which kept the core covered for about 37 hours; the core was fully uncovered 40 hours later.

  3. Otorhinolaryngologic disorders and diving accidents: an analysis of 306 divers.

    PubMed

    Klingmann, Christoph; Praetorius, Mark; Baumann, Ingo; Plinkert, Peter K

    2007-10-01

    Diving is a very popular leisure activity with an increasing number of participants. As more than 80% of diving-related problems involve the head and neck region, every otorhinolaryngologist should be familiar with diving medical standards. We here present an analysis of more than 300 patients we have treated in the past four years. Between January 2002 and October 2005, 306 patients presented in our department with otorhinolaryngological disorders after diving, or after diving accidents. We collected the following data: name, sex, age, date of treatment, date of accident, diagnosis, special aspects of the diagnosis, number of dives, diving certification, whether and which surgery had been performed, history of acute diving accidents or follow-up treatment, assessment of fitness to dive, and special remarks. The study setting was a retrospective cohort study. The distribution of the disorders was as follows: 24 divers (8%) with external ear disorders, 140 divers (46%) with middle ear disorders, 56 divers (18%) with inner ear disorders, 53 divers (17%) with disorders of the nose and sinuses, 24 divers (8%) with decompression illness (DCI) and 9 divers (3%) who complained of various symptoms. Only 18% of the divers presented with acute disorders. The most common disorder (24%) was Eustachian tube dysfunction. Female divers were significantly more often affected. Chronic sinusitis was found to be associated with a significantly higher number of performed dives. Conservative treatment failed in 30% of the patients, but sinus surgery relieved symptoms in all patients of this group. The middle ear is the main problem area for divers. Middle ear ventilation problems due to Eustachian tube dysfunction can be treated conservatively with excellent results, whereas pathology of the tympanic membrane and ossicular chain often requires surgery. More than four out of five patients visited our department to re-establish their fitness to dive. Although the treatment of acute diving

  4. MACCS usage at Rocky Flats Plant for consequence analysis of postulated accidents

    SciTech Connect

    Foppe, T.L.; Peterson, V.L.

    1993-10-01

    The MELCOR Accident Consequence Code System (MACCS) has been applied to the radiological consequence assessment of potential accidents from a non-reactor nuclear facility. MACCS has been used in a variety of applications to evaluate radiological dose and health effects to the public from postulated plutonium releases and from postulated criticalities. These applications were conducted to support deterministic and probabilistic accident analyses for safety analysis reports, radiological sabotage studies, and other regulatory requests.

  5. GPHS-RTG launch accident analysis for Galileo and Ulysses

    SciTech Connect

    Bradshaw, C.T.

    1991-01-01

    This paper presents the safety program conducted to determine the response of the General Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) to potential launch accidents of the Space Shuttle for the Galileo and Ulysses missions. The National Aeronautics and Space Administration (NASA) provided definition of the Shuttle potential accidents and characterized the environments. The Launch Accident Scenario Evaluation Program (LASEP) was developed by GE to analyze the RTG response to these accidents. RTG detailed response to Solid Rocket Booster (SRB) fragment impacts, as well as to other types of impact, was obtained from an extensive series of hydrocode analyses. A comprehensive test program was conducted also to determine RTG response to the accident environments. The hydrocode response analyses coupled with the test data base provided the broad range response capability which was implemented in LASEP.

  6. An analysis of three weather-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Fujita, T. T.; Caracena, F.

    1977-01-01

    Two aircraft accidents in 1975, one at John F. Kennedy International Airport in New York City on 24 June and the other at Stapleton International Airport in Denver on 7 August, were examined in detail. A third accident on 23 June 1976 at Philadelphia International Airport is being investigated. Amazingly, there was a spearhead echo just to the north of each accident site. The echoes formed from 5 to 50 min in advance of the accident and moved faster than other echoes in the vicinity. These echoes were photographed by National Weather Service radars, 130-205 km away. At closer ranges, however, one or more circular echoes were depicted by airborne and ground radars. These cells were only 3-5 km in diameter, but they were accompanied by downdrafts of extreme intensity, called downbursts. All accidents occurred as aircraft, either descending or climbing, lost altitude while experiencing strong wind shear inside downburst cells.

  7. A POTENTIAL APPLICATION OF UNCERTAINTY ANALYSIS TO DOE-STD-3009-94 ACCIDENT ANALYSIS

    SciTech Connect

    Palmrose, D E; Yang, J M

    2007-05-10

    The objective of this paper is to assess proposed transuranic waste accident analysis guidance and recent software improvements in a Windows-OS version of MACCS2 that allow the input of parameter uncertainty. With this guidance and code capability, there is the potential to perform a quantitative uncertainty assessment of unmitigated accident releases with respect to the 25 rem Evaluation Guideline (EG) of DOE-STD-3009-94 CN3 (STD-3009). Historically, the classification of safety systems in a U.S. Department of Energy (DOE) nuclear facility's safety basis has depended on how subject matter experts qualitatively view uncertainty in the STD-3009 Appendix A accident analysis methodology; specifically, on whether consequence uncertainty could be larger than previously evaluated, so that the site-specific accident consequences might challenge the EG. This paper assesses whether a potential uncertainty capability for MACCS2 could provide a stronger technical basis for determining when the consequences from a design basis accident (DBA) truly challenge the 25 rem EG.
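
    The kind of parameter-uncertainty screening against a dose guideline described above can be sketched with a simple Monte Carlo loop. This is an illustrative stand-in only: the toy dose model, distributions, and parameter names below are assumptions for the sketch, not MACCS2 inputs or outputs.

```python
import random

def dose_rem(source_term_ci, dispersion_factor, breathing_rate):
    # Hypothetical toy dose model: dose scales linearly with each input.
    # A real MACCS2 calculation involves atmospheric transport, plume
    # depletion, deposition, and dose-conversion factors.
    return source_term_ci * dispersion_factor * breathing_rate * 5e6

def fraction_exceeding_guideline(n_trials=10_000, guideline_rem=25.0, seed=1):
    """Sample uncertain inputs and report the fraction of trials whose
    calculated unmitigated dose exceeds the 25 rem Evaluation Guideline."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_trials):
        source = rng.lognormvariate(0.0, 0.5)    # Ci released (assumed dist.)
        chi_q = rng.lognormvariate(-4.0, 1.0)    # dispersion factor (assumed)
        breathing = rng.uniform(2.5e-4, 3.5e-4)  # m^3/s breathing rate (assumed)
        if dose_rem(source, chi_q, breathing) > guideline_rem:
            exceed += 1
    return exceed / n_trials

frac = fraction_exceeding_guideline()
print(f"Fraction of sampled cases exceeding 25 rem: {frac:.3f}")
```

    A distribution of doses like this, rather than a single point estimate, is what lets an analyst argue quantitatively about how close a DBA comes to the Evaluation Guideline.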

  8. Progress in accident analysis of the HYLIFE-II inertial fusion energy power plant design

    SciTech Connect

    Reyes, S; Latkowski, J F; Gomez del Rio, J; Sanz, J

    2000-10-11

    The present work continues our effort to perform an integrated safety analysis for the HYLIFE-II inertial fusion energy (IFE) power plant design. Recently we developed a base case for a severe accident scenario in order to calculate accident doses for HYLIFE-II. It consisted of a total loss-of-coolant accident (LOCA) in which all the liquid flibe (Li₂BeF₄) was lost at the beginning of the accident. Results showed that the off-site dose was below the limit given by the DOE Fusion Safety Standards for public protection in case of accident, and that this dose was dominated by the tritium released during the accident.

  9. Analysis of Convair 990 rejected-takeoff accident with emphasis on decision making, training and procedures

    NASA Technical Reports Server (NTRS)

    Batthauer, Byron E.

    1987-01-01

    This paper analyzes a NASA Convair 990 (CV-990) accident with emphasis on rejected-takeoff (RTO) decision making, training, procedures, and accident statistics. The NASA Aircraft Accident Investigation Board was somewhat perplexed that an aircraft could be destroyed as a result of blown tires during the takeoff roll. To provide a better understanding of tire-failure RTOs, the Board obtained accident reports, Federal Aviation Administration (FAA) studies, and other pertinent information related to the elements of this accident. This material enhanced the analysis process and convinced the Board that high-speed RTOs in transport aircraft should be given more emphasis during pilot training. Pilots should be made aware of various RTO situations and statistics, with emphasis on failed-tire RTOs. This background information could enhance the split-second decision making that is required prior to initiating an RTO.

  10. A dose-reconstruction study of the 1997 Sarov criticality accident using animated dosimetry techniques.

    PubMed

    Vazquez, Justin A; Ding, Aiping; Haley, Thomas; Caracappa, Peter F; Xu, X George

    2014-05-01

    Most computational human phantoms are static, representing a standing individual. There are, however, cases when these phantoms fail to represent accurately the detailed effects on dose that result from considering varying human posture and even whole sequences of motion. In this study, the feasibility of a dynamic and deformable phantom is demonstrated with the development of the Computational Human for Animated Dosimetry (CHAD) phantom. Based on modifications to the limb structure of the previously developed RPI Adult Male, CHAD's posture is adjustable using an optical motion capture system that records real-life human movement. To demonstrate its ability to produce dose results that reflect the changes brought about by posture-deformation, CHAD is employed to perform a dose-reconstruction analysis of the 1997 Sarov criticality accident, and a simulated total body dose of 13.3 Gy is observed, with the total body dose rate dropping from 1.4 Gy/s to 0.25 Gy/s over the first 4 s of retreat time. Additionally, dose measurements are calculated for individual organs and body regions, including a 36.8-Gy dose to the breast tissue, a 3.8-Gy dose to the bladder, and a 31.1-Gy dose to the thyroid, as well as the changes in dose rates for the individual organs over the course of the accident sequence. Comparison of results obtained using CHAD in an animated dosimetry simulation with reported information on dose and the medical outcome of the case shows that the consideration of posture and movement in dosimetry simulation allows for more detailed and precise analysis of dosimetry information, consideration of the evolution of the dose profile over time in the course of a given scenario, and a better understanding of the physiological impacts of radiation exposure for a given set of circumstances. PMID:24670906

  11. Analysis of construction accidents in Turkey and responsible parties.

    PubMed

    Gürcanli, G Emre; Müngen, Uğur

    2013-01-01

    Construction is one of the world's biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair, and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972-2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at the time of the accident, and party responsible for the accident. Falls (54.1%), being struck by thrown or falling objects (12.9%), structural collapses (9.9%), and electrocutions (7.5%) occupy the first four places. The accidents occurred most often between 15:00 and 17:00 (22.6%), between 10:00 and 12:00 (18.7%), and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and the acts of negligence that typically lead to accidents. Nearly two thirds of the faulty and negligent acts were carried out by employers; employees were responsible for almost one third of all cases. PMID:24077446

  12. Analysis of Construction Accidents in Turkey and Responsible Parties

    PubMed Central

    GÜRCANLI, G. Emre; MÜNGEN, Uğur

    2013-01-01

    Construction is one of the world’s biggest industries, including jobs as diverse as building, civil engineering, demolition, renovation, repair, and maintenance. Construction workers are exposed to a wide variety of hazards. This study analyzes 1,117 expert witness reports which were submitted to criminal and labour courts. These reports are from all regions of the country and cover the period 1972–2008. Accidents were classified by the consequence of the incident, time and main causes of the accident, construction type, occupation of the victim, activity at the time of the accident, and party responsible for the accident. Falls (54.1%), being struck by thrown or falling objects (12.9%), structural collapses (9.9%), and electrocutions (7.5%) occupy the first four places. The accidents occurred most often between 15:00 and 17:00 (22.6%), between 10:00 and 12:00 (18.7%), and just after lunchtime (9.9%). Additionally, the most common accidents were further divided into sub-types. Expert-witness assessments were used to identify the parties at fault and the acts of negligence that typically lead to accidents. Nearly two thirds of the faulty and negligent acts were carried out by employers; employees were responsible for almost one third of all cases. PMID:24077446

  13. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and to identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  14. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology; rather, it connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data are time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
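
    The fitting of linear and quadratic trend models mentioned above can be illustrated with ordinary least squares. This is a minimal stdlib-only sketch with invented sample data; the Standard describes the methodology, not this code.

```python
# Fit linear and quadratic trend models to a time series by least squares
# (normal equations solved with Gaussian elimination) and compare fits.

def polyfit_lstsq(ts, ys, degree):
    """Least-squares polynomial fit; returns coeffs[i] multiplying t**i."""
    n = degree + 1
    A = [[sum(t ** (i + j) for t in ts) for j in range(n)] for i in range(n)]
    b = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (b[r] - sum(A[r][c] * coeffs[c]
                                for c in range(r + 1, n))) / A[r][r]
    return coeffs

def rss(ts, ys, coeffs):
    """Residual sum of squares of the fitted model."""
    return sum((y - sum(c * t ** i for i, c in enumerate(coeffs))) ** 2
               for t, y in zip(ts, ys))

ts = list(range(10))
ys = [2.0 + 1.5 * t + 0.3 * t * t for t in ts]  # invented quadratic trend
lin = polyfit_lstsq(ts, ys, 1)
quad = polyfit_lstsq(ts, ys, 2)
print("linear RSS:", round(rss(ts, ys, lin), 3),
      " quadratic RSS:", round(rss(ts, ys, quad), 9))
```

    Comparing residual sums of squares across the candidate model forms (linear, quadratic, exponential) is the quantitative core of the trend-assessment the Standard describes.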

  15. A method for modeling and analysis of directed weighted accident causation network (DWACN)

    NASA Astrophysics Data System (ADS)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Ding, Jing

    2015-11-01

    Using complex network theory to analyze accidents is an effective way to understand the causes of accidents in complex systems. In this paper, a novel method is proposed to establish a directed weighted accident causation network (DWACN) for the Rail Accident Investigation Branch (RAIB) in the UK, based on complex network theory and the event chains of accidents. The DWACN is composed of 109 nodes, which denote causal factors, and 260 directed weighted edges, which represent the complex interrelationships among factors. The statistical properties of directed weighted complex networks are applied to reveal the critical factors, the key event chains, and the important classes in the DWACN. Analysis results demonstrate that the DWACN has the characteristics of small-world networks, with a short average path length and a high weighted clustering coefficient, and displays the properties of scale-free networks, in that the cumulative degree distribution follows an exponential function. This modeling and analysis method can help discover the latent rules of accidents and the features of fault propagation, so as to reduce accidents. This paper is a further development of research on accident analysis methods using complex networks.
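
    Two of the network statistics named above (weighted node degree, or "strength", and average shortest-path length over directed edges) can be computed directly from an edge list. The toy causation graph below is invented for illustration and is not the RAIB dataset.

```python
from collections import deque

# Toy directed weighted accident-causation network: edges run from cause to
# effect, with weight = co-occurrence count (all values invented).
edges = [
    ("signal passed at danger", "collision", 4),
    ("driver fatigue", "signal passed at danger", 3),
    ("poor sighting", "signal passed at danger", 2),
    ("track defect", "derailment", 5),
    ("inadequate inspection", "track defect", 3),
    ("collision", "injury", 6),
    ("derailment", "injury", 4),
]

succ = {}      # adjacency: node -> list of successor nodes
strength = {}  # weighted out-degree ("node strength")
for u, v, w in edges:
    succ.setdefault(u, []).append(v)
    succ.setdefault(v, [])
    strength[u] = strength.get(u, 0) + w
    strength.setdefault(v, 0)

def average_path_length(succ):
    """Mean shortest-path length (in hops, BFS) over ordered reachable pairs."""
    total, pairs = 0, 0
    for src in succ:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in succ[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for node, d in dist.items():
            if node != src:
                total += d
                pairs += 1
    return total / pairs

critical = max(strength, key=strength.get)
print("highest-strength node:", critical)
print("average path length:", round(average_path_length(succ), 2))
```

    Ranking nodes by strength is one simple way to surface "critical factors"; a short average path length over the real 109-node network is what supports the paper's small-world claim.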

  16. Analysis of Loss-of-Coolant Accidents in the NBSR

    SciTech Connect

    Baek J. S.; Cheng L.; Diamond, D.

    2014-05-23

    This report documents calculations of the fuel cladding temperature during loss-of-coolant accidents in the NBSR. The probability of a pipe failure is small and procedures exist to minimize the loss of water and assure emergency cooling water flows into the reactor core during such an event. Analysis in the past has shown that the emergency cooling water would provide adequate cooling if the water filled the flow channels within the fuel elements. The present analysis is to determine if there is adequate cooling if the water drains from the flow channels. Based on photographs of how the emergency water flows into the fuel elements from the distribution pan, it can be assumed that this water does not distribute uniformly across the flow channels but rather results in a liquid film flowing downward on the inside of one of the side plates in each fuel element and only wets the edges of the fuel plates. An analysis of guillotine breaks shows the cladding temperature remains below the blister temperature in fuel plates in the upper section of the fuel element. In the lower section, the fuel plates are also cooled by water outside the element that is present due to the hold-up pan and temperatures are lower than in the upper section. For small breaks, the simulation results show that the fuel elements are always cooled on the outside even in the upper section and the cladding temperature cannot be higher than the blister temperature. The above results are predicated on assumptions that are examined in the study to see their influence on fuel temperature.

  17. Offsite Radiological Consequence Analysis for the Bounding Flammable Gas Accident

    SciTech Connect

    CARRO, C.A.

    2003-07-30

    This document quantifies the offsite radiological consequences of the bounding flammable gas accident for comparison with the 25 rem Evaluation Guideline established in DOE-STD-3009, Appendix A. The bounding flammable gas accident is a detonation in a single-shell tank. The calculation applies reasonably conservative input parameters in accordance with DOE-STD-3009, Appendix A, guidance. Revision 1 incorporates comments received from the Office of River Protection.

  18. Accidents at work and costs analysis: a field study in a large Italian company.

    PubMed

    Battaglia, Massimo; Frey, Marco; Passetti, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. This field study, carried out in a large Italian company, illustrates the technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

  19. Accidents at Work and Costs Analysis: A Field Study in a Large Italian Company

    PubMed Central

    BATTAGLIA, Massimo; FREY, Marco; PASSETTI, Emilio

    2014-01-01

    Accidents at work are still a heavy burden in social and economic terms, and action to improve health and safety standards at work offers great potential gains not only to employers, but also to individuals and society as a whole. However, companies are often not interested in measuring the costs of accidents, even though cost information may facilitate preventive occupational health and safety management initiatives. This field study, carried out in a large Italian company, illustrates the technical and organisational aspects associated with the implementation of an accident costs analysis tool. The results indicate that the implementation (and the use) of the tool requires a considerable commitment by the company, that accident costs analysis should serve to reinforce the importance of health and safety prevention, and that the economic dimension of accidents is substantial. The study also suggests practical ways to facilitate the implementation and the moral acceptance of the accounting technology. PMID:24869894

  20. Risk analysis of emergent water pollution accidents based on a Bayesian Network.

    PubMed

    Tang, Caihong; Yi, Yujun; Yang, Zhifeng; Sun, Jie

    2016-01-01

    To guarantee the security of water quality in water transfer channels, especially open channels, analysis of potential emergent pollution sources in the water transfer process is critical. It is also indispensable for forewarning and protection against emergent pollution accidents. Bridges above open channels with large amounts of truck traffic are the main locations where emergent accidents could occur. A Bayesian Network model, which consists of six root nodes and three middle-layer nodes, was developed in this paper and employed to identify the potential pollution risk. Dianbei Bridge is reviewed as a typical bridge on an open channel of the Middle Route of the South to North Water Transfer Project where emergent traffic accidents could occur. This study focuses on the risk of water pollution caused by the leakage of pollutants into the water; the risk of potential traffic accidents at the Dianbei Bridge implies a risk of water pollution in the canal. Based on survey data, statistical analysis, and domain specialist knowledge, a Bayesian Network model was established. The human factor in emergent accidents has been considered in this model. Additionally, the model has been employed to describe the probability of accidents and the risk level, and the sensitive causes of pollution accidents have been deduced. A scenario in which the sensitive factors are in the states most likely to lead to accidents has also been simulated. PMID:26433361
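
    The structure described (root-cause nodes feeding intermediate nodes that combine into an accident probability) can be sketched as a tiny discrete Bayesian Network evaluated by enumeration. The node names, priors, and conditional probabilities below are invented for the sketch; they are not the Dianbei Bridge model's survey-derived values.

```python
from itertools import product

# Invented priors for binary root causes (not the paper's data).
p_root = {"bad_weather": 0.1, "driver_error": 0.05, "hazmat_truck": 0.3}

def p_crash(bad_weather, driver_error):
    """Conditional probability table for a bridge traffic accident
    given its parent states (assumed illustrative values)."""
    base = 0.001
    if bad_weather:
        base *= 10
    if driver_error:
        base *= 50
    return min(base, 1.0)

# P(pollutant leaks into the canal | crash involving a hazmat truck), assumed.
P_SPILL_GIVEN_CRASH_HAZMAT = 0.4

def p_pollution():
    """Marginal probability of a water-pollution event, computed by
    enumerating all joint states of the root nodes."""
    total = 0.0
    for bw, de, hz in product([True, False], repeat=3):
        prior = ((p_root["bad_weather"] if bw else 1 - p_root["bad_weather"]) *
                 (p_root["driver_error"] if de else 1 - p_root["driver_error"]) *
                 (p_root["hazmat_truck"] if hz else 1 - p_root["hazmat_truck"]))
        if hz:  # only hazmat crashes can leak pollutants in this toy model
            total += prior * p_crash(bw, de) * P_SPILL_GIVEN_CRASH_HAZMAT
    return total

print(f"P(water pollution event) = {p_pollution():.6f}")
```

    Re-running the enumeration with one root node clamped to its worst state is a simple way to reproduce the kind of sensitivity reasoning the paper describes.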

  1. Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives

    SciTech Connect

    Kohout, E.F.; Folga, S.; Mueller, C.; Nabelssi, B.

    1996-03-01

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

  2. Safety analysis results for cryostat ingress accidents in ITER

    SciTech Connect

    Merrill, B.J.; Cadwallader, L.C.; Petti, D.A.

    1997-06-01

    Accidents involving the ingress of air, helium, or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits. 6 refs., 2 figs., 1 tab.

  3. Safety analysis results for cryostat ingress accidents in ITER

    SciTech Connect

    Merrill, B.J.; Cadwallader, L.C.; Petti, D.A.

    1996-12-31

    Accidents involving the ingress of air or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

  4. Safety Analysis Results for Cryostat Ingress Accidents in ITER

    NASA Astrophysics Data System (ADS)

    Merrill, B. J.; Cadwallader, L. C.; Petti, D. A.

    1997-06-01

    Accidents involving the ingress of air, helium, or water into the cryostat of the International Thermonuclear Experimental Reactor (ITER) tokamak design have been analyzed with a modified version of the MELCOR code for the ITER Non-site Specific Safety Report (NSSR-1). The air ingress accident is the result of a postulated breach of the cryostat boundary into an adjoining room. MELCOR results for this accident demonstrate that the condensed air mass and increased heat loads are not a magnet safety concern, but that the partial vacuum in the adjoining room must be accommodated in the building design. The water ingress accident is the result of a postulated magnet arc that results in melting of a Primary Heat Transport System (PHTS) coolant pipe, discharging PHTS water and PHTS water activated corrosion products and HTO into the cryostat. MELCOR results for this accident demonstrate that the condensed water mass and increased heat loads are not a magnet safety concern, that the cryostat pressure remains below design limits, and that the corrosion product and HTO releases are well within the ITER release limits.

  5. Action Plan for updated Chapter 15 Accident Analysis in the SRS Production Reactor SAR

    SciTech Connect

    Hightower, N.T. III; Burnett, T.W.

    1989-11-15

    This report describes the Action Plan for the upgrade of the Chapter 15 Accident Analysis in the SRS Production Reactor SAR required for K-Restart. This Action Plan will be updated periodically to reflect task accomplishments and issue resolutions.

  6. School sports accidents: analysis of causes, modes, and frequencies.

    PubMed

    Kelm, J; Ahlhelm, F; Pape, D; Pitsch, W; Engel, C

    2001-01-01

    About 5% of all school children are seriously injured during physical education every year. Because of the influence of such injuries on children's attitudes toward sports, and because of the economic aspects, an evaluation of causes and medical consequences is necessary. In this study, 213 school sports accidents were investigated. Besides the diagnosis, the localization of injuries and the duration of sick leave were documented. The average age of the injured students was 13 years. Most of the injured students blamed themselves for the accident. The most common injuries were sprains, contusions, and fractures. The main causes of the accidents were faults in basic motion training. Soccer and basketball were the sports most frequently involved in injuries. The upper extremity was more frequently involved than the lower extremity. Sports physicians and teachers should work out a program outlining the individual needs and capabilities of the injured students to reintegrate them into physical education. PMID:11242243

  7. Structural Analysis for the American Airlines Flight 587 Accident Investigation: Global Analysis

    NASA Technical Reports Server (NTRS)

    Young, Richard D.; Lovejoy, Andrew E.; Hilburger, Mark W.; Moore, David F.

    2005-01-01

    NASA Langley Research Center (LaRC) supported the National Transportation Safety Board (NTSB) in the American Airlines Flight 587 accident investigation due to LaRC's expertise in high-fidelity structural analysis and testing of composite structures and materials. A Global Analysis Team from LaRC reviewed the manufacturer's design and certification procedures, developed finite element models and conducted structural analyses, and participated jointly with the NTSB and Airbus in subcomponent tests conducted at Airbus in Hamburg, Germany. The Global Analysis Team identified no significant or obvious deficiencies in the Airbus certification and design methods. Analysis results from the LaRC team indicated that the most likely failure scenario was failure initiation at the right rear main attachment fitting (lug), followed by an unstable progression of failure of all fin-to-fuselage attachments and separation of the vertical tail plane (VTP) from the aircraft. Additionally, analysis results indicated that failure initiated at the final observed maximum fin loading condition in the accident, when the VTP was subjected to loads that were at minimum 1.92 times the design limit load condition for certification. For certification, the VTP is only required to support loads of 1.5 times design limit load without catastrophic failure. The maximum loading during the accident was thus shown to significantly exceed the certification requirement. The structure therefore appeared to perform in a manner consistent with its design and certification, and the failure is attributed to VTP loads greater than expected.

  8. Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence

    NASA Technical Reports Server (NTRS)

    Phimister, James R. (Editor); Bier, Vicki M. (Editor); Kunreuther, Howard C. (Editor)

    2004-01-01

    Almost every year there is at least one technological disaster that highlights the challenge of managing technological risk. On February 1, 2003, the space shuttle Columbia and her crew were lost during reentry into the atmosphere. In the summer of 2003, there was a blackout that left millions of people in the northeast United States without electricity. Forensic analyses, congressional hearings, investigations by scientific boards and panels, and journalistic and academic research have yielded a wealth of information about the events that led up to each disaster, and questions have arisen. Why were the events that led to the accident not recognized as harbingers? Why were risk-reducing steps not taken? This line of questioning is based on the assumption that signals before an accident can and should be recognized. To examine the validity of this assumption, the National Academy of Engineering (NAE) undertook the Accident Precursors Project in February 2003. The project was overseen by a committee of experts from the safety and risk-sciences communities. Rather than examining a single accident or incident, the committee decided to investigate how different organizations anticipate and assess the likelihood of accidents from accident precursors. The project culminated in a workshop held in Washington, D.C., in July 2003. This report includes the papers presented at the workshop, as well as findings and recommendations based on the workshop results and committee discussions. The papers describe precursor strategies in aviation, the chemical industry, health care, nuclear power and security operations. In addition to current practices, they also address some areas for future research.

  9. DYNAMIC ANALYSIS OF HANFORD UNIRRADIATED FUEL PACKAGE SUBJECTED TO SEQUENTIAL LATERAL LOADS IN HYPOTHETICAL ACCIDENT CONDITIONS

    SciTech Connect

    Wu, T

    2008-04-30

    Large fuel casks present challenges when evaluating their performance under the Hypothetical Accident Conditions (HAC) specified in the Code of Federal Regulations, Title 10, Part 71 (10CFR71). Testing is often limited by cost, difficulty in preparing test units, and the limited availability of facilities that can carry out such tests. In the past, many casks were evaluated without testing by using simplified analytical methods. This paper presents a numerical technique for evaluating the dynamic responses of large fuel casks subjected to sequential HAC loading. A nonlinear dynamic analysis was performed for a Hanford Unirradiated Fuel Package (HUFP) [1] to evaluate the cumulative damage after the HAC sequence of a 30-foot lateral drop followed by a 40-inch lateral puncture, as specified in 10CFR71. The structural integrity of the containment vessel is justified based on the analytical results in comparison with the stress criteria specified in the ASME Code, Section III, Appendix F [2], for Level D service loads. The analyzed cumulative damage caused by the sequential loading of a 30-foot lateral drop and a 40-inch lateral puncture is compared with the package test data. The analytical results are in good agreement with the test results.

  10. ACCIDENT ANALYSES & CONTROL OPTIONS IN SUPPORT OF THE SLUDGE WATER SYSTEM SAFETY ANALYSIS

    SciTech Connect

    WILLIAMS, J.C.

    2003-11-15

This report documents the accident analyses and nuclear safety control options for use in Revision 7 of HNF-SD-WM-SAR-062, ''K Basins Safety Analysis Report'', and Revision 4 of HNF-SD-SNF-TSR-001, ''Technical Safety Requirements - 100 KE and 100 KW Fuel Storage Basins''. These documents will define the authorization basis for Sludge Water System (SWS) operations. This report follows the guidance of DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', for calculating onsite and offsite consequences. The accident analysis summary is shown in Table ES-1 below. While this document describes and discusses potential control options to either mitigate or prevent the accidents discussed herein, it should be made clear that the final control selection for any accident is determined and presented in HNF-SD-WM-SAR-062.

  11. Model building techniques for analysis.

    SciTech Connect

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased for them to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that, in essence, come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  12. Techniques for Automated Performance Analysis

    SciTech Connect

    Marcus, Ryan C.

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
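As a minimal illustration of reducing performance analysis to well-known mathematical techniques, the sketch below fits a least-squares line predicting runtime from OpenMP pool size, one of the variables the abstract lists. The measurements and the Amdahl-style 1/threads model are invented for this sketch and are not taken from the paper.

```python
# Illustrative least-squares fit predicting runtime from OpenMP pool size.
# Data points are fabricated; the 1/threads ("serial + parallel") model
# is an assumption for the sketch, not the paper's formulation.

threads = [1, 2, 4, 8, 16]
runtime = [100.0, 52.0, 27.0, 15.0, 9.0]  # wall-clock seconds, hypothetical

# Regress runtime against 1/threads: runtime ~ serial_cost + parallel_cost / t
x = [1.0 / t for t in threads]
n = len(x)
mean_x = sum(x) / n
mean_y = sum(runtime) / n
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, runtime))
         / sum((xi - mean_x) ** 2 for xi in x))
intercept = mean_y - slope * mean_x  # estimated serial (non-scaling) cost

def predict(t):
    """Predicted runtime for an OpenMP pool of t threads."""
    return intercept + slope / t
```

A model this simple already extrapolates usefully (e.g., `predict(32)` falls below every measured point), which is the spirit of the abstract's claim that careful formulation reduces performance prediction to standard mathematics.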

  13. IAEA Activities in the Area of Safety Analysis and Accident Management

    SciTech Connect

    Lee, S.; El-Shanawany, M.

    2006-07-01

Safety analysis is a means of demonstrating how critical safety functions, the integrity of barriers against the release of radioactive materials, and various other safety requirements are fulfilled for a broad range of operating conditions and initiating events. Accordingly, performing safety analysis for a nuclear power plant is one of the most important safety principles. Thermal-hydraulic computer codes are extensively used worldwide for safety analysis by utilities, regulatory authorities, power plant designers and vendors, nuclear fuel companies, research organizations, and technical support organizations. Safety analysis methodology and computer codes have seen significant development over the last two decades. This fact is also reflected in the work of the International Atomic Energy Agency (IAEA), which aims at increasing the quality and international harmonization of the approaches used in safety analysis. The paper provides an overview of activities, and of examples of results obtained recently or planned in the near future, in the IAEA's Division of Nuclear Installation Safety in the field of safety analysis for both design basis accidents and beyond design basis accidents, as well as accident management. In this paper, specific technical guidance on safety assessment in the IAEA Safety Standards, such as safety analysis methodologies, probabilistic safety assessment, and the development of accident management programmes, is described. Future trends and related activities in safety analysis and accident management are also introduced. (authors)

  14. Accident Analysis for the NIST Research Reactor Before and After Fuel Conversion

    SciTech Connect

    Baek J.; Diamond D.; Cuadra, A.; Hanson, A.L.; Cheng, L-Y.; Brown, N.R.

    2012-09-30

    Postulated accidents have been analyzed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The analysis has been carried out for the present core, which contains high enriched uranium (HEU) fuel and for a proposed equilibrium core with low enriched uranium (LEU) fuel. The analyses employ state-of-the-art calculational methods. Three-dimensional Monte Carlo neutron transport calculations were performed with the MCNPX code to determine homogenized fuel compositions in the lower and upper halves of each fuel element and to determine the resulting neutronic properties of the core. The accident analysis employed a model of the primary loop with the RELAP5 code. The model includes the primary pumps, shutdown pumps outlet valves, heat exchanger, fuel elements, and flow channels for both the six inner and twenty-four outer fuel elements. Evaluations were performed for the following accidents: (1) control rod withdrawal startup accident, (2) maximum reactivity insertion accident, (3) loss-of-flow accident resulting from loss of electrical power with an assumption of failure of shutdown cooling pumps, (4) loss-of-flow accident resulting from a primary pump seizure, and (5) loss-of-flow accident resulting from inadvertent throttling of a flow control valve. In addition, natural circulation cooling at low power operation was analyzed. The analysis shows that the conversion will not lead to significant changes in the safety analysis and the calculated minimum critical heat flux ratio and maximum clad temperature assure that there is adequate margin to fuel failure.

  15. Accident sequence precursor analysis level 2/3 model development

    SciTech Connect

    Lui, C.H.; Galyean, W.J.; Brownson, D.A.

    1997-02-01

The US Nuclear Regulatory Commission's Accident Sequence Precursor (ASP) program currently uses simple Level 1 models to assess the conditional core damage probability for operational events occurring in commercial nuclear power plants (NPP). Since not all accident sequences leading to core damage will result in the same radiological consequences, it is necessary to develop simple Level 2/3 models that can be used to analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude of the resulting radioactive releases to the environment, and calculate the consequences associated with these releases. The simple Level 2/3 model development work was initiated in 1995, and several prototype models have been completed. Once developed, these simple Level 2/3 models are linked to the simple Level 1 models to provide risk perspectives for operational events. This paper describes the methods implemented for the development of these simple Level 2/3 ASP models, and the linkage process to the existing Level 1 models.

  16. Advanced accident sequence precursor analysis level 2 models

    SciTech Connect

    Galyean, W.J.; Brownson, D.A.; Rempe, J.L.

    1996-03-01

The U.S. Nuclear Regulatory Commission Accident Sequence Precursor program pursues the ultimate objective of performing risk significant evaluations on operational events (precursors) occurring in commercial nuclear power plants. To achieve this objective, the Office of Nuclear Regulatory Research is supporting the development of simple probabilistic risk assessment models for all commercial nuclear power plants (NPP) in the U.S. Presently, only simple Level 1 plant models, which estimate core damage frequencies, have been developed. In order to provide a true risk perspective, the consequences associated with postulated core damage accidents also need to be considered. With the objective of performing risk evaluations in an integrated and consistent manner, a linked event tree approach was developed which propagates the front-end results to the back end. This approach utilizes simple plant models that analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude and timing of a radioactive release to the environment, and calculate the consequences for a given release. Detailed models and results from previous studies, such as the NUREG-1150 study, are used to quantify these simple models. These simple models are then linked to the existing Level 1 models, and are evaluated using the SAPHIRE code. To demonstrate the approach, prototypic models have been developed for a boiling water reactor, Peach Bottom, and a pressurized water reactor, Zion.
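The linked event tree idea described above can be sketched in a few lines: Level 1 core-damage sequence frequencies are multiplied through Level 2 containment branch probabilities to give release-category frequencies. The sequence names and all numbers below are hypothetical, not taken from any plant model.

```python
# Hypothetical sketch of linking Level 1 and Level 2 event trees. Level 1
# gives the frequency (per reactor-year) of each core-damage sequence;
# Level 2 gives, per sequence, the conditional probability of each
# containment end state (branch probabilities sum to 1 per sequence).
# All values are illustrative.

level1 = {"station_blackout": 2.0e-6, "loca_no_injection": 5.0e-7}

level2 = {
    "station_blackout": {"early_failure": 0.05, "late_failure": 0.25, "intact": 0.70},
    "loca_no_injection": {"early_failure": 0.10, "late_failure": 0.30, "intact": 0.60},
}

def link_event_trees(l1, l2):
    """Multiply each Level 1 frequency by the Level 2 branch probabilities
    and accumulate by containment end state (release category)."""
    release = {}
    for seq, freq in l1.items():
        for state, prob in l2[seq].items():
            release[state] = release.get(state, 0.0) + freq * prob
    return release

release_freq = link_event_trees(level1, level2)
```

Because the Level 2 branches are exhaustive, the release-category frequencies sum back to the total core-damage frequency, which is the consistency property the linked approach is meant to preserve.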

  17. Approaches to accident analysis in recent US Department of Energy environmental impact statements

    SciTech Connect

    Mueller, C.; Folga, S.; Nabelssi, B.

    1996-12-31

    A review of accident analyses in recent US Department of Energy (DOE) Environmental Impact Statements (EISs) was conducted to evaluate the consistency among approaches and to compare these approaches with existing DOE guidance. The review considered several components of an accident analysis: the overall scope, which in turn should reflect the scope of the EIS; the spectrum of accidents considered; the methods and assumptions used to determine frequencies or frequency ranges for the accident sequences; and the assumption and technical bases for developing radiological and chemical atmospheric source terms and for calculating the consequences of airborne releases. The review also considered the range of results generated with respect to impacts on various worker and general populations. In this paper, the findings of these reviews are presented and methods recommended for improving consistency among EISs and bringing them more into line with existing DOE guidance.

  18. The role of mitochondrial proteomic analysis in radiological accidents and terrorism.

    PubMed

    Maguire, David; Zhang, Bingrong; Zhang, Amy; Zhang, Lurong; Okunieff, Paul

    2013-01-01

    In the wake of the 9/11 terrorist attacks and the recent Level 7 nuclear event at the Fukushima Daiichi plant, there has been heightened awareness of the possibility of radiological terrorism and accidents and the need for techniques to estimate radiation levels after such events. A number of approaches to monitoring radiation using biological markers have been published, including physical techniques, cytogenetic approaches, and direct, DNA-analysis approaches. Each approach has the potential to provide information that may be applied to the triage of an exposed population, but problems with development and application of devices or lengthy analyses limit their potential for widespread application. We present a post-irradiation observation with the potential for development into a rapid point-of-care device. Using simple mitochondrial proteomic analysis, we investigated irradiated and nonirradiated murine mitochondria and identified a protein mobility shift occurring at 2-3 Gy. We discuss the implications of this finding both in terms of possible mechanisms and potential applications in bio-radiation monitoring. PMID:22879026

  19. Sensitivity analysis technique for application to deterministic models

    SciTech Connect

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, constructed using, for example, regression techniques, that are inexpensive enough for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method.
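One way to work directly with existing code results rather than a response surface, in the general spirit of the method described above (though not the authors' exact algorithm), is to reweight the original Monte Carlo runs by a density ratio when the input distribution assumption changes. The toy model and distributions below are illustrative stand-ins.

```python
import math
import random

# Sketch of sensitivity analysis without a response surface: the output
# mean under a *perturbed* input distribution is estimated by reweighting
# existing model runs with a likelihood ratio, so no new code runs are
# needed. The model and distributions here are illustrative assumptions.

random.seed(0)

def model(x):
    # stand-in for an expensive deterministic computer code
    return math.exp(0.5 * x)

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Original runs: the uncertain input sampled under a N(0, 1) assumption
xs = [random.gauss(0.0, 1.0) for _ in range(20000)]
ys = [model(x) for x in xs]
mean_base = sum(ys) / len(ys)

# Perturbed assumption: input mean shifted to 0.2. Reweight the *same*
# runs by the density ratio instead of rerunning the code.
weights = [normal_pdf(x, 0.2, 1.0) / normal_pdf(x, 0.0, 1.0) for x in xs]
mean_shifted = sum(w * y for w, y in zip(weights, ys)) / sum(weights)
```

The shifted estimate tracks the analytic value (for this toy model, E[exp(0.5X)] = exp(0.5 mu + 0.125) for X ~ N(mu, 1)) without any additional model evaluations, which is the key economy of reusing original code calculations.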

  20. BESAFE II: Accident safety analysis code for MFE reactor designs

    NASA Astrophysics Data System (ADS)

    Sevigny, Lawrence Michael

    The viability of controlled thermonuclear fusion as an alternative energy source hinges on its desirability from an economic and an environmental and safety standpoint. It is the latter which is the focus of this thesis. For magnetic fusion energy (MFE) devices, the safety concerns equate to a design's behavior during a worst-case accident scenario which is the loss of coolant accident (LOCA). In this dissertation, we examine the behavior of MFE devices during a LOCA and how this behavior relates to the safety characteristics of the machine; in particular the acute, whole-body, early dose. In doing so, we have produced an accident safety code, BESAFE II, now available to the fusion reactor design community. The Appendix constitutes the User's Manual for BESAFE II. The theory behind early dose calculations including the mobilization of activation products is presented in Chapter 2. Since mobilization of activation products is a strong function of temperature, it becomes necessary to calculate the thermal response of a design during a LOCA in order to determine the fraction of the activation products which are mobilized and thus become the source for the dose. The code BESAFE II is designed to determine the temperature history of each region of a design and determine the resulting mobilization of activation products at each point in time during the LOCA. The BESAFE II methodology is discussed in Chapter 4, followed by demonstrations of its use for two reference design cases: a PCA-Li tokamak and a SiC-He tokamak. Of these two cases, it is shown that the SiC-He tokamak is a better design from an accident safety standpoint than the PCA-Li tokamak. It is also found that doses derived from temperature-dependent mobilization data are different than those predicted using set mobilization categories such as those that involve Piet fractions. This demonstrates the need for more experimental data on fusion materials. 
The possibility for future improvements and modifications is also discussed.

  1. Injury patterns of seniors in traffic accidents: A technical and medical analysis

    PubMed Central

    Brand, Stephan; Otte, Dietmar; Mueller, Christian Walter; Petri, Maximilian; Haas, Philipp; Stuebig, Timo; Krettek, Christian; Haasper, Carl

    2012-01-01

    AIM: To investigate the actual injury situation of seniors in traffic accidents and to evaluate the different injury patterns. METHODS: Injury data, environmental circumstances and crash circumstances of accidents were collected shortly after the accident event at the scene. With these data, a technical and medical analysis was performed, including Injury Severity Score, Abbreviated Injury Scale and Maximum Abbreviated Injury Scale. The method of data collection is named the German In-Depth Accident Study and can be seen as representative. RESULTS: A total of 4430 injured seniors in traffic accidents were evaluated. The incidence of sustaining severe injuries to extremities, head and maxillofacial region was significantly higher in the group of elderly people compared to a younger age (P < 0.05). The number of accident-related injuries was higher in the group of seniors compared to other groups. CONCLUSION: Seniors are more likely to be involved in traffic injuries and to sustain serious to severe injuries compared to other groups. PMID:23173111

  2. Analysis of occupational accidents: prevention through the use of additional technical safety measures for machinery

    PubMed Central

    Dźwiarek, Marek; Latała, Agata

    2016-01-01

    This article presents an analysis of results of 1035 serious and 341 minor accidents recorded by Poland's National Labour Inspectorate (PIP) in 2005–2011, in view of their prevention by means of additional safety measures applied by machinery users. Since the analysis aimed at formulating principles for the application of technical safety measures, the analysed accidents should bear additional attributes: the type of machine operation, technical safety measures and the type of events causing injuries. The analysis proved that the executed tasks and injury-causing events were closely connected and there was a relation between casualty events and technical safety measures. In the case of tasks consisting of manual feeding and collecting materials, the injuries usually occur because of the rotating motion of tools or crushing due to a closing motion. Numerous accidents also happened in the course of supporting actions, like removing pollutants, correcting material position, cleaning, etc. PMID:26652689

  4. Risk-based Analysis of Construction Accidents in Iran During 2007-2011-Meta Analyze Study

    PubMed Central

    AMIRI, Mehran; ARDESHIR, Abdollah; FAZEL ZARANDI, Mohammad Hossein

    2014-01-01

Background: The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. Methods: The Iranian Social Security Organization (ISSO) accident database, containing 21,864 cases between the years 2007-2011, was applied in this study. In the next step, Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of day, seasonal analysis, regional considerations, type of accident, and body parts affected. Results: Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. Conclusion: It is recommended that laborers, young workers, and weekend and night shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be paid to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions. PMID:26005662
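The abstract reports group-level TAR and TSI values but does not give their defining formulas. The sketch below assumes common conventions (accidents per 100 exposed workers for TAR, severe outcomes per 100 accidents for TSI); the group counts are invented so that the TAR matches the reported 47.2% figure for work at heights.

```python
# Assumed definitions for the rate indices in the abstract; the formulas
# are NOT given in the paper and the counts below are hypothetical.

def total_accident_rate(accidents, workers):
    """TAR: accidents per 100 exposed workers (assumed definition)."""
    return 100.0 * accidents / workers

def total_severity_index(severe_outcomes, accidents):
    """TSI: severe outcomes per 100 accidents (assumed definition)."""
    return 100.0 * severe_outcomes / accidents

# Hypothetical 'working at heights' group
tar = total_accident_rate(accidents=472, workers=1000)
tsi = total_severity_index(severe_outcomes=9, accidents=472)
```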

  5. [Comparative analysis of the radionuclide composition in fallout after the Chernobyl and the Fukushima accidents].

    PubMed

    Kotenko, K V; Shinkarev, S M; Abramov, Iu V; Granovskaia, E O; Iatsenko, V N; Gavrilin, Iu I; Margulis, U Ia; Garetskaia, O S; Imanaka, T; Khoshi, M

    2012-01-01

The nuclear accident that occurred at the Fukushima Dai-ichi Nuclear Power Plant (NPP) on March 11, 2011, like the accident at the Chernobyl NPP on April 26, 1986, is rated at level 7 of the INES. It is of interest to make an analysis of the radionuclide composition of the fallout following both accidents. The results of spectrometric measurements were used in this comparative analysis. Two areas following the Chernobyl accident were considered: (1) the near zone of the fallout, the Belarusian part of the central spot extending up to 60 km around the Chernobyl NPP, and (2) the far zone of the fallout, the "Gomel-Mogilev" spot centered 200 km to the north-northeast of the damaged reactor. In the case of the Fukushima accident, the near zone up to about 60 km was considered. The comparative analysis has been done with respect to refractory radionuclides (95Zr, 95Nb, 141Ce, 144Ce), as well as to the intermediate and volatile radionuclides 103Ru, 106Ru, 131I, 134Cs, 137Cs, 140La, 140Ba, and the results of such a comparison have been discussed. With respect to exposure of the public, the most important radionuclides are 131I and 137Cs. For both accidents the ratios of 131I/137Cs in the considered soil samples are in similar ranges: 3-50 for the Chernobyl samples and 5-70 for the Fukushima samples. As with the Chernobyl accident, a clear tendency has been identified for the Fukushima accident that the ratio of 131I/137Cs in the fallout decreases as the ground deposition density of 137Cs increases within the trace of the radioactive cloud. This appears to be a universal tendency for the ratio of 131I/137Cs versus the 137Cs ground deposition density along the trace of a radioactive cloud following a heavy accident at an NPP with radionuclide releases into the environment. This tendency is important for an objective reconstruction of 131I fallout based on the results of 137Cs measurements of soil samples carried out at

  6. Application of Latin hypercube sampling to RADTRAN 4 truck accident risk sensitivity analysis

    SciTech Connect

    Mills, G.S.; Neuhauser, K.S.; Kanipe, F.L.

    1994-12-31

The sensitivity of calculated dose estimates to various RADTRAN 4 inputs is an available output for incident-free analysis because the defining equations are linear and sensitivity to each variable can be calculated in closed mathematical form. However, the necessary linearity is not characteristic of the equations used in calculation of accident dose risk, making a similar tabulation of sensitivity for RADTRAN 4 accident analysis impossible. Therefore, a study of the sensitivity of accident risk results to variation of input parameters was performed using representative routes, isotopic inventories, and packagings. It was determined that, of the approximately two dozen RADTRAN 4 input parameters pertinent to accident analysis, only a subset of five or six has significant influence on typical analyses or is subject to random uncertainties. These five or six variables were selected as candidates for Latin Hypercube Sampling applications. To make the effect of input uncertainties on calculated accident risk more explicit, distributions and limits were determined for two variables which had approximately proportional effects on calculated doses: Pasquill Category probability (PSPROB) and link population density (LPOPD). These distributions and limits were used as input parameters to Sandia's Latin Hypercube Sampling code to generate 50 sets of RADTRAN 4 input parameters, used together with point estimates of other necessary inputs to calculate 50 observations of estimated accident dose risk. Tabulations of the RADTRAN 4 accident risk input variables and their influence on output, plus illustrative examples of the LHS calculations for truck transport situations that are typical of past experience, will be presented.
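The stratified-sampling step described above can be sketched as a minimal Latin hypercube sampler for the two named inputs, PSPROB and LPOPD. The ranges and the toy risk model are invented for illustration; this is not Sandia's LHS code.

```python
import random

# Minimal Latin hypercube sampler: each variable's range is split into n
# equal-probability strata, exactly one sample is drawn per stratum, and
# the samples are shuffled so pairings across variables are random.
# Ranges and the stand-in risk model are assumptions for this sketch.

def latin_hypercube(n, low, high, rng):
    """Draw one sample from each of n equal-width strata of [low, high)."""
    width = (high - low) / n
    samples = [low + (i + rng.random()) * width for i in range(n)]
    rng.shuffle(samples)
    return samples

rng = random.Random(42)
n = 50
psprob = latin_hypercube(n, 0.0, 1.0, rng)    # Pasquill category probability (assumed range)
lpopd = latin_hypercube(n, 10.0, 500.0, rng)  # link population density, people/km^2 (assumed)

# Toy stand-in for a RADTRAN accident-dose-risk calculation: risk taken
# as proportional to both sampled inputs
risks = [p * d * 1.0e-6 for p, d in zip(psprob, lpopd)]
```

The point of the stratification is that 50 runs cover the full input range evenly: sorting either variable's samples puts exactly one in each of the 50 strata, unlike plain random sampling.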

  7. MELCOR code analysis of a severe accident LOCA at Peach Bottom Plant

    SciTech Connect

    Carbajo, J.J. )

    1993-01-01

    A design-basis loss-of-coolant accident (LOCA) concurrent with complete loss of the emergency core cooling systems (ECCSs) has been analyzed for the Peach Bottom atomic station unit 2 using the MELCOR code, version 1.8.1. The purpose of this analysis is to calculate best-estimate times for the important events of this accident sequence and best-estimate source terms. Calculated pressures and temperatures at the beginning of the transient have been compared to results from the Peach Bottom final safety analysis report (FSAR). MELCOR-calculated source terms have been compared to source terms reported in the NUREG-1465 draft.

  8. Fission product transport analysis in a loss of decay heat removal accident at Browns Ferry

    SciTech Connect

    Wichner, R.P.; Weber, C.F.; Hodge, S.A.; Beahm, E.C.; Wright, A.L.

    1984-01-01

    This paper summarizes an analysis of the movement of noble gases, iodine, and cesium fission products within the Mark-I containment BWR reactor system represented by Browns Ferry Unit 1 during a postulated accident sequence initiated by a loss of decay heat removal (DHR) capability following a scram. The event analysis showed that this accident could be brought under control by various means, but the sequence with no operator action ultimately leads to containment (drywell) failure followed by loss of water from the reactor vessel, core degradation due to overheating, and reactor vessel failure with attendant movement of core debris onto the drywell floor.

  9. Space Shuttle Columbia Post-Accident Analysis and Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

Although the loss of the Space Shuttle Columbia and its crew was tragic, the circumstances offered a unique opportunity to examine a multitude of components which had experienced one of the harshest environments ever encountered by engineered materials: a break-up at a velocity in excess of Mach 18 and an altitude exceeding 200,000 feet (63 km), resulting in a debris field 645 miles (1,038 km) long and 10 miles (16 km) wide. Various analytical tools were employed to ascertain the sequence of events leading to the disintegration of the Orbiter and to characterize the features of the debris. The testing and analyses all indicated that a breach in a left wing reinforced carbon/carbon composite leading edge panel was the access point for hot gases generated during re-entry to penetrate the structure of the vehicle and compromise the integrity of the materials and components in that area of the Shuttle. The analytical and elemental testing utilized such techniques as X-Ray Diffraction (XRD), Energy Dispersive X-Ray (EDX) dot mapping, Electron Micro Probe Analysis (EMPA), and X-Ray Photoelectron Spectroscopy (XPS) to characterize the deposition of intermetallics adjacent to the suspected location of the plasma breach in the leading edge of the left wing, Fig. 1.

  10. RELAP5 Application to Accident Analysis of the NIST Research Reactor

    SciTech Connect

    Baek, J.; Cuadra Gascon, A.; Cheng, L.Y.; Diamond, D.

    2012-03-18

Detailed safety analyses have been performed for the 20 MW D2O-moderated research reactor (NBSR) at the National Institute of Standards and Technology (NIST). The time-dependent analysis of the primary system is determined with a RELAP5 transient analysis model that includes the reactor vessel, the pump, heat exchanger, fuel element geometry, and flow channels for both the six inner and twenty-four outer fuel elements. A post-processing of the simulation results has been conducted to evaluate the minimum critical heat flux ratio (CHFR) using the Sudo-Kaminaga correlation. Evaluations are performed for the following accidents: (1) the control rod withdrawal startup accident and (2) the maximum reactivity insertion accident. In both cases the RELAP5 results indicate that there is adequate margin to CHF and no damage to the fuel will occur because of sufficient coolant flow through the fuel channels and the negative scram reactivity insertion.

  11. Construction of a technique plan repository and evaluation system based on AHP group decision-making for emergency treatment and disposal in chemical pollution accidents.

    PubMed

    Shi, Shenggang; Cao, Jingcan; Feng, Li; Liang, Wenyan; Zhang, Liqiu

    2014-07-15

The environmental pollution resulting from chemical accidents has caused increasingly serious concern. Therefore, it is very important to be able to determine in advance the appropriate emergency treatment and disposal technology for different types of chemical accidents. However, the formulation of an emergency plan for chemical pollution accidents is considerably difficult due to the substantial uncertainty and complexity of such accidents. This paper explains how the event tree method was used to create 54 different scenarios for chemical pollution accidents, based on the polluted medium and the dangerous characteristics and properties of the chemicals involved. For each type of chemical accident, feasible emergency treatment and disposal technology schemes were established, covering pollution source control, pollutant non-proliferation, contaminant elimination, and waste disposal. Meanwhile, in order to obtain the optimum emergency disposal technology schemes from the plan repository as soon as a chemical pollution accident occurs, a technique evaluation index system was developed based on a group-decision-improved analytical hierarchy process (AHP); it has been tested using a sudden aniline pollution accident that occurred in a river in December 2012. PMID:24887122
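The core AHP step used in an evaluation system like the one above can be sketched as follows: a pairwise comparison matrix on the Saaty 1-9 scale is reduced to a priority vector, and a consistency ratio is checked. The criteria and judgment values below are hypothetical, not taken from the paper.

```python
# AHP priority-vector sketch with a consistency check. The criteria list
# and the pairwise judgments are invented for illustration.

criteria = ["effectiveness", "cost", "disposal speed"]

# A[i][j]: how much more important criterion i is judged than criterion j
A = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
]

n = len(A)
col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]

# Normalize each column and average across rows -> priority vector (sums to 1)
priorities = [sum(A[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# Consistency: estimate lambda_max from A*w, then compute CI and CR
# (random index RI = 0.58 for a 3x3 matrix); CR < 0.1 is conventionally OK
Aw = [sum(A[i][j] * priorities[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / priorities[i] for i in range(n)) / n
CR = ((lambda_max - n) / (n - 1)) / 0.58
```

In a group-decision variant such as the paper describes, each expert would supply a matrix like `A` and the judgments would be aggregated (e.g., by geometric mean) before computing the priorities.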

  12. Accident analysis of railway transportation of low-level radioactive and hazardous chemical wastes: Application of the ''Maximum Credible Accident'' concept

    SciTech Connect

    Ricci, E.; McLean, R.B.

    1988-09-01

The maximum credible accident (MCA) approach to accident analysis places an upper bound on the potential adverse effects of a proposed action by using conservative but simplifying assumptions. It is often used when data are lacking to support a more realistic scenario or when MCA calculations result in acceptable consequences. The MCA approach can also be combined with realistic scenarios to assess potential adverse effects. This report presents a guide for the preparation of transportation accident analyses based on the use of the MCA concept. Rail transportation of contaminated wastes is used as an example. The example is the analysis of the environmental impact of the potential derailment of a train transporting a large shipment of wastes. The shipment is assumed to be contaminated with polychlorinated biphenyls and low-level radioactivities of uranium and technetium. The train is assumed to plunge into a river used as a source of drinking water. The conclusions from the example accident analysis are based on the calculation of the number of foreseeable premature cancer deaths that might result as a consequence of this accident. These calculations are presented, and the reference material forming the basis for all assumptions and calculations is also provided.

  13. Countermeasures for radiocesium in animal products in Norway after the Chernobyl accident - techniques, effectiveness, and costs

    SciTech Connect

    Brynildsen, L.I.; Strand, P.; Hove, K.

    1996-05-01

    Nine years after the reactor accident in Chernobyl, contamination by radiocesium is still a significant problem in sheep and reindeer production in Norway. To reduce the impact of the accident, effective countermeasures had to be developed and implemented. The levels of radiocesium in meat were reduced by a combination of countermeasures such as special feeding, use of cesium binders (bentonite and Prussian blue), and changes in slaughtering time. The countermeasures were labor intensive and expensive. Costs per averted dose were calculated to range from NOK 1,000 to 100,000 per person-Sv (7 NOK = $1 U.S.), with the use of cesium binders being the least expensive and condemnation of meat the most costly. Dietary advice, which did not include any compensation costs, had a cost of NOK 40 per person-Sv. Apart from the rejection of meat in 1986, countermeasures were deemed to be justified on a cost-benefit basis (less than NOK 600,000 per person-Sv). 26 refs., 1 fig., 4 tabs.
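    The cost-benefit screening the authors describe reduces to comparing each countermeasure's cost per averted person-Sv against a justification threshold; a minimal sketch (the NOK 600,000 limit comes from the abstract, but the individual ratios below are hypothetical examples shaped like the reported ranges):

```python
JUSTIFICATION_LIMIT_NOK_PER_PERSON_SV = 600_000  # cost-benefit limit cited above

def justified_countermeasures(cost_per_averted_dose):
    """Keep countermeasures whose cost per averted person-Sv is at or
    below the justification limit."""
    return {name: ratio
            for name, ratio in cost_per_averted_dose.items()
            if ratio <= JUSTIFICATION_LIMIT_NOK_PER_PERSON_SV}

# Hypothetical ratios in NOK per averted person-Sv
measures = {
    "dietary advice": 40,
    "cesium binders": 1_000,
    "changed slaughter time": 100_000,
    "meat condemnation": 900_000,
}
kept = justified_countermeasures(measures)  # everything but condemnation
```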

  14. Analysis of fission product release behavior during the TMI-2 accident

    SciTech Connect

    Petti, D. A.; Adams, J. P.; Anderson, J. L.; Hobbins, R. R.

    1987-01-01

    An analysis of fission product release during the Three Mile Island Unit 2 (TMI-2) accident has been initiated to provide an understanding of fission product behavior that is consistent with both the best estimate accident scenario and fission product results from the ongoing sample acquisition and examination efforts. ''First principles'' fission product release models are used to describe release from intact, disrupted, and molten fuel. Conclusions relating to fission product release, transport, and chemical form are drawn. 35 refs., 12 figs., 7 tabs.

  15. Multicomponent analysis using established techniques

    NASA Astrophysics Data System (ADS)

    Dillehay, David L.

    1991-04-01

    Recent environmental concerns have greatly increased the need, application and scope of real-time continuous emission monitoring systems. New techniques like Fourier Transform Infrared have been applied with limited success for this application. However, the use of well-tried and established techniques (Gas Filter Correlation and Single Beam Dual Wavelength) combined with sophisticated microprocessor technology have produced reliable monitoring systems with increased measurement accuracy.

  16. A comparison of two micro-beam X-ray emission techniques for actinide elemental distribution in microscopic particles originating from the hydrogen bombs involved in the Palomares (Spain) and Thule (Greenland) accidents

    NASA Astrophysics Data System (ADS)

    Jimenez-Ramos, M. C.; Eriksson, M.; García-López, J.; Ranebo, Y.; García-Tenorio, R.; Betti, M.; Holm, E.

    2010-09-01

    A comparative study has been performed in order to validate and gain confidence in two micro-beam techniques for the characterization of microscopic particles containing actinide elements (mixed plutonium and uranium): particle-induced X-ray emission with a nuclear microprobe (μ-PIXE) and synchrotron-radiation-induced X-ray fluorescence in a confocal alignment (confocal SR μ-XRF). Inter-comparison of the two techniques is essential, as the X-ray production cross-sections for U and Pu differ for protons and photons and are not well defined in the open literature, especially for Pu. The particles studied consisted of nuclear weapons material and originate either from the so-called Palomares accident (Spain, 1966) or from the Thule accident (Greenland, 1968). In the determination of the average Pu/U mass ratios (not corrected for self-absorption) in the analysed microscopic particles, the results from both techniques show very good agreement. In addition, the suitability of both techniques for analysing, with good resolution (down to a few μm), the Pu/U distribution within the particles has been demonstrated. The set of results obtained through both techniques has provided important information concerning the characterization of the remaining fissile material in the areas affected by the aircraft accidents. This type of information is essential for long-term impact assessments of contaminated sites.
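    At its core, deriving a Pu/U mass ratio from characteristic X-ray line intensities is a sensitivity-corrected count ratio; a minimal sketch (the function and numbers are hypothetical, and real work must also address the cross-section uncertainties and self-absorption the abstract mentions):

```python
def element_mass_ratio(counts_a, counts_b, sensitivity_a, sensitivity_b):
    """Mass ratio of element A to element B from X-ray line counts,
    where each sensitivity is the line's production cross-section times
    detection efficiency, per unit mass of the element."""
    return (counts_a / sensitivity_a) / (counts_b / sensitivity_b)

# Hypothetical line counts and per-element sensitivities
pu_to_u = element_mass_ratio(counts_a=1200, counts_b=5000,
                             sensitivity_a=0.8, sensitivity_b=1.0)
```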

  17. Preliminary Analysis of Aircraft Loss of Control Accidents: Worst Case Precursor Combinations and Temporal Sequencing

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Groff, Loren; Newman, Richard L.; Foster, John V.; Crider, Dennis H.; Klyde, David H.; Huston, A. McCall

    2014-01-01

    Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes, and can result from a wide spectrum of hazards, often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and their validation must provide a means of assessing system effectiveness and coverage of these hazards. This requires the definition of a comprehensive set of LOC test scenarios based on accident and incident data as well as future risks. This paper defines a comprehensive set of accidents and incidents over a recent 15 year period, and presents preliminary analysis results to identify worst-case combinations of causal and contributing factors (i.e., accident precursors) and how they sequence in time. Such analyses can provide insight in developing effective solutions for LOC, and form the basis for developing test scenarios that can be used in evaluating them. Preliminary findings based on the results of this paper indicate that system failures or malfunctions, crew actions or inactions, vehicle impairment conditions, and vehicle upsets contributed the most to accidents and fatalities, followed by inclement weather or atmospheric disturbances and poor visibility. Follow-on research will include finalizing the analysis through a team consensus process, defining future risks, and developing a comprehensive set of test scenarios with correlation to the accidents, incidents, and future risks. Since enhanced engineering simulations are required for batch and piloted evaluations under realistic LOC precursor conditions, these test scenarios can also serve as a high-level requirement for defining the engineering simulation enhancements needed for generating them.
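    The worst-case-combination tally described above amounts to counting co-occurring factor sets across accident records; a minimal sketch with hypothetical records (the factor names echo the categories in the abstract, not the actual dataset):

```python
from collections import Counter

def precursor_combinations(records):
    """Tally how often each combination of causal and contributing
    factors co-occurs across accident/incident records."""
    return Counter(frozenset(r["factors"]) for r in records)

# Hypothetical LOC records, each tagged with its precursor factors
records = [
    {"factors": ["system failure", "crew action"]},
    {"factors": ["crew action", "system failure"]},
    {"factors": ["vehicle upset", "inclement weather"]},
]
ranked = precursor_combinations(records).most_common()
```

Temporal sequencing would additionally need ordered tuples of factors rather than unordered sets.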

  18. The Role of Materials Degradation and Analysis in the Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steven J.

    2006-01-01

    The efforts following the loss of the Space Shuttle Columbia included debris recovery, reconstruction, and analysis. The debris was subjected to myriad quantitative and semiquantitative chemical analysis techniques, ranging from examination via the scanning electron microscope (SEM) with energy dispersive spectrometer (EDS) to X-Ray diffraction (XRD) and electron probe micro-analysis (EPMA). The results from the work with the debris helped the investigators determine the location where a breach likely occurred in the leading edge of the left wing during lift off of the Orbiter from the Kennedy Space Center. Likewise, the information evidenced by the debris was also crucial in ascertaining the path of impinging plasma flow once it had breached the wing. After the Columbia Accident Investigation Board (CAIB) issued its findings, the major portion of the investigation was concluded. However, additional work remained to be done on many pieces of debris from portions of the Orbiter which were not directly related to the initial impact during ascent. This subsequent work was not only performed in the laboratory, but was also performed with portable equipment, including examination via portable X-Ray fluorescence (XRF) and Fourier transform infrared spectroscopy (FTIR). Likewise, acetate and silicon-rubber replicas of various fracture surfaces were obtained for later macroscopic and fractographic examination. This paper will detail the efforts and findings from the initial investigation, as well as present results obtained by the later examination and analysis of debris from the Orbiter including its windows, bulkhead structures, and other components which had not been examined during the primary investigation.

  19. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines

    PubMed Central

    Baka, Aikaterini D.; Uzunoglu, Nikolaos K.

    2014-01-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

  20. Analysis of Two Electrocution Accidents in Greece that Occurred due to Unexpected Re-energization of Power Lines.

    PubMed

    Baka, Aikaterini D; Uzunoglu, Nikolaos K

    2014-09-01

    Investigation and analysis of accidents are critical elements of safety management. The over-riding purpose of an organization in carrying out an accident investigation is to prevent similar accidents, as well as seek a general improvement in the management of health and safety. Hundreds of workers have suffered injuries while installing, maintaining, or servicing machinery and equipment due to sudden re-energization of power lines. This study presents and analyzes two electrical accidents (1 fatal injury and 1 serious injury) that occurred because the power supply was reconnected inadvertently or by mistake. PMID:25379331

  1. SAMPSON Parallel Computation for Sensitivity Analysis of TEPCO's Fukushima Daiichi Nuclear Power Plant Accident

    NASA Astrophysics Data System (ADS)

    Pellegrini, M.; Bautista Gomez, L.; Maruyama, N.; Naitoh, M.; Matsuoka, S.; Cappello, F.

    2014-06-01

    On March 11th 2011 a high-magnitude earthquake and the consequent tsunami struck the east coast of Japan, resulting in a nuclear accident unprecedented in duration and extent. After scram was initiated at all power stations affected by the earthquake, diesel generators operated as designed until the tsunami waves reached the power plants located on the east coast. This had a catastrophic impact on the availability of plant safety systems at TEPCO's Fukushima Daiichi, leading to station blackout in units 1 to 3. In this article the accident scenario is studied with the SAMPSON code. SAMPSON is a severe accident computer code composed of hierarchical modules that account for the diverse physics involved in the various phases of the accident evolution. A preliminary parallelization analysis of the code was performed using state-of-the-art tools, and we demonstrate how this work can benefit nuclear safety analysis. This paper shows that inter-module parallelization can reduce the time to solution by more than 20%. Furthermore, the parallel code was applied to a sensitivity study of the alternative water injection into TEPCO's Fukushima Daiichi unit 3. Results show that the core melting progression is extremely sensitive to the amount and timing of water injection, resulting in a high probability of partial core melting for unit 3.

  2. Probabilistic analysis of accident precursors in the nuclear industry.

    PubMed

    Hulsmans, M; De Gelder, P

    2004-07-26

    Feedback of operating experience has always been an important issue in the nuclear industry. A probabilistic safety analysis (PSA) can be used as a tool to analyse how an operational event might have developed adversely in order to obtain a quantitative assessment of the safety significance of the event. This process is called PSA-based event analysis (PSAEA). A comprehensive set of PSAEA guidelines was developed by an international project. The main characteristics of this methodology are summarised. This approach to analyse incidents can be used to meet different objectives of utilities or nuclear regulators. The paper describes the main objectives and the experiences of the Belgian nuclear regulatory organisation AVN with the application of PSA-based event analysis. Some interesting aspects of the process of PSAEA are further developed and underlined. Several case studies are discussed and an overview of the obtained results is given. Finally, the interest of a broad and interactive forum on PSAEA is highlighted. PMID:15231351

  3. Detailed fuel spray analysis techniques

    NASA Technical Reports Server (NTRS)

    Mularz, E. J.; Bosque, M. A.; Humenik, F. M.

    1983-01-01

    Detailed fuel spray analyses are a necessary input to the analytical modeling of the complex mixing and combustion processes which occur in advanced combustor systems. It is anticipated that by controlling fuel-air reaction conditions, combustor temperatures can be better controlled, leading to improved combustion system durability. Thus, a research program is underway to demonstrate the capability to measure liquid droplet size, velocity, and number density throughout a fuel spray and to utilize this measurement technique in laboratory benchmark experiments. The research activities from two contracts and one grant are described with results to date. The experiment to characterize fuel sprays is also described. These experiments and data should be useful for application to and validation of turbulent flow modeling to improve the design of future advanced technology engines.

  4. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analyses are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences among the three techniques lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data will be presented.
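    For a two-component signal, one polarization quantity can be read directly off the spectral matrix from its invariants, with no principal-axis transformation; a minimal sketch (this is the standard coherency-matrix identity, offered for illustration rather than as one of the four techniques compared above):

```python
def degree_of_polarization(s):
    """Degree of polarization of a 2x2 Hermitian spectral (coherency)
    matrix S, via sqrt(1 - 4 det(S) / tr(S)^2): 0 for unpolarized
    noise, 1 for a fully polarized wave."""
    trace = (s[0][0] + s[1][1]).real
    det = (s[0][0] * s[1][1] - s[0][1] * s[1][0]).real
    return (1.0 - 4.0 * det / trace ** 2) ** 0.5

noise = [[1.0, 0.0], [0.0, 1.0]]    # unpolarized: DOP = 0
circular = [[1.0, 1j], [-1j, 1.0]]  # circularly polarized: DOP = 1
```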

  5. Modern accident investigation and analysis - An executive guide

    NASA Astrophysics Data System (ADS)

    Ferry, T. S.

    The first part of the book primes the reader for mishap investigation. Three chapters lead into the serious business of investigation through a discussion of the need for and examination of who has a stake in investigation. This is followed by coverage of the preparation that makes an efficient investigation possible. Finally a description is presented of the first important steps in the investigation, conducted at the scene of a mishap. The interacting roles of man, environment, and systems are examined, taking into account unsafe acts, human limitations, the various types of environments, different types of materials, and aspects of systems investigation. Attention is also given to analytical techniques, the mishap report, information collection, and legal aspects of investigation.

  6. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Gregg L. Sharp; R. T. McCracken

    2003-06-01

    The regulatory requirement to develop an upgraded safety basis for a DOE nuclear facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, “Safety Basis Requirements,” requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, “Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants” as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  7. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    SciTech Connect

    Sharp, G.L.; McCracken, R.T.

    2003-05-13

    The regulatory requirement to develop an upgraded safety basis for a DOE Nuclear Facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, ''Safety Basis Requirements,'' requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, ''Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants'' as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  8. Retrospective reconstruction of Iodine-131 distribution at the Fukushima Daiichi Nuclear Power Plant accident by analysis of Iodine-129

    NASA Astrophysics Data System (ADS)

    Matsuzaki, Hiroyuki; Muramatsu, Yasuyuki; Toyama, Chiaki; Ohno, Takeshi; Kusuno, Haruka; Miyake, Yasuto; Honda, Maki

    2014-05-01

    Among the various radioactive nuclides emitted from the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident, Iodine-131 displayed high radioactivity just after the accident. Moreover, if taken into the human body, Iodine-131 concentrates in the thyroid and may cause thyroid cancer. Recognition of the risk posed by Iodine-131 doses originated from epidemiological studies of the Chernobyl accident [1]. It is thus important to investigate the detailed deposition distribution of I-131 in order to evaluate the radiation dose due to I-131 and monitor its influence on human health. However, I-131 decays so rapidly (half-life = 8.02 d) that it cannot be detected several months after an accident. By the time the risk of I-131 was recognized after Chernobyl, several years had passed. Reconstruction of the I-131 distribution from the Cs-137 distribution was not successful because iodine and cesium behave differently owing to their different chemical properties. The long-lived radioactive isotope I-129 (half-life = 1.57E+7 yr), which, like I-131, is also a fission product, is an ideal proxy for I-131 because the two are chemically identical. Several studies tried to quantify I-129 in the 1990s, but the analytical techniques, especially AMS (Accelerator Mass Spectrometry), were not yet well developed and few AMS facilities were available. Moreover, because of the lack of sufficient data on I-131 just after the accident, the isotopic ratio I-129/I-131 of the Chernobyl-derived iodine could not be estimated precisely [2]. Calculated estimates of the isotopic ratio were scattered. For the FDNPP accident, in contrast, the detailed I-131 distribution is being successfully reconstructed through systematic I-129 measurements by our group. We measured soil samples selected from a soil collection taken from every 2 km (or 5 km, in distant areas) mesh region around the FDNPP, conducted by the Japanese Ministry of

  9. Combining task analysis and fault tree analysis for accident and incident analysis: a case study from Bulgaria.

    PubMed

    Doytchev, Doytchin E; Szwillus, Gerd

    2009-11-01

    Understanding the reasons for incident and accident occurrence is important for an organization's safety. Different methods have been developed to achieve this goal. To better understand the human behaviour in incident occurrence we propose an analysis concept that combines Fault Tree Analysis (FTA) and Task Analysis (TA). The former method identifies the root causes of an accident/incident, while the latter analyses the way people perform the tasks in their work environment and how they interact with machines or colleagues. These methods were complemented with the use of the Human Error Identification in System Tools (HEIST) methodology and the concept of Performance Shaping Factors (PSF) to deepen the insight into the error modes of an operator's behaviour. HEIST shows the external error modes that caused the human error and the factors that prompted the human to err. To show the validity of the approach, a case study at a Bulgarian Hydro power plant was carried out. An incident - the flooding of the plant's basement - was analysed by combining the afore-mentioned methods. The case study shows that Task Analysis in combination with other methods can be applied successfully to human error analysis, revealing details about erroneous actions in a realistic situation. PMID:19819365
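    The FTA side of the combined approach can be sketched as boolean gate evaluation over basic events (the tree below is a hypothetical reading of the basement-flooding incident, not the study's actual fault tree):

```python
def evaluate(node, events):
    """Evaluate a fault tree: a node is either a basic-event name or a
    (gate, children) tuple with gate in {"AND", "OR"}."""
    if isinstance(node, str):
        return events[node]
    gate, children = node
    values = [evaluate(child, events) for child in children]
    return all(values) if gate == "AND" else any(values)

# Hypothetical top event: the basement floods if the drainage pump
# fails while its alarm is missed, or if a drain valve is left open.
flooding = ("OR", [("AND", ["pump_failure", "alarm_missed"]), "valve_open"])
```

Task Analysis and HEIST would then be applied to basic events such as "alarm_missed" to explain why the operator erred.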

  10. Innovative Techniques Simplify Vibration Analysis

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  11. Coupled thermal analysis applied to the study of the rod ejection accident

    SciTech Connect

    Gonnet, M.

    2012-07-01

    An advanced methodology for the assessment of fuel-rod thermal margins under RIA conditions has been developed by AREVA NP SAS. With the emergence of RIA analytical criteria, the study of the Rod Ejection Accident (REA) would normally require the analysis of each fuel rod, slice by slice, over the whole core. Up to now, the strategy used to overcome this difficulty has been to perform separate analyses of sampled fuel pins with conservative hypotheses for thermal properties and boundary conditions. In the advanced methodology, the evaluation model for the REA integrates the node-average fuel and coolant properties calculation for neutron feedback purposes as well as the peak fuel and coolant time-dependent properties for criteria checking. The calculation grid for peak fuel and coolant properties can be specified from the assembly pitch down to the cell pitch. A comparative analysis shows that the coupled methodology reduces the excessive conservatism of the uncoupled approach. (authors)

  12. Analysis of the FeCrAl Accident Tolerant Fuel Concept Benefits during BWR Station Blackout Accidents

    SciTech Connect

    Robb, Kevin R

    2015-01-01

    Iron-chromium-aluminum (FeCrAl) alloys are being considered for fuel concepts with enhanced accident tolerance. FeCrAl alloys have very slow oxidation kinetics and good strength at high temperatures. FeCrAl could be used for fuel cladding in light water reactors and/or as channel box material in boiling water reactors (BWRs). To estimate the potential safety gains afforded by the FeCrAl concept, the MELCOR code was used to analyze a range of postulated station blackout severe accident scenarios in a BWR/4 reactor employing FeCrAl. The simulations utilize the most recently known thermophysical properties and oxidation kinetics for FeCrAl. Overall, when compared to the traditional Zircaloy-based cladding and channel box, the FeCrAl concept provides a few extra hours of time for operators to take mitigating actions and/or for evacuations to take place. A coolable core geometry is retained longer, enhancing the ability to stabilize an accident. Finally, due to the slower oxidation kinetics, substantially less hydrogen is generated, and the generation is delayed in time. This decreases the amount of non-condensable gases in containment and the potential for deflagrations to inhibit the accident response.
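    The cladding-oxidation contrast driving these results follows parabolic kinetics with an Arrhenius rate constant; a minimal sketch (the prefactor and activation energies below are purely illustrative assumptions, not MELCOR's FeCrAl or Zircaloy correlations):

```python
import math

GAS_CONSTANT = 8.314  # J/(mol*K)

def oxide_mass_gain(time_s, temp_k, prefactor, activation_j_mol):
    """Parabolic oxidation: (mass gain)^2 = Kp * t, with the rate
    constant Kp = A * exp(-Q / (R * T))."""
    kp = prefactor * math.exp(-activation_j_mol / (GAS_CONSTANT * temp_k))
    return math.sqrt(kp * time_s)

# Illustrative comparison at 1500 K for one hour: a higher activation
# energy (slower, FeCrAl-like kinetics) yields far less oxide growth
# and, correspondingly, less hydrogen generation.
slow_alloy = oxide_mass_gain(3600, 1500, prefactor=1.0, activation_j_mol=3.0e5)
fast_alloy = oxide_mass_gain(3600, 1500, prefactor=1.0, activation_j_mol=1.5e5)
```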

  13. Human and organisational factors in maritime accidents: analysis of collisions at sea using the HFACS.

    PubMed

    Chauvin, Christine; Lardjane, Salim; Morel, Gaël; Clostermann, Jean-Pierre; Langard, Benoît

    2013-10-01

    Over the last decade, the shipping industry has implemented a number of measures aimed at improving its safety level (such as new regulations or new forms of team training). Despite this evolution, shipping accidents, and particularly collisions, remain a major concern. This paper presents a modified version of the Human Factors Analysis and Classification System, which has been adapted to the maritime context and used to analyse human and organisational factors in collisions reported by the Marine Accident and Investigation Branch (UK) and the Transportation Safety Board (Canada). The analysis shows that most collisions are due to decision errors. At the precondition level, it highlights the importance of the following factors: poor visibility and misuse of instruments (environmental factors), loss of situation awareness or deficit of attention (conditions of operators), deficits in inter-ship communications or Bridge Resource Management (personnel factors). At the leadership level, the analysis reveals the frequent planning of inappropriate operations and non-compliance with the Safety Management System (SMS). The Multiple Accident Analysis provides an important finding concerning three classes of accidents. Inter-ship communications problems and Bridge Resource Management deficiencies are closely linked to collisions occurring in restricted waters and involving pilot-carrying vessels. Another class of collisions is associated with situations of poor visibility, in open sea, and shows deficiencies at every level of the socio-technical system (technical environment, condition of operators, leadership level, and organisational level). The third class is characterised by non-compliance with the SMS. This study shows the importance of Bridge Resource Management for situations of navigation with a pilot on board in restricted waters. It also points out the necessity to investigate, for situations of navigation in open sea, the masters' decisions in critical conditions.

  14. BNL severe-accident sequence experiments and analysis program. [PWR; BWR

    SciTech Connect

    Greene, G.A.; Ginsberg, T.; Tutu, N.K.

    1983-01-01

    In the analysis of degraded core accidents, the two major sources of pressure loading on light water reactor containments are: steam generation from core debris-water thermal interactions; and molten core-concrete interactions. Experiments are in progress at BNL in support of analytical model development related to aspects of the above containment loading mechanisms. The work supports development and evaluation of the CORCON (Muir, 1981) and MARCH (Wooton, 1980) computer codes. Progress in the two programs is described.

  15. Comparing Techniques for Certified Static Analysis

    NASA Technical Reports Server (NTRS)

    Cachera, David; Pichardie, David

    2009-01-01

    A certified static analysis is an analysis whose semantic validity has been formally proved correct with a proof assistant. The recent increasing interest in using proof assistants for mechanizing programming language metatheory has given rise to several approaches for certification of static analysis. We propose a panorama of these techniques and compare their respective strengths and weaknesses.

  16. An association between dietary habits and traffic accidents in patients with chronic liver disease: A data-mining analysis

    PubMed Central

    KAWAGUCHI, TAKUMI; SUETSUGU, TAKURO; OGATA, SHYOU; IMANAGA, MINAMI; ISHII, KUMIKO; ESAKI, NAO; SUGIMOTO, MASAKO; OTSUYAMA, JYURI; NAGAMATSU, AYU; TANIGUCHI, EITARO; ITOU, MINORU; ORIISHI, TETSUHARU; IWASAKI, SHOKO; MIURA, HIROKO; TORIMURA, TAKUJI

    2016-01-01

    The incidence of traffic accidents in patients with chronic liver disease (CLD) is high in the USA. However, the characteristics of patients, including dietary habits, differ between Japan and the USA. The present study investigated the incidence of traffic accidents in CLD patients and the clinical profiles associated with traffic accidents in Japan using a data-mining analysis. A cross-sectional study was performed and 256 subjects [148 CLD patients (CLD group) and 106 patients with other digestive diseases (disease control group)] were enrolled; 2 patients were excluded. The incidence of traffic accidents was compared between the two groups. Independent factors for traffic accidents were analyzed using logistic regression and decision-tree analyses. The incidence of traffic accidents did not differ between the CLD and disease control groups (8.8 vs. 11.3%). The results of the logistic regression analysis showed that yoghurt consumption was the only independent risk factor for traffic accidents (odds ratio, 0.37; 95% confidence interval, 0.16–0.85; P=0.0197). Similarly, the results of the decision-tree analysis showed that yoghurt consumption was the initial divergence variable. In patients who consumed yoghurt habitually, the incidence of traffic accidents was 6.6%, while that in patients who did not consume yoghurt was 16.0%. CLD was not identified as an independent factor in the logistic regression and decision-tree analyses. In conclusion, the difference in the incidence of traffic accidents in Japan between the CLD and disease control groups was insignificant. Furthermore, yoghurt consumption was an independent negative risk factor for traffic accidents in patients with digestive diseases, including CLD. PMID:27123257
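    The decision tree's initial divergence amounts to splitting the cohort on a binary feature and comparing accident incidence per branch; a minimal sketch with synthetic records (not the study's data, though the rates are chosen to echo the reported 6.6% vs. 16.0% contrast):

```python
def incidence_by_split(records, feature):
    """Accident incidence within each branch of a binary split."""
    branches = {True: [0, 0], False: [0, 0]}  # feature value -> [accidents, total]
    for record in records:
        branch = branches[bool(record[feature])]
        branch[0] += int(record["accident"])
        branch[1] += 1
    return {key: hits / total for key, (hits, total) in branches.items()}

# Synthetic cohort: 1 accident among 10 yoghurt eaters, 2 among 10 others
records = ([{"yoghurt": True, "accident": i == 0} for i in range(10)] +
           [{"yoghurt": False, "accident": i < 2} for i in range(10)])
rates = incidence_by_split(records, "yoghurt")
```

A full decision-tree analysis would choose the split feature by an impurity criterion and recurse into each branch.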

  17. A New Microcell Technique for NMR Analysis.

    ERIC Educational Resources Information Center

    Yu, Sophia J.

    1987-01-01

    Describes a new laboratory technique for working with small samples of compounds used in nuclear magnetic resonance (NMR) analysis. Demonstrates how microcells can be constructed for each experiment and samples can be recycled. (TW)

  18. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
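
    The kind of Monte Carlo comparison described above can be reproduced in miniature: for white noise the true spectrum is flat, so the spread of a spectral estimate across repeated trials measures estimator variance. The segment-averaging scheme below is a generic Bartlett-style estimator, not NASA's actual program.

```python
import numpy as np

rng = np.random.default_rng(0)

def periodogram(x):
    """Raw periodogram via the FFT (two-sided, scaled by 1/N)."""
    return np.abs(np.fft.fft(x)) ** 2 / len(x)

def averaged_periodogram(x, k):
    """Bartlett-style estimate: average the periodograms of k
    non-overlapping segments (trades resolution for variance)."""
    return np.mean([periodogram(s) for s in np.array_split(x, k)], axis=0)

# White noise has a flat true spectrum (power = 1 at every frequency),
# so the trial-to-trial spread of one bin measures estimator variance.
raw, avg = [], []
for _ in range(200):
    x = rng.standard_normal(256)
    raw.append(periodogram(x)[5])              # one bin, raw estimate
    avg.append(averaged_periodogram(x, 8)[3])  # one bin, averaged estimate
print(np.var(raw), np.var(avg))  # averaging reduces variance markedly
```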

  19. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radioactive material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.
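
    The expert-elicitation process described above ultimately combines several experts' judgments into a single uncertainty distribution for a code input variable. A minimal sketch of one common approach, equal-weight pooling of quantile judgments; the expert values and the crude piecewise-linear tail treatment are illustrative assumptions, not the study's protocol.

```python
import random

random.seed(1)

# Each expert's judgment: (5th, 50th, 95th percentile) of an uncertain
# code-input variable (values here are invented for illustration).
experts = [(0.2, 1.0, 5.0), (0.5, 2.0, 8.0), (0.1, 0.8, 3.0)]

def sample_expert(q05, q50, q95):
    """Draw from a piecewise-linear CDF through the three quantiles."""
    u = random.random()
    if u < 0.05:
        return q05 * u / 0.05                    # crude lower tail
    if u < 0.50:
        return q05 + (q50 - q05) * (u - 0.05) / 0.45
    if u < 0.95:
        return q50 + (q95 - q50) * (u - 0.50) / 0.45
    return q95 * (1 + (u - 0.95))                # crude upper tail

# Equal-weight pool: pick an expert at random, then sample their CDF.
pooled = sorted(sample_expert(*random.choice(experts)) for _ in range(10000))
median = pooled[len(pooled) // 2]
print(round(median, 2))
```

    The sorted pooled sample approximates the combined distribution, from which any quantile can be read off for use as a code input distribution.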

  20. Thermodynamic analysis of cesium and iodine behavior in severe light water reactor accidents

    NASA Astrophysics Data System (ADS)

    Minato, Kazuo

    1991-11-01

    In order to understand the release and transport behavior of cesium (Cs) and iodine (I) in severe light water reactor accidents, chemical forms of Cs and I in steam-hydrogen mixtures were analyzed thermodynamically. In the calculations, reactions of boron (B) with Cs were taken into consideration. The analysis showed that B plays an important role in determining the chemical forms of Cs. The main Cs-containing species are CsBO2(g) and CsBO2(l), depending on temperature. The contribution of CsOH(g) is minor. The main I-containing species are HI(g) and CsI(g) over the wide ranges of the parameters considered. Calculations were also carried out under the conditions of the Three Mile Island Unit 2 accident.

  1. Preliminary analysis of graphite dust releasing behavior in accident for HTR

    SciTech Connect

    Peng, W.; Yang, X. Y.; Yu, S. Y.; Wang, J.

    2012-07-01

    The behavior of graphite dust is important to the safety of High Temperature Gas-cooled Reactors (HTRs). This study investigated the flow of graphite dust in the helium mainstream. Analysis of the forces acting on the graphite dust indicated that gas drag plays the dominant role. Based on this finding, an experimental system was set up to study dust release behavior in accidents. Air driven by a centrifugal fan is used as the working fluid instead of helium, because helium is expensive and prone to leakage, which makes the loop difficult to seal. Graphite particles with the same size distribution as in an HTR are added to the experimental loop. The graphite dust release behavior during a loss-of-coolant accident will be investigated using a sonic nozzle. (authors)

  2. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA late health effects models.

  3. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.; Grupa, J.B.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA early health effects models.

  4. Safety culture and accident analysis--a socio-management approach based on organizational safety social capital.

    PubMed

    Rao, Suman

    2007-04-11

    One of the biggest challenges for organizations in today's competitive business environment is to create and preserve a self-sustaining safety culture. Typically, the key drivers of safety culture in many organizations are regulation, audits, safety training, various types of employee exhortations to comply with safety norms, etc. However, less evident factors like networking relationships and social trust amongst employees, as well as extended networking relationships and social trust of organizations with external stakeholders like government, suppliers, regulators, etc.--which together constitute the safety social capital in the organization--also seem to influence the sustenance of organizational safety culture. Can erosion in safety social capital cause deterioration in safety culture and contribute to accidents? If so, how does it contribute? As existing accident analysis models do not provide answers to these questions, CAMSoC (Curtailing Accidents by Managing Social Capital), an accident analysis model, is proposed. As an illustration, five accidents: Bhopal (India), Hyatt Regency (USA), Tenerife (Canary Islands), Westray (Canada) and Exxon Valdez (USA) have been analyzed using CAMSoC. This limited cross-industry analysis provides two key socio-management insights: the biggest source of motivation that causes deviant behavior leading to accidents is 'Faulty Value Systems'. The second biggest source is 'Enforceable Trust'. From a management control perspective, deterioration in safety culture and the resultant accidents stem more from 'action controls' than from explicit 'cultural controls'. Future research directions to enhance the model's utility through layering are addressed briefly. PMID:16911855

  5. Analysis of pedestrian accident costs in Sudan using the willingness-to-pay method.

    PubMed

    Mofadal, Adam I A; Kanitpong, Kunnawee; Jiwattanakulpaisarn, Piyapong

    2015-05-01

    The willingness-to-pay (WTP) with contingent valuation (CV) method has been proven to be a valid tool for the valuation of non-market goods or socio-economic costs of road traffic accidents among communities in developed and developing countries. Research on accident costing tends to estimate the value of statistical life (VOSL) for all road users by providing a principle for the evaluation of road safety interventions in cost-benefit analysis. As in many other developing countries, the economic loss of traffic accidents in Sudan is noticeable; however, analytical research to estimate the magnitude and impact of that loss is lacking. Reports have shown that pedestrians account for more than 40% of the total number of fatalities. In this study, the WTP-CV approach was used to determine the amount of money that pedestrians in Sudan are willing to pay to reduce the risk of their own death. The impact of the socioeconomic factors, risk levels, and walking behaviors of pedestrians on their WTP for fatality risk reduction was also evaluated. Data were collected from two cities-Khartoum and Nyala-using a survey questionnaire that included 1400 respondents. The WTP-CV Payment Card Questionnaire was designed to ensure that Sudan pedestrians can easily determine the amount of money that would be required to reduce the fatality risk from a pedestrian-related accident. The analysis results show that the estimated VOSL for Sudanese pedestrians ranges from US$0.019 to US$0.101 million. In addition, the willingness-to-pay by Sudanese pedestrians to reduce their fatality risk tends to increase with age, household income, educational level, safety perception, and average time spent on social activities with family and community. PMID:25794921
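
    The VOSL figures reported above come from dividing stated willingness-to-pay by the fatality-risk reduction it purchases. A minimal sketch of that calculation with invented survey responses (not the Sudan data):

```python
# Hypothetical survey responses: each tuple is (annual WTP in US$,
# annual fatality-risk reduction purchased). Values are invented.
responses = [(15, 4e-4), (30, 4e-4), (10, 2e-4), (25, 4e-4), (8, 2e-4)]

def vosl(wtp, delta_risk):
    """Value of statistical life implied by one response: WTP / delta-risk."""
    return wtp / delta_risk

values = [vosl(w, r) for w, r in responses]
mean_vosl = sum(values) / len(values)
print(f"US${mean_vosl / 1e6:.3f} million")  # → US$0.053 million
```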

  6. The Application of Electron Microscopy Techniques to the Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    Shah, Sandeep; Jerman, Greg

    2005-01-01

    The Space Shuttle Columbia was returning from a 16-day research mission, STS-107, with nominal system performance prior to the beginning of the entry interface into Earth's upper atmosphere. Approximately one minute and twenty-four seconds into the peak heating region of the entry interface, an off-nominal temperature rise was observed in the left main landing gear brake line. Nearly seven minutes later, all contact was lost with Columbia. Debris was observed periodically exiting the Shuttle's flight path throughout the reentry profile over California, Nevada, and New Mexico, until its final breakup over Texas. During the subsequent investigation, electron microscopy techniques were crucial in revealing the location of the fatal damage that resulted in the loss of Columbia and her crew.

  7. Calculation notes in support of TWRS FSAR spray leak accident analysis

    SciTech Connect

    Hall, B.W.

    1996-09-25

    This document contains the detailed calculations that support the spray leak accident analysis in the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The consequence analyses in this document form the basis for the selection of controls to mitigate or prevent spray leaks throughout TWRS. Pressurized spray leaks can occur due to a breach in containment barriers along transfer routes during waste transfers. Spray leaks are of particular safety concern because, depending on leak dimensions and waste pressure, they can be relatively efficient generators of dispersible-sized aerosols that can transport downwind to onsite and offsite receptors. Waste is transferred between storage tanks and between processing facilities and storage tanks in TWRS through a system of buried transfer lines. Pumps for transferring waste and jumpers and valves for rerouting waste are located inside below-grade pits and structures that are normally covered. Pressurized spray leaks can emanate to the atmosphere due to breaches in waste-transfer-associated equipment inside these structures should the structures be uncovered at the time of the leak. Pressurized spray leaks can develop through holes or cracks in transfer piping, valve bodies or pump casings caused by such mechanisms as corrosion, erosion, thermal stress, or water hammer. Leaks through degraded valve packing, jumper gaskets, or pump seals can also result in pressurized spray releases. Mechanisms that can degrade seals, packing and gaskets include aging, radiation hardening, thermal stress, etc. Another common cause of spray leaks inside transfer enclosures is misaligned jumpers caused by human error. A spray leak inside a DST valve pit during a transfer of aging waste was selected as the bounding, representative accident for detailed analysis. Sections 2 through 5 below develop this representative accident using the DOE-STD-3009 format. Section 2 describes the unmitigated and mitigated accident.

  8. Analysis 320 coal mine accidents using structural equation modeling with unsafe conditions of the rules and regulations as exogenous variables.

    PubMed

    Zhang, Yingyu; Shao, Wei; Zhang, Mengjia; Li, Hejun; Yin, Shijiu; Xu, Yingjun

    2016-07-01

    Mining has historically been considered a high-risk industry worldwide. In China, deaths caused by coal mine accidents exceed those from all other types of accidents combined. Statistics of 320 coal mine accidents in Shandong province show that all accidents contain indicators of "unsafe conditions of the rules and regulations," with a frequency of 1590, accounting for 74.3% of the total frequency of 2140. "Unsafe behaviors of the operator" is another important contributory factor, which mainly includes "operator error" and "venturing into dangerous places." A systems analysis approach was applied using structural equation modeling (SEM) to examine the interactions between the contributory factors of coal mine accidents. The results lead to three conclusions. (i) "Unsafe conditions of the rules and regulations" affect the "unsafe behaviors of the operator," "unsafe conditions of the equipment," and "unsafe conditions of the environment." (ii) The three influencing factors of coal mine accidents (with the frequency of effect relation in descending order) are "lack of safety education and training," "rules and regulations of safety production responsibility," and "rules and regulations of supervision and inspection." (iii) The three influenced factors (with the frequency in descending order) of coal mine accidents are "venturing into dangerous places," "poor workplace environment," and "operator error." PMID:27085591

  9. Analysis on the Density Driven Air-Ingress Accident in VHTRs

    SciTech Connect

    Eung Soo Kim; Chang Oh; Richard Schultz; David Petti

    2008-11-01

    Air ingress following a pipe rupture is considered the most serious accident in VHTRs because of potential consequences such as core heat-up, loss of structural integrity, and toxic gas release. Previously, it was believed that the main air-ingress mechanism in this accident is the molecular diffusion process between the reactor core and the cavity. However, according to some recent studies, there is another, faster air-ingress process that had not been considered before: density-driven stratified flow. The potential for density-driven stratified air ingress into the VHTR following a large-break LOCA was first described in the NGNP Methods Technical Program, based on stratified flow studies performed with liquids. Density-gradient driven stratified flow in advanced reactor systems has been the subject of active research for well over a decade, since density-gradient dominated stratified flow is an inherent characteristic of the passive systems used in advanced reactors. Recently, Oh et al. performed a CFD analysis of the stratified flow in the VHTR and showed that this effect can significantly accelerate the air-ingress process. They also proposed replacing the original air-ingress scenario, based on molecular diffusion, with one based on stratified flow. This paper focuses on the effect of stratified flow on the results of the air-ingress accident in VHTRs.
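
    A back-of-the-envelope feel for why stratified flow is so much faster than diffusion: counter-current exchange through a horizontal break scales roughly as sqrt(g'H), where g' is the reduced gravity between the heavy gas (air) and the light gas (helium). The coefficient and geometry below are illustrative assumptions, not values from the paper.

```python
import math

def reduced_gravity(rho_heavy, rho_light, g=9.81):
    """Reduced gravity g' = g * (rho_heavy - rho_light) / rho_heavy."""
    return g * (rho_heavy - rho_light) / rho_heavy

def exchange_flow(area, height, g_prime, C=0.25):
    """Order-of-magnitude counter-current exchange flow through a
    horizontal break: Q ~ C * A * sqrt(g' * H). C is an assumed
    empirical coefficient, not a value from the paper."""
    return C * area * math.sqrt(g_prime * height)

# Air vs. helium densities near 1 bar and room temperature (kg/m^3);
# break area and height are illustrative.
gp = reduced_gravity(rho_heavy=1.2, rho_light=0.17)
Q = exchange_flow(area=0.5, height=1.0, g_prime=gp)  # m^3/s
print(round(gp, 2), round(Q, 3))
```

    Even with a modest coefficient, this convective exchange rate dwarfs molecular diffusion across the same opening, which is the point the paper makes with CFD.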

  10. Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

    SciTech Connect

    Madni, I.K.; Eltawila, F.

    1994-01-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called "MELCOR Verification, Benchmarking, and Applications," whose aim is to provide independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs, by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR.

  11. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    NASA Astrophysics Data System (ADS)

    Reyes, S.; Latkowski, J. F.; Gomez del Rio, J.; Sanz, J.

    2001-05-01

    Previous studies of the safety and environmental aspects of the HYLIFE-II inertial fusion energy power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work, computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) have been applied for simulating accident conditions in a simple model of the HYLIFE-II IFE design. Here we consider a severe loss of coolant accident (LOCA) in conjunction with simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the confinement) and of the two barriers surrounding the chamber (inner shielding and confinement building itself). Even though confinement failure would be a very unlikely event it would be needed in order to produce significant off-site doses. CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics and fusion product transport and release. The results of these calculations show that the estimated off-site dose is less than 5 mSv (0.5 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.
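
    The long-term temperature transients that CHEMCON computes are driven by decay heat balanced against passive heat losses. A lumped-capacitance sketch of that balance, using a generic Way-Wigner-style decay-heat correlation and invented plant parameters (not HYLIFE-II values):

```python
import math

def decay_heat(p0, t):
    """Way-Wigner-style decay-heat approximation (W), with p0 the
    operating power (W) and t seconds after shutdown; a generic
    correlation, roughly valid for t > 10 s."""
    return 0.066 * p0 * t ** -0.2

def temperature_transient(p0, mass_cp, hA, T0, t_end, dt=10.0):
    """Euler integration of a lumped loss-of-cooling transient:
    C dT/dt = P_decay(t) - hA * (T - T_ambient)."""
    T, T_amb, t = T0, T0, 10.0
    while t < t_end:
        P = decay_heat(p0, t)
        T += dt * (P - hA * (T - T_amb)) / mass_cp
        t += dt
    return T

# Illustrative numbers only (not HYLIFE-II design values): 1 MW of
# operating power, 5e5 J/K structure heat capacity, 50 W/K losses.
T_final = temperature_transient(p0=1e6, mass_cp=5e5, hA=50.0,
                                T0=300.0, t_end=3600.0)
print(round(T_final, 1))
```

    Real codes track many coupled structures and radiative exchange; the single-node balance above only shows why decay heat alone sets the long-term temperature scale.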

  12. Accident consequences analysis of the HYLIFE-II inertial fusion energy power plant design

    SciTech Connect

    Reyes, S; Gomez del Rio, J; Sanz, J

    2000-02-23

    Previous studies of the safety and environmental (S and E) aspects of the HYLIFE-II inertial fusion energy (IFE) power plant design have used simplistic assumptions in order to estimate radioactivity releases under accident conditions. Conservatisms associated with these traditional analyses can mask the actual behavior of the plant and have revealed the need for more accurate modeling and analysis of accident conditions and radioactivity mobilization mechanisms. In the present work, a set of computer codes traditionally used for magnetic fusion safety analyses (CHEMCON, MELCOR) has been applied to simulate accident conditions in a simple model of the HYLIFE-II IFE design. Here the authors consider a severe loss of coolant accident (LOCA) producing simultaneous failures of the beam tubes (providing a pathway for radioactivity release from the vacuum vessel towards the containment) and of the two barriers surrounding the chamber (inner shielding and the containment building itself). Even though containment failure would be a very unlikely event, it would be needed in order to produce significant off-site doses. The CHEMCON code allows calculation of long-term temperature transients in fusion reactor first wall, blanket, and shield structures resulting from decay heating. MELCOR is used to simulate a wide range of physical phenomena including thermal-hydraulics, heat transfer, aerosol physics, and fusion product release and transport. The results of these calculations show that the estimated off-site dose is less than 6 mSv (0.6 rem), which is well below the value of 10 mSv (1 rem) given by the DOE Fusion Safety Standards for protection of the public from exposure to radiation during off-normal conditions.

  13. Light-Weight Radioisotope Heater Unit final safety analysis report (LWRHU-FSAR): Volume 2: Accident Model Document (AMD)

    SciTech Connect

    Johnson, E.W.

    1988-10-01

    The purposes of this volume of the LWRHU FSAR, the Accident Model Document (AMD), are to: identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  14. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    SciTech Connect

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  15. Analysis of National Major Work Safety Accidents in China, 2003–2012

    PubMed Central

    YE, Yunfeng; ZHANG, Siheng; RAO, Jiaming; WANG, Haiqing; LI, Yang; WANG, Shengyong; DONG, Xiaomei

    2016-01-01

    Background: This study provides a national profile of major work safety accidents in China, which cause more than 10 fatalities per accident, intended to provide a scientific basis for prevention measures and strategies to reduce major work safety accidents and deaths. Methods: Data from the 2003–2012 census of major work safety accidents were collected from the State Administration of Work Safety System (SAWS). Published literature and statistical yearbooks were also included to supplement the information. We analyzed the frequency of accidents and deaths, trends, geographic distribution and injury types. Additionally, we discussed the severity and urgency of emergency rescue by type of accident. Results: A total of 877 major work safety accidents were reported, resulting in 16,795 deaths and 9,183 injuries. The numbers of accidents and deaths, mortality rate and incidence of major accidents have declined in recent years. The mortality rate and incidence were 0.71 and 1.20 per 10^6 population in 2012, respectively. Transportation and mining contributed the highest numbers of major accidents and deaths. Major aviation and railway accidents caused more casualties per incident, while collapse, machinery, electrical shock and tailing dam accidents were the most severe situations, resulting in a larger proportion of deaths. Conclusion: Ten years of major work safety accident data indicate that the frequency of accidents and the number of deaths declined, though several safety concerns persist in some sectors. PMID:27057515

  16. Radiation protection: an analysis of thyroid blocking. [Effectiveness of KI in reducing radioactive uptake following potential reactor accident

    SciTech Connect

    Aldrich, D C; Blond, R M

    1980-01-01

    An analysis was performed to provide guidance to policymakers concerning the effectiveness of potassium iodide (KI) as a thyroid blocking agent in potential reactor accident situations, the distance to which (or area within which) it should be distributed, and its relative effectiveness compared to other available protective measures. The analysis was performed using the Reactor Safety Study (WASH-1400) consequence model. Four categories of accidents were addressed: gap activity release accident (GAP), GAP without containment isolation, core melt with a melt-through release, and core melt with an atmospheric release. Cost-benefit ratios (US $/thyroid nodule prevented) are given assuming that no other protective measures are taken. Uncertainties due to health effects parameters, accident probabilities, and costs are assessed. The effects of other potential protective measures, such as evacuation and sheltering, and the impact on children (critical population) are evaluated. Finally, risk-benefit considerations are briefly discussed.
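
    A cost-benefit ratio of the kind reported above (US$ per thyroid nodule prevented) is distribution cost divided by the expected number of nodules averted. A sketch with invented inputs (not WASH-1400 values):

```python
def cost_per_nodule_prevented(pop, cost_per_person, p_accident,
                              nodules_per_person, ki_effectiveness):
    """US$ spent per thyroid nodule prevented by predistributing KI.
    All inputs are illustrative assumptions, not WASH-1400 values."""
    cost = pop * cost_per_person
    prevented = pop * p_accident * nodules_per_person * ki_effectiveness
    return cost / prevented

# 100,000 people, $1/person distribution, 1e-4/yr accident probability,
# 0.01 expected nodules per exposed person, 90% KI effectiveness.
ratio = cost_per_nodule_prevented(pop=100000, cost_per_person=1.0,
                                  p_accident=1e-4, nodules_per_person=0.01,
                                  ki_effectiveness=0.9)
print(f"US${ratio:,.0f} per nodule prevented")
```

    The ratio is insensitive to population size (it cancels) and is driven mainly by the accident probability and KI effectiveness, which is why the abstract stresses the uncertainty in those parameters.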

  17. Emerging techniques for ultrasensitive protein analysis.

    PubMed

    Yang, Xiaolong; Tang, Yanan; Alt, Ryan R; Xie, Xiaoyu; Li, Feng

    2016-06-21

    Many important biomarkers for devastating diseases and biochemical processes are proteins present at ultralow levels. Traditional techniques, such as enzyme-linked immunosorbent assays (ELISA), mass spectrometry, and protein microarrays, are often not sensitive enough to detect proteins with concentrations below the picomolar level, thus requiring the development of analytical techniques with ultrahigh sensitivities. In this review, we highlight the recent advances in developing novel techniques, sensors, and assays for ultrasensitive protein analysis. Particular attention will be focused on three classes of signal generation and/or amplification mechanisms, including the uses of nanomaterials, nucleic acids, and digital platforms. PMID:26898911

  18. Light-Weight Radioisotope Heater Unit Safety Analysis Report (LWRHU-SAR). Volume II. Accident model document

    SciTech Connect

    Johnson, E.W.

    1985-10-01

    Purposes of this volume (AMD), are to: Identify all malfunctions, both singular and multiple, which can occur during the complete mission profile that could lead to release outside the clad of the radioisotopic material contained therein; provide estimates of occurrence probabilities associated with these various accidents; evaluate the response of the LWRHU (or its components) to the resultant accident environments; and associate the potential event history with test data or analysis to determine the potential interaction of the released radionuclides with the biosphere.

  19. A review of sensitivity analysis techniques

    SciTech Connect

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of sensitivity techniques utilizes partial differentiation, whereas the simplest approach requires varying parameter values one at a time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
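
    Two of the techniques surveyed above, one-at-a-time perturbation and correlation analysis, can be contrasted on a toy model. The model and parameter ranges below are invented for illustration:

```python
import random

random.seed(0)

def model(x1, x2, x3):
    """Toy model standing in for a complex simulation code."""
    return 3.0 * x1 + 0.5 * x2 ** 2 + 0.1 * x3

base = {"x1": 1.0, "x2": 1.0, "x3": 1.0}

# One-at-a-time: perturb each parameter by +10% and record the change.
oat = {}
for name in base:
    args = dict(base)
    args[name] *= 1.1
    oat[name] = model(**args) - model(**base)

# Correlation analysis: sample all inputs jointly, then correlate each
# input with the model output.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

samples = [{k: random.uniform(0.5, 1.5) for k in base} for _ in range(500)]
outputs = [model(**s) for s in samples]
sens = {name: pearson([s[name] for s in samples], outputs) for name in base}
print(oat, sens)
```

    Both methods rank x1 as dominant here; the correlation measure additionally accounts for simultaneous variation of all inputs, which one-at-a-time perturbation cannot.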

  20. Hypothetical accident condition thermal analysis and testing of a Type B drum package

    SciTech Connect

    Hensel, S.J.; Alstine, M.N. Van; Gromada, R.J.

    1995-07-01

    A thermophysical property model developed to analytically determine the thermal response of cane fiberboard when exposed to temperatures and heat fluxes associated with the 10 CFR 71 hypothetical accident condition (HAC) has been benchmarked against two Type B drum package fire test results. The model 9973 package was fire tested after a 30 ft. top down drop and puncture, and an undamaged model 9975 package containing a heater (21W) was fire tested to determine content heat source effects. Analysis results using a refined version of a previously developed HAC fiberboard model compared well against the test data from both the 9973 and 9975 packages.

  1. Accident sequence analysis for a BWR (Boiling Water Reactor) during low power and shutdown operations

    SciTech Connect

    Whitehead, D.W.; Hake, T.M.

    1990-01-01

    Most previous Probabilistic Risk Assessments have excluded consideration of accidents initiated in low power and shutdown modes of operation. A study of the risk associated with operation in low power and shutdown is being performed at Sandia National Laboratories for a US Boiling Water Reactor (BWR). This paper describes the proposed methodology for the analysis of the risk associated with the operation of a BWR during low power and shutdown modes and presents preliminary information resulting from the application of the methodology. 2 refs., 2 tabs.

  2. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    SciTech Connect

    Pasichnyk, I.; Perin, Y.; Velkov, K.

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)
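
    Sampling-based tools in the SUSA tradition commonly size their run sets with Wilks' formula: with n random code runs, the largest result bounds the p-quantile of the output with confidence 1 - p^n. A sketch (first-order, one-sided):

```python
def wilks_sample_size(coverage, confidence):
    """Smallest n such that the maximum of n random runs bounds the
    `coverage` quantile with at least `confidence` (one-sided,
    first-order Wilks): require 1 - coverage**n >= confidence."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

# Classic result used in GRS-style uncertainty analyses: 59 code runs
# give a one-sided 95%/95% tolerance limit.
print(wilks_sample_size(0.95, 0.95))  # → 59
```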

  3. Analysis of Maximum Reasonably Foreseeable Accidents for the Yucca Mountain Draft Environmental Impact Statement (DEIS)

    SciTech Connect

    S.B. Ross; R.E. Best; S.J. Maheras; T.I. McSweeney

    2001-08-17

    Accidents could occur during the transportation of spent nuclear fuel and high-level radioactive waste. This paper describes the risks and consequences to the public from accidents that are highly unlikely but that could have severe consequences. The impacts of these accidents would include those to a collective population and to hypothetical maximally exposed individuals (MEIs). This document discusses accidents with conditions that have a chance of occurring more often than 1 in 10 million times in a year, called "maximum reasonably foreseeable accidents". Accidents and conditions less likely than this are not considered to be reasonably foreseeable.

  4. A systemic approach to accident analysis: a case study of the Stockwell shooting.

    PubMed

    Jenkins, Daniel P; Salmon, Paul M; Stanton, Neville A; Walker, Guy H

    2010-01-01

    This paper uses a systemic approach to accident investigation, based upon AcciMaps, to model the events leading up to the shooting of Jean Charles de Menezes at Stockwell Underground station in July 2005. The model captures many of the findings of the Independent Police Complaints Commission's report in a single representation, modelling their interdependencies and the causal flow. Furthermore, by taking a systemic approach, the analysis identifies further considerations related to the suitability of the Metropolitan Police Service's organisational structure to support rapid-paced operations, where reliable identification of a suspect is not possible. Based upon the analysis, the paper questions the division of functions between teams and the suitability of an organisational structure that relies upon the complex flow of information between separate teams for surveillance and for controlling the suspect. A dynamic organisational structure is proposed that changes in response to operation type and unfolding events. STATEMENT OF RELEVANCE: This paper provides much-needed and called-for validation of a systemic approach to accident analysis. A widely reported case study is used to illustrate the process. The paper shows how such an approach can consolidate the key findings of much larger reports as well as draw out additional recommendations. PMID:20069477

  5. Bicycle accidents often cause disability--an analysis of medical and social consequences of nonfatal bicycle accidents.

    PubMed

    Olkkonen, S; Lahdenranta, U; Slätis, P; Honkanen, R

    1993-06-01

    The social and medical consequences of bicycle accidents were analyzed for 278 children and 264 adults injured and seen in two hospitals in Helsinki in 1985-86. Information was collected from patient records, by means of a special questionnaire and by telephone interview. A child outpatient required 1.7 and a child inpatient 3.0 physician visits on average, while adults required 2.2 and 4.9 visits, respectively. The average duration of hospital stay was 8 days for hospitalized adults and 6 days for children. Rehabilitative care outside the hospital was received by 6% of the adult outpatients and 25% of the inpatients, but none of the injured children. The mean duration of work disability was 82 days among inpatients, 11 days among outpatients, 127 days among inpatients injured in motor vehicle collisions and 65 days among inpatients injured in other bicycle accidents. Of inpatients 32% and of outpatients 5% reported persistent (> 6 months) disability. Persistent disability was recorded in 11% of children, 47% of adults and 67% of elderly inpatients. The most serious consequences were due to intracranial injuries in motor vehicle-bicycle collisions. Of the hospitalized bicyclists, 4% suffered from severe cognitive and behavioural changes or sense impairment, and 3% of adult inpatients suffered from permanent work disability. The average costs of health and social services were about FIM 1000 per adult outpatient and FIM 13000 per adult inpatient. In prevention, high priority should be given to motor vehicle collisions, head injuries and injuries among elderly bicyclists. PMID:8367689

  6. Gold analysis by the gamma absorption technique.

    PubMed

    Kurtoglu, Arzu; Tugrul, A Beril

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement. PMID:12485656
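    The measurement principle behind this abstract is the Beer-Lambert attenuation law, I = I0·exp(−μm·ρ·t), with the mass attenuation coefficient μm changing sharply at the Au K-shell absorption edge. As an illustration only, a minimal sketch of the calibration-curve step is shown below; the reference-alloy numbers are invented for the example, not taken from the paper:

```python
import numpy as np

# Beer-Lambert: I = I0 * exp(-mu_m * rho * t), so the mass attenuation
# coefficient is recovered from a transmission measurement as
#   mu_m = -ln(I / I0) / (rho * t)
def mass_attenuation(I, I0, rho, t):
    return -np.log(I / I0) / (rho * t)

# Hypothetical calibration data: gold mass fractions of reference alloys
# and attenuation coefficients measured near the Au K-edge (cm^2/g).
gold_fraction = np.array([0.375, 0.585, 0.750, 0.916, 1.000])
mu_measured = np.array([2.91, 3.64, 4.22, 4.80, 5.09])

# Linear calibration curve: mu = a * fraction + b
a, b = np.polyfit(gold_fraction, mu_measured, 1)

def gold_content(mu_unknown):
    """Invert the calibration curve to estimate the gold mass fraction."""
    return (mu_unknown - b) / a
```

    An unknown sample's measured attenuation coefficient is then simply read back through the fitted line, which is what makes the method non-destructive and quick.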

  7. Techniques for the Analysis of Human Movement.

    ERIC Educational Resources Information Center

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photographic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  8. NASA Structural Analysis Report on the American Airlines Flight 587 Accident - Local Analysis of the Right Rear Lug

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Glaessgen, Edward H.; Mason, Brian H.; Krishnamurthy, Thiagarajan; Davila, Carlos G.

    2005-01-01

    A detailed finite element analysis of the right rear lug of the American Airlines Flight 587 - Airbus A300-600R was performed as part of the National Transportation Safety Board's failure investigation of the accident that occurred on November 12, 2001. The loads experienced by the right rear lug are evaluated using global models of the vertical tail, local models near the right rear lug, and a global-local analysis procedure. The right rear lug was analyzed using two modeling approaches. In the first approach, solid-shell type modeling is used, and in the second approach, layered-shell type modeling is used. The solid-shell and the layered-shell modeling approaches were used in progressive failure analyses (PFA) to determine the load, mode, and location of failure in the right rear lug under loading representative of an Airbus certification test conducted in 1985 (the 1985-certification test). Both analyses were in excellent agreement with each other on the predicted failure loads, failure mode, and location of failure. The solid-shell type modeling was then used to analyze both a subcomponent test conducted by Airbus in 2003 (the 2003-subcomponent test) and the accident condition. Excellent agreement was observed between the analyses and the observed failures in both cases. From the analyses conducted and presented in this paper, the following conclusions were drawn. The moment, Mx (moment about the fuselage longitudinal axis), has a significant effect on the failure load of the lugs. Higher absolute values of Mx give lower failure loads. The predicted load, mode, and location of the failure of the 1985-certification test, 2003-subcomponent test, and the accident condition are in very good agreement. This agreement suggests that the 1985-certification and 2003-subcomponent tests represent the accident condition accurately.
The failure mode of the right rear lug for the 1985-certification test, 2003-subcomponent test, and the accident load case is identified as a

  9. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on one-dimensional injury variables, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), considered one at a time in relation to various traffic accident factors, and such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as the injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, air bag deployment, and higher speed are associated with more severe than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. PMID:21094332
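    Quantification method II (Hayashi) assigns numerical scores to the categories of qualitative predictors so that the scores best discriminate the outcome classes. As a rough illustration only, not the authors' procedure (which uses an eigenvalue formulation), one can one-hot encode the categorical accident factors and fit least-squares scores against a binary severity outcome; all data below are hypothetical:

```python
import numpy as np

# Hypothetical accident records: (impact type, airbag deployed) -> severe injury?
impact = ["narrow", "wide", "narrow", "wide", "narrow", "wide"]
airbag = ["yes", "no", "yes", "yes", "no", "no"]
severe = np.array([1, 0, 1, 0, 1, 0], dtype=float)

def one_hot(values):
    """Indicator matrix with one column per category (sorted for determinism)."""
    cats = sorted(set(values))
    X = np.zeros((len(values), len(cats)))
    for i, v in enumerate(values):
        X[i, cats.index(v)] = 1.0
    return X, cats

X = np.hstack([one_hot(impact)[0], one_hot(airbag)[0]])
X = np.hstack([np.ones((len(severe), 1)), X])  # intercept column

# Least-squares category scores: a simplified stand-in for the
# eigen-decomposition used in quantification method II proper.
coef, *_ = np.linalg.lstsq(X, severe, rcond=None)
scores = X @ coef  # higher score -> predicted more severe injury
```

    In this toy data the outcome is fully determined by impact type, so the fitted scores separate the "narrow" and "wide" records perfectly; with real CDS data the coefficients would instead rank how strongly each category is associated with severe injury.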

  10. Development of integrated core disruptive accident analysis code for FBR - ASTERIA-FBR

    SciTech Connect

    Ishizu, T.; Endo, H.; Tatewaki, I.; Yamamoto, T.; Shirakawa, N.

    2012-07-01

    The evaluation of the consequences of a severe accident is the most important safety licensing issue for the reactor core of a liquid metal cooled fast breeder reactor (LMFBR), since the LMFBR core is not in its most reactive configuration. This characteristic means that a core geometry change during a core disruptive accident (CDA) might induce a super-prompt criticality. Previous CDA analysis codes have been modeled in plural phases depending on the mechanism driving a super-prompt criticality, with the subsequent events calculated by connecting different codes. This scheme, however, introduces uncertainty and/or arbitrariness into the calculation results. To resolve these issues and obtain consistent calculation results without arbitrariness, JNES is developing the ASTERIA-FBR code for the purpose of providing the cross-check analysis code, which is another required scheme to confirm the validity of the evaluation results prepared by applicants, in the safety licensing procedure of the planned high performance core of Monju. ASTERIA-FBR consists of three major calculation modules: CONCORD, dynamic-GMVP, and FEMAXI-FBR. CONCORD is a three-dimensional thermal-hydraulics calculation module with a multi-phase, multi-component, and multi-velocity field model. Dynamic-GMVP is a space-time neutronics calculation module. FEMAXI-FBR calculates the fuel pellet deformation behavior and fuel pin failure behavior. This paper describes the needs of ASTERIA-FBR development, major module outlines, and the model validation status. (authors)

  11. Photogrammetric Techniques for Road Surface Analysis

    NASA Astrophysics Data System (ADS)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of road texture characteristics and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  12. Analysis of Occupational Accident Fatalities and Injuries Among Male Group in Iran Between 2008 and 2012

    PubMed Central

    Alizadeh, Seyed Shamseddin; Mortazavi, Seyed Bagher; Sepehri, Mohammad Mehdi

    2015-01-01

    Background: Because of occupational accidents, permanent disabilities and deaths occur and economic and workday losses emerge. Objectives: The purpose of the present study was to investigate the factors responsible for occupational accidents that occurred in Iran. Patients and Methods: The current study analyzed 1464 occupational accidents recorded by the Ministry of Labor and Social Affairs’ offices in Iran during 2008 - 2012. At first, a general understanding of the accidents was obtained using descriptive statistics. Afterwards, the chi-square test and Cramer’s V statistic (Vc) were used to determine the association between factors influencing the type of injury as an occupational accident outcome. Results: There was no significant association of marital status or time of day with the type of injury. However, activity sector, cause of accident, victim’s education, age of victim and victim’s experience were significantly associated with the type of injury. Conclusions: Successful accident prevention relies largely on knowledge about the causes of accidents. In any accident control activity, particularly for occupational accidents, correctly identifying high-risk groups and the factors influencing accidents is the key to successful interventions. The results of this study can help increase accident awareness and enable workplace management to select and prioritize problem areas and safety system weaknesses in workplaces. PMID:26568848
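    The chi-square test and Cramér's V used in studies like this are straightforward to reproduce with standard tools. A minimal sketch with an invented contingency table (not the study's data) is shown below; Cramér's V is computed as √(χ²/(n·(min(r,c)−1))):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: activity sector (rows) vs. injury type (cols).
table = np.array([
    [30, 10,  5],   # e.g. construction
    [12, 25,  8],   # e.g. manufacturing
    [ 6,  9, 20],   # e.g. services
])

# Pearson chi-square test of independence between the two factors.
chi2, p, dof, expected = chi2_contingency(table)

# Cramer's V: strength of the association on a 0..1 scale,
# V = sqrt(chi2 / (n * (min(rows, cols) - 1))).
n = table.sum()
v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
```

    A small p-value indicates that injury type is not independent of the row factor, while V indicates how strong that association is; this mirrors how the study separates significant factors (e.g. activity sector) from non-significant ones (e.g. marital status).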

  13. A Content Analysis of News Media Coverage of the Accident at Three Mile Island.

    ERIC Educational Resources Information Center

    Stephens, Mitchell; Edison, Nadyne G.

    A study was conducted for the President's Commission on the Accident at Three Mile Island to analyze coverage of the accident by ten news organizations: two wire services, three commercial television networks, and five daily newspapers. Copies of all stories and transcripts of news programs during the first week of the accident were examined from…

  14. Aerosol particle analysis by Raman scattering technique

    SciTech Connect

    Fung, K.H.; Tang, I.N.

    1992-10-01

    Laser Raman spectroscopy is a very versatile tool for chemical characterization of micron-sized particles. Such particles are abundant in nature, and in numerous energy-related processes. In order to elucidate the formation mechanisms and understand the subsequent chemical transformation under a variety of reaction conditions, it is imperative to develop analytical measurement techniques for in situ monitoring of these suspended particles. In this report, we outline our recent work on spontaneous Raman, resonance Raman and non-linear Raman scattering as a novel technique for chemical analysis of aerosol particles as well as supersaturated solution droplets.

  15. The Gulf of Mexico oil rig accident: analysis by different SAR satellite images

    NASA Astrophysics Data System (ADS)

    Del Frate, Fabio; Giacomini, Andrea; Latini, Daniele; Solimini, Domenico; Emery, William J.

    2011-11-01

    Monitoring oil spills over the sea surface is an important and ongoing task for international environmental agencies, given the continuous risk posed by possible accidents involving either rigs or tankers. At the same time, the growing number of remote sensing space missions can significantly improve our capabilities in this kind of activity. In this paper we consider the dramatic Gulf of Mexico oil spill event of 2010 to investigate the types of information that could be provided by the available collection of SAR images, which included different polarizations and bands. With an eye to the implementation of fully automatic processing chains, an assessment of a novel segmentation technique based on PCNN (Pulse Coupled Neural Networks) was also carried out.
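    For readers unfamiliar with PCNN segmentation: in a pulse-coupled neural network, each pixel is a neuron whose firing threshold decays over time, so pixels of similar intensity pulse in the same iteration and the first-firing map acts as a coarse segmentation. The following is a deliberately simplified sketch (not the authors' implementation; all parameters are hypothetical):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def pcnn_segment(img, steps=20, beta=0.2, v_e=1.0, a_e=0.3):
    """Simplified PCNN: returns the iteration at which each pixel first fires.

    Pixels of similar intensity fire together, so the first-firing map can be
    thresholded into regions (e.g. slick vs. open water in a SAR amplitude map).
    """
    F = img.astype(float)            # feeding input = pixel intensity
    E = np.full_like(F, v_e)         # dynamic firing threshold
    Y = np.zeros_like(F)             # pulse output of the previous iteration
    fired_at = np.full(F.shape, -1)  # first-firing iteration per pixel
    for n in range(steps):
        L = uniform_filter(Y, size=3)        # linking input from 3x3 neighbours
        U = F * (1.0 + beta * L)             # internal activity
        Y = (U > E).astype(float)            # neurons pulse when activity > threshold
        fired_at[(Y > 0) & (fired_at < 0)] = n
        E = np.exp(-a_e) * E + v_e * Y       # decay threshold; raise it for fired cells
    return fired_at

# Example: a bright square on a dim background fires in earlier iterations
# than the background, separating the two regions.
img = np.full((16, 16), 0.2)   # dim background
img[4:12, 4:12] = 0.9          # bright region
fired = pcnn_segment(img)
```

    The refractory term (raising E after a pulse) is what makes regions fire in waves rather than continuously; full PCNN formulations add decaying feeding/linking memories as well.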

  16. Effects of improved modeling on best estimate BWR severe accident analysis

    SciTech Connect

    Hyman, C.R.; Ott, L.J.

    1984-01-01

    Since 1981, ORNL has completed best estimate studies analyzing several dominant BWR accident scenarios. These scenarios were identified by early Probabilistic Risk Assessment (PRA) studies and detailed ORNL analysis complements such studies. In performing these studies, ORNL has used the MARCH code extensively. ORNL investigators have identified several deficiencies in early versions of MARCH with regard to BWR modeling. Some of these deficiencies appear to have been remedied by the most recent release of the code. It is the purpose of this paper to identify several of these deficiencies. All the information presented concerns the degraded core thermal/hydraulic analysis associated with each of the ORNL studies. This includes calculations of the containment response. The period of interest is from the time of permanent core uncovery to the end of the transient. Specific objectives include the determination of the extent of core damage and timing of major events (i.e., onset of Zr/H2O reaction, initial clad/fuel melting, loss of control blade structure, etc.). As mentioned previously the major analysis tool used thus far was derived from an early version of MARCH. BWRs have unique features which must be modeled for best estimate severe accident analysis. ORNL has developed and incorporated into its version of MARCH several improved models. These include (1) channel boxes and control blades, (2) SRV actuations, (3) vessel water level, (4) multi-node analysis of in-vessel water inventory, (5) comprehensive hydrogen and water properties package, (6) first order correction to the ideal gas law, and (7) separation of fuel and cladding. Ongoing and future modeling efforts are required. These include (1) detailed modeling for the pressure suppression pool, (2) incorporation of B4C/steam reaction models, (3) phenomenological model of corium mass transport, and (4) advanced corium/concrete interaction modeling. 10 references, 17 figures, 1 table.

  17. Narrative text analysis of accident reports with tractors, self-propelled harvesting machinery and materials handling machinery in Austrian agriculture from 2008 to 2010 - a comparison.

    PubMed

    Mayrhofer, Hannes; Quendler, Elisabeth; Boxberger, Josef

    2014-01-01

    The aim of this study was the identification of accident scenarios and causes by analysing existing accident reports of recognized agricultural occupational accidents with tractors, self-propelled harvesting machinery and materials handling machinery from 2008 to 2010. As a result of a literature-based evaluation of past accident analyses, narrative text analysis was chosen as an appropriate method. A narrative analysis of the text fields of accident reports that farmers used to report accidents to insurers was conducted to obtain detailed information about the scenarios and causes of accidents. This was the first time such a narrative analysis of reports had been conducted, and it yielded initial insights into the antecedents of accidents and potential opportunities for technology-based intervention. A literature and internet search was done to discuss and confirm the findings. The narrative text analysis showed that in more than one third of the accidents with tractors and materials handling machinery the vehicle rolled or tipped over. The most relevant accident scenarios with harvesting machinery were being trapped and falling down. The direct comparison of the analysed machinery categories showed that more than 10% of the accidents in each category were caused by technical faults, slippery or muddy terrain and incorrect or inappropriate operation of the vehicle. Accidents with tractors, harvesting machinery and materials handling machinery showed similarities in terms of causes, circumstances and consequences. Certain technical and communicative measures for accident prevention could be used for all three machinery categories. Nevertheless, some individual solutions for accident prevention, which suit each specific machine type, would be necessary. PMID:24738521

  18. Analysis of Kuosheng Large-Break Loss-of-Coolant Accident with MELCOR 1.8.4

    SciTech Connect

    Wang, T.-C.; Wang, S.-J.; Chien, C.-S.

    2000-09-15

    The MELCOR code, developed by Sandia National Laboratories, is capable of simulating the severe accident phenomena of light water reactor nuclear power plants (NPPs). A specific large-break loss-of-coolant accident (LOCA) for Kuosheng NPP is simulated with the use of the MELCOR 1.8.4 code. This accident is induced by a double-ended guillotine break of one of the recirculation pipes concurrent with complete failure of the emergency core cooling system. The MELCOR input deck for the Kuosheng NPP is established based on the design data of the Kuosheng NPP and the MELCOR users' guides. The initial steady-state conditions are generated with a developed self-initialization algorithm. The effect of the MELCOR 1.8.4-provided initialization process is demonstrated. The main severe accident phenomena and the corresponding fission product released fractions associated with the large-break LOCA sequences are simulated. The MELCOR 1.8.4 predicts a longer time interval between the core collapse and vessel failure and a higher source term. This MELCOR 1.8.4 input deck will be applied to the probabilistic risk assessment, the severe accident analysis, and the severe accident management study of the Kuosheng NPP in the near future.

  19. Electrical equipment performance under severe accident conditions (BWR/Mark 1 plant analysis): Summary report

    SciTech Connect

    Bennett, P.R.; Kolaczkowski, A.M.; Medford, G.T.

    1986-09-01

    The purpose of the Performance Evaluation of Electrical Equipment during Severe Accident States Program is to determine the performance of electrical equipment, important to safety, under severe accident conditions. In FY85, a method was devised to identify important electrical equipment and the severe accident environments in which the equipment was likely to fail. This method was used to evaluate the equipment and severe accident environments for Browns Ferry Unit 1, a BWR/Mark I. Following this work, a test plan was written in FY86 to experimentally determine the performance of one selected component in two severe accident environments.

  20. Analysis of station blackout accidents for the Bellefonte pressurized water reactor

    SciTech Connect

    Gasser, R D; Bieniarz, P P; Tills, J L

    1986-09-01

    An analysis has been performed for the Bellefonte PWR Unit 1 to determine the containment loading and the radiological releases into the environment from a station blackout accident. A number of issues have been addressed in this analysis which include the effects of direct heating on containment loading, and the effects of fission product heating and natural convection on releases from the primary system. The results indicate that direct heating which involves more than about 50% of the core can fail the Bellefonte containment, but natural convection in the RCS may lead to overheating and failure of the primary system piping before core slump, thus, eliminating or mitigating direct heating. Releases from the primary system are significantly increased before vessel breach due to natural circulation and after vessel breach due to reevolution of retained fission products by fission product heating of RCS structures.

  1. Analysis of the SL-1 Accident Using RELAP5-3D

    SciTech Connect

    Francisco, A. D.; Tomlinson, E. T.

    2007-11-08

    On January 3, 1961, at the National Reactor Testing Station, in Idaho Falls, Idaho, the Stationary Low Power Reactor No. 1 (SL-1) experienced a major nuclear excursion, killing three people, and destroying the reactor core. The SL-1 reactor, a 3 MW(t) boiling water reactor, was shut down and undergoing routine maintenance work at the time. This paper presents an analysis of the SL-1 reactor excursion using the RELAP5-3D thermal-hydraulic and nuclear analysis code, with the intent of simulating the accident from the point of reactivity insertion to destruction and vaporization of the fuel. Results are presented, along with a discussion of sensitivity to some reactor and transient parameters (many of the details are only known with a high level of uncertainty).

  2. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  4. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  5. Learning from the Piper Alpha accident: A postmortem analysis of technical and organizational factors

    SciTech Connect

    Pate-Cornell, M.E.

    1993-04-01

    The accident that occurred on board the offshore platform Piper Alpha in July 1988 killed 167 people and cost billions of dollars in property damage. It was caused by a massive fire, which was not the result of an unpredictable 'act of God' but of an accumulation of errors and questionable decisions. Most of them were rooted in the organization, its structure, procedures, and culture. This paper analyzes the accident scenario using the risk analysis framework, determines which human decisions and actions influenced the occurrence of the basic events, and then identifies the organizational roots of these decisions and actions. These organizational factors are generalizable to other industries and engineering systems. They include flaws in the design guidelines and design practices (e.g., tight physical couplings or insufficient redundancies), misguided priorities in the management of the tradeoff between productivity and safety, mistakes in the management of the personnel on board, and errors of judgment in the process by which financial pressures are applied on the production sector (i.e., the oil companies' definition of profit centers) resulting in deficiencies in inspection and maintenance operations. This analytical approach allows identification of risk management measures that go beyond the purely technical (e.g., add redundancies to a safety system) and also include improvements of management practices. 18 refs., 4 figs.

  6. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  7. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  8. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.

  9. Accident analysis and control options in support of the sludge water system safety analysis

    SciTech Connect

    HEY, B.E.

    2003-01-16

    A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.

  10. Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.

    PubMed

    Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis

    2015-01-01

    Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method based on the concepts of task and accident mechanisms for an initial risk assessment by taking into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. By using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified by using those variables included in the European Statistics of Accidents at Work methodology. As main results, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low and the only accident mechanisms with medium risk are accidents when lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling for tasks involving movements. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended. PMID:25179119

  11. The Analysis of the Contribution of Human Factors to the In-Flight Loss of Control Accidents

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2012-01-01

    In-flight loss of control (LOC) is currently the leading cause of fatal accidents based on various commercial aircraft accident statistics. As the Next Generation Air Transportation System (NextGen) emerges, new contributing factors leading to LOC are anticipated. The NASA Aviation Safety Program (AvSP), along with other aviation agencies and communities are actively developing safety products to mitigate the LOC risk. This paper discusses the approach used to construct a generic integrated LOC accident framework (LOCAF) model based on a detailed review of LOC accidents over the past two decades. The LOCAF model is comprised of causal factors from the domain of human factors, aircraft system component failures, and atmospheric environment. The multiple interdependent causal factors are expressed in an Object-Oriented Bayesian belief network. In addition to predicting the likelihood of LOC accident occurrence, the system-level integrated LOCAF model is able to evaluate the impact of new safety technology products developed in AvSP. This provides valuable information to decision makers in strategizing NASA's aviation safety technology portfolio. The focus of this paper is on the analysis of human causal factors in the model, including the contributions from flight crew and maintenance workers. The Human Factors Analysis and Classification System (HFACS) taxonomy was used to develop human related causal factors. The preliminary results from the baseline LOCAF model are also presented.
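
A Bayesian belief network such as the LOCAF model combines the probabilities of interdependent causal factors into an overall accident likelihood. The following is a minimal sketch of that idea with invented factor names and probabilities; it is not the LOCAF model's actual structure or values:

```python
from itertools import product

# Hypothetical priors for three binary causal factors (True = factor present).
priors = {"crew_error": 0.02, "system_failure": 0.005, "adverse_weather": 0.10}

# Illustrative conditional probability of LOC given the factor states;
# the crew/weather term models an interdependence between factors.
def p_loc(crew, system, weather):
    base = 1e-6
    if crew:    base += 4e-4
    if system:  base += 2e-4
    if weather: base += 1e-4
    if crew and weather:  # adverse weather amplifies crew error
        base += 6e-4
    return base

# Marginal P(LOC) by full enumeration over all factor combinations.
p_total = 0.0
for crew, system, weather in product([False, True], repeat=3):
    w = 1.0
    w *= priors["crew_error"] if crew else 1 - priors["crew_error"]
    w *= priors["system_failure"] if system else 1 - priors["system_failure"]
    w *= priors["adverse_weather"] if weather else 1 - priors["adverse_weather"]
    p_total += w * p_loc(crew, system, weather)

print(f"Marginal P(LOC) = {p_total:.2e}")
```

Enumeration is feasible here because the toy network is tiny; real models use dedicated inference engines.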

  12. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
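
The half-max edge detection mentioned above can be illustrated on a one-dimensional contrast profile: find where the contrast crosses half its peak on each side of the anomaly and measure the distance between the crossings. The Gaussian profile shape and dimensions below are assumptions for illustration only:

```python
import numpy as np

# Synthetic contrast profile across an anomaly (Gaussian bump), positions in mm.
x = np.linspace(-10, 10, 401)
contrast = 0.8 * np.exp(-x**2 / (2 * 2.0**2))  # sigma = 2 mm

def half_max_width(x, c):
    """Width of the region where contrast exceeds half its peak,
    with linear interpolation at the two crossings."""
    half = c.max() / 2.0
    idx = np.where(c >= half)[0]
    i0, i1 = idx[0], idx[-1]
    def interp(lo, hi):
        # linear interpolation of the half-max crossing between samples
        return x[lo] + (half - c[lo]) * (x[hi] - x[lo]) / (c[hi] - c[lo])
    left = interp(i0 - 1, i0) if i0 > 0 else x[i0]
    right = interp(i1 + 1, i1) if i1 < len(x) - 1 else x[i1]
    return right - left

w = half_max_width(x, contrast)
# For a Gaussian, FWHM = 2*sqrt(2*ln 2)*sigma ≈ 2.355*sigma
print(f"half-max width ≈ {w:.2f} mm")
```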

  13. Forensic Analysis using Geological and Geochemical Techniques

    NASA Astrophysics Data System (ADS)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late 1800s, the last decade has seen a marked increase in geo-scientists initiating concept studies using the latest analytical techniques, including studying natural abundance isotope variations, micro analysis with laser ablation ICPMS and geochemical mapping. Most of the concept studies have shown good potential, but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the EGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project and other projects and literature to illustrate the important issues at hand.

  14. Modeling & analysis of criticality-induced severe accidents during refueling for the Advanced Neutron Source Reactor

    SciTech Connect

    Georgevich, V.; Kim, S.H.; Taleyarkhan, R.P.; Jackson, S.

    1992-10-01

    This paper describes work done at the Oak Ridge National Laboratory (ORNL) for evaluating the potential and resulting consequences of a hypothetical criticality accident during refueling of the 330-MW Advanced Neutron Source (ANS) research reactor. The development of an analytical capability is described. Modeling and problem formulation were conducted using concepts of reactor neutronic theory for determining power level escalation, coupled with ORIGEN and MELCOR code simulations for radionuclide buildup and containment transport. Gaussian plume transport modeling was done for determining off-site radiological consequences. Nuances associated with modeling this blast-type scenario are described. Analysis results for ANS containment response under a variety of postulated scenarios and containment failure modes are presented. It is demonstrated that individuals at the reactor site boundary will not receive doses beyond regulatory limits for any of the containment configurations studied.

  15. Chiropractic treatment of patients in motor vehicle accidents: a statistical analysis

    PubMed Central

    Dies, Stephen; Strapp, J Walter

    1992-01-01

    Motor vehicle accidents (MVA) are a major cause of spinal injuries treated by chiropractors. In this study the files of one chiropractor were reviewed retrospectively to generate a database on the MVA cases (n = 149). The effect of age, sex, vehicle damage, symptoms and concurrent physiotherapy on the dependent variables of number of treatments, improvement and requirement for ongoing treatment was computed using an analysis of variance. Overall, the average number of treatments given was 14.2. Patients who complained of headache or low back pain required more treatments than average. Improvement level was lowered by delay in seeking treatment, the presence of uncomplicated nausea and advancing age. Ongoing treatment to relieve persistent pain was required in 40.2 percent of the cases. None of the factors studied had a significant effect on this variable. The results of this study are comparable to those reported in the medical literature.
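
An analysis of variance of this kind reduces to comparing between-group and within-group variability via an F statistic. A minimal one-way ANOVA sketch with invented treatment counts (not the study's data):

```python
import numpy as np

# Hypothetical numbers of treatments, grouped by a factor
# (e.g., headache complaint vs. none); illustrative data only.
groups = [
    np.array([18, 22, 20, 25, 19], float),  # headache complaint
    np.array([10, 12, 14, 11, 13], float),  # no headache
]

def one_way_anova(groups):
    all_data = np.concatenate(groups)
    grand_mean = all_data.mean()
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_data) - len(groups)
    # F = (mean square between) / (mean square within)
    return (ss_between / df_between) / (ss_within / df_within)

f = one_way_anova(groups)
print(f"F = {f:.2f}")
```

A large F (compared against the F distribution with the stated degrees of freedom) indicates a significant group effect.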

  16. Conceptual design loss-of-coolant accident analysis for the Advanced Neutron Source reactor

    SciTech Connect

    Chen, N.C.J.; Wendel, M.W.; Yoder, G.L. Jr. )

    1994-01-01

    A RELAP5 system model for the Advanced Neutron Source Reactor has been developed for performing conceptual safety analysis report calculations. To better represent thermal-hydraulic behavior of the core, three specific changes in the RELAP5 computer code were implemented: a turbulent forced-convection heat transfer correlation, a critical heat flux (CHF) correlation, and an interfacial drag correlation. The model consists of the core region, the heat exchanger loop region, and the pressurizing/letdown system region. Results for three loss-of-coolant accident analyses are presented: (a) an instantaneous double-ended guillotine (DEG) core outlet break with a cavitating venturi installed downstream of the core, (b) a core pressure boundary tube outer wall rupture, and (c) a DEG core inlet break with a finite break-formation time. The results show that the core can survive without exceeding the flow excursion or CHF thermal limits at a 95% probability level if the proper mitigation options are provided.

  17. Multifractal analysis of the 137Cs fallout pattern in Austria resulting from the Chernobyl accident.

    PubMed

    Pausch, G; Bossew, P; Hofmann, W; Steger, F

    1998-06-01

    The cumulative deposition of the 137Cs fallout in Austria resulting from the passage of the Chernobyl cloud has been investigated by applying correlation dimension and hyperbolic frequency distribution methods. For the analysis, a total of 1,881 deposition values were used, which were collected by the Federal Environmental Agency of Austria and the Federal Ministry of Health, representing all available measurements of 137Cs in soil made in Austria after the Chernobyl accident. From these data a hyperbolic exponent for the frequency distribution of 4.0 and a set of fractal correlation dimensions, which decrease from 1.426 +/- 0.022 (for the whole network) to 0.706 +/- 0.047 (for 137Cs values > or = 100 kBq m(-2)), were derived, thus confirming that the fallout pattern can be described as a multifractal. PMID:9600299
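
The correlation dimension method used here is typically implemented with the Grassberger-Procaccia estimator: count the fraction of point pairs closer than a radius r and fit the slope of log C(r) versus log r. A sketch on synthetic uniform 2-D points (for which the expected dimension is close to 2; the fallout study obtained fractional values, indicating multifractality):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for a deposition measurement network:
# points scattered uniformly on the unit square.
pts = rng.random((800, 2))

def correlation_dimension(pts, radii):
    """Grassberger-Procaccia estimate: slope of log C(r) vs log r,
    where C(r) is the fraction of point pairs closer than r."""
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    pair_d = d[np.triu_indices(n, k=1)]
    c = np.array([(pair_d < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
    return slope

radii = np.logspace(-1.5, -0.5, 10)
d2 = correlation_dimension(pts, radii)
print(f"estimated correlation dimension D2 ≈ {d2:.2f}")
```

Edge effects pull the estimate slightly below 2 at the larger radii; in practice the fit is restricted to a scaling region.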

  18. PTSD Symptom Severity and Psychiatric Comorbidity in Recent Motor Vehicle Accident Victims: A Latent Class Analysis

    PubMed Central

    Hruska, Bryce; Irish, Leah A.; Pacella, Maria L.; Sledjeski, Eve M.; Delahanty, Douglas L.

    2014-01-01

    We conducted a latent class analysis (LCA) on 249 recent motor vehicle accident (MVA) victims to examine subgroups that differed in posttraumatic stress disorder (PTSD) symptom severity, current major depressive disorder and alcohol/other drug use disorders (MDD/AoDs), gender, and interpersonal trauma history 6-weeks post-MVA. A 4-class model best fit the data with a resilient class displaying asymptomatic PTSD symptom levels/low levels of comorbid disorders; a mild psychopathology class displaying mild PTSD symptom severity and current MDD; a moderate psychopathology class displaying severe PTSD symptom severity and current MDD/AoDs; and a severe psychopathology class displaying extreme PTSD symptom severity and current MDD. Classes also differed with respect to gender composition and history of interpersonal trauma experience. These findings may aid in the development of targeted interventions for recent MVA victims through the identification of subgroups distinguished by different patterns of psychiatric problems experienced 6-weeks post-MVA. PMID:25124501
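
A latent class analysis over binary indicators is commonly fit by expectation-maximization on a mixture of independent Bernoulli items. The following sketch uses synthetic symptom data (two classes for brevity, where the study found four; none of these numbers are the study's data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic binary indicators (e.g., symptom present/absent) for 249 cases
# drawn from two hypothetical latent classes; illustrative only.
n_items = 5
true_p = np.array([[0.9, 0.8, 0.7, 0.9, 0.6],   # "severe" class
                   [0.1, 0.2, 0.1, 0.2, 0.1]])  # "resilient" class
z = rng.random(249) < 0.3                        # 30% in the severe class
X = (rng.random((249, n_items)) < true_p[np.where(z, 0, 1)]).astype(float)

def lca_em(X, k=2, iters=200):
    """EM for a mixture of independent Bernoulli items (basic LCA)."""
    n, m = X.shape
    pi = np.full(k, 1.0 / k)                 # class proportions
    p = rng.uniform(0.25, 0.75, size=(k, m))  # item endorsement probabilities
    for _ in range(iters):
        # E-step: posterior class responsibilities for each case
        logp = (X[:, None, :] * np.log(p) +
                (1 - X[:, None, :]) * np.log(1 - p)).sum(-1) + np.log(pi)
        logp -= logp.max(1, keepdims=True)
        resp = np.exp(logp)
        resp /= resp.sum(1, keepdims=True)
        # M-step: update class proportions and item probabilities
        pi = resp.mean(0)
        p = (resp.T @ X) / resp.sum(0)[:, None]
        p = np.clip(p, 1e-6, 1 - 1e-6)
    return pi, p

pi, p = lca_em(X)
print("estimated class proportions:", np.round(np.sort(pi), 2))
```

In applied work the number of classes is chosen by refitting with k = 1, 2, 3, ... and comparing fit indices such as BIC, which is how a "4-class model best fit the data" conclusion is reached.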

  19. PTSD symptom severity and psychiatric comorbidity in recent motor vehicle accident victims: a latent class analysis.

    PubMed

    Hruska, Bryce; Irish, Leah A; Pacella, Maria L; Sledjeski, Eve M; Delahanty, Douglas L

    2014-10-01

    We conducted a latent class analysis (LCA) on 249 recent motor vehicle accident (MVA) victims to examine subgroups that differed in posttraumatic stress disorder (PTSD) symptom severity, current major depressive disorder and alcohol/other drug use disorders (MDD/AoDs), gender, and interpersonal trauma history 6-weeks post-MVA. A 4-class model best fit the data with a resilient class displaying asymptomatic PTSD symptom levels/low levels of comorbid disorders; a mild psychopathology class displaying mild PTSD symptom severity and current MDD; a moderate psychopathology class displaying severe PTSD symptom severity and current MDD/AoDs; and a severe psychopathology class displaying extreme PTSD symptom severity and current MDD. Classes also differed with respect to gender composition and history of interpersonal trauma experience. These findings may aid in the development of targeted interventions for recent MVA victims through the identification of subgroups distinguished by different patterns of psychiatric problems experienced 6-weeks post-MVA. PMID:25124501

  20. Visualization of Traffic Accidents

    NASA Technical Reports Server (NTRS)

    Wang, Jie; Shen, Yuzhong; Khattak, Asad

    2010-01-01

    Traffic accidents have tremendous impact on society. Annually, approximately 6.4 million vehicle accidents are reported by police in the US and nearly half of them result in catastrophic injuries. Visualizations of traffic accidents using geographic information systems (GIS) greatly facilitate handling and analysis of traffic accidents in many aspects. Environmental Systems Research Institute (ESRI), Inc. is the world leader in GIS research and development. ArcGIS, a software package developed by ESRI, has the capabilities to display events associated with a road network, such as accident locations, and pavement quality. But when event locations related to a road network are processed, the existing algorithm used by ArcGIS does not utilize all the information related to the routes of the road network and produces erroneous visualization results of event locations. This software bug causes serious problems for applications in which accurate location information is critical for emergency responses, such as traffic accidents. This paper aims to address this problem and proposes an improved method that utilizes all relevant information of traffic accidents, namely, route number, direction, and mile post, and extracts correct event locations for accurate traffic accident visualization and analysis. The proposed method generates a new shape file for traffic accidents and displays them on top of the existing road network in ArcGIS. Visualization of traffic accidents along Hampton Roads Bridge Tunnel is included to demonstrate the effectiveness of the proposed method.
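
Placing an event from a (route number, direction, milepost) record is a linear-referencing computation: measure along the route's polyline geometry to the given milepost. A sketch with a hypothetical route geometry (the route number and direction would select which polyline to measure along):

```python
import numpy as np

# Hypothetical route geometry: polyline vertices (x, y) in miles,
# with the route measure (milepost) increasing along the line.
route = np.array([(0.0, 0.0), (3.0, 4.0), (3.0, 9.0), (8.0, 9.0)])

def locate_event(route, milepost):
    """Linear referencing: return the (x, y) point at a given milepost
    measured along the route polyline."""
    seg = np.diff(route, axis=0)
    seg_len = np.hypot(seg[:, 0], seg[:, 1])
    cum = np.concatenate([[0.0], np.cumsum(seg_len)])  # measure at each vertex
    if not 0.0 <= milepost <= cum[-1]:
        raise ValueError("milepost outside route extent")
    i = min(np.searchsorted(cum, milepost, side="right") - 1, len(seg) - 1)
    frac = (milepost - cum[i]) / seg_len[i]
    return route[i] + frac * seg[i]

# Accident record at milepost 7.5 on this route.
pt = locate_event(route, 7.5)
print("event location:", pt)
```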

  1. Robustness of reliability-growth analysis techniques

    NASA Astrophysics Data System (ADS)

    Ellis, Karen E.

    The author examines the robustness of techniques commonly applied to failure time data to determine if the failure rate (1/mean-time-between-failures) is changing over time. The models examined are the Duane postulate, the Crow-AMSAA (Army Materiel Systems Analysis Activity) model, and Kalman filtering (also referred to as dynamic linear modeling). Each has as a foundation the underlying premise of changing failure rate over time. The techniques seek to confirm or reject whether failure rate is changing significantly, based on observed data. To compare the ability of each method to accomplish such a rejection or confirmation, a known failure time distribution is simulated, and then each model is applied and results are compared.
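
For a time-terminated test, the Crow-AMSAA (NHPP power-law) model has closed-form maximum-likelihood estimates, and a shape parameter β below (above) 1 indicates an improving (deteriorating) failure rate. A sketch with invented failure times:

```python
import math

# Hypothetical failure times (hours) observed during a test of length T;
# the growing inter-failure times suggest reliability growth.
t = [12.0, 40.0, 95.0, 180.0, 310.0, 490.0, 700.0]
T = 800.0  # total test time (time-terminated)

n = len(t)
# Crow-AMSAA maximum-likelihood estimates for the power-law intensity
# u(t) = lam * beta * t**(beta - 1):
beta = n / sum(math.log(T / ti) for ti in t)
lam = n / T ** beta

# beta < 1: decreasing failure rate (growth); beta > 1: deterioration;
# beta = 1: constant failure rate (homogeneous Poisson process).
mtbf_now = T ** (1 - beta) / (lam * beta)  # instantaneous MTBF at time T
print(f"beta = {beta:.3f}, current MTBF ≈ {mtbf_now:.0f} h")
```

Significance of the trend (β ≠ 1) is then checked with the standard chi-square test on 2n/β.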

  2. Final safety analysis report for the Galileo Mission: Volume 2: Book 1, Accident model document

    SciTech Connect

    Not Available

    1988-12-15

    The Accident Model Document (AMD) is the second volume of the three volume Final Safety Analysis Report (FSAR) for the Galileo outer planetary space science mission. This mission employs Radioisotope Thermoelectric Generators (RTGs) as the prime electrical power sources for the spacecraft. Galileo will be launched into Earth orbit using the Space Shuttle and will use the Inertial Upper Stage (IUS) booster to place the spacecraft into an Earth escape trajectory. The RTGs employ silicon-germanium thermoelectric couples to produce electricity from the heat energy that results from the decay of the radioisotope fuel, Plutonium-238, used in the RTG heat source. The heat source configuration used in the RTGs is termed General Purpose Heat Source (GPHS), and the RTGs are designated GPHS-RTGs. The use of radioactive material in these missions necessitates evaluations of the radiological risks that may be encountered by launch complex personnel as well as by the Earth's general population resulting from postulated malfunctions or failures occurring in the mission operations. The FSAR presents the results of a rigorous safety assessment, including substantial analyses and testing, of the launch and deployment of the RTGs for the Galileo mission. This AMD is a summary of the potential accident and failure sequences which might result in fuel release, the analysis and testing methods employed, and the predicted source terms. Each source term consists of a quantity of fuel released, the location of release and the physical characteristics of the fuel released. Each source term has an associated probability of occurrence. 27 figs., 11 tabs.

  3. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
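
The Gaussian plume model (GPM) through which the sampled dispersion parameters were propagated has the standard ground-reflection form. A sketch with illustrative inputs (in a consequence code the dispersion parameters σy and σz at a given downwind distance would come from stability-class correlations or, here, from the elicited distributions):

```python
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (g/m^3) at crosswind
    offset y and height z, for source strength Q (g/s), wind speed u (m/s),
    and effective release height H (m). sigma_y, sigma_z are the dispersion
    parameters evaluated at the downwind distance of interest."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    # direct term plus image-source term reflecting the plume off the ground
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centerline ground-level concentration with illustrative inputs
# (sigma values typical of ~1 km downwind in neutral conditions).
c = gaussian_plume(Q=10.0, u=5.0, y=0.0, z=0.0, H=50.0,
                   sigma_y=80.0, sigma_z=40.0)
print(f"C = {c:.2e} g/m^3")
```

An uncertainty analysis repeats this evaluation for each sampled (σy, σz) pair and examines the resulting concentration distribution.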

  4. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The two new probabilistic accident consequence codes MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  5. COSIMA data analysis using multivariate techniques

    NASA Astrophysics Data System (ADS)

    Silén, J.; Cottin, H.; Hilchenbach, M.; Kissel, J.; Lehto, H.; Siljeström, S.; Varmuza, K.

    2015-02-01

    We describe how to use multivariate analysis of complex TOF-SIMS (time-of-flight secondary ion mass spectrometry) spectra by introducing the method of random projections. The technique allows us to do full clustering and classification of the measured mass spectra. In this paper we use the tool for classification purposes. The presentation describes calibration experiments of 19 minerals on Ag and Au substrates using positive mode ion spectra. Discrimination between individual minerals yields a cross-validation Cohen's κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
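
The random-projection approach rests on the Johnson-Lindenstrauss property: multiplying by a random Gaussian matrix approximately preserves pairwise distances, so classification can run in a much lower-dimensional space. A sketch on synthetic stand-in spectra (not COSIMA data), scored with Cohen's κ as in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for TOF-SIMS spectra: 3 "mineral" classes, each a
# noisy high-dimensional template (real spectra have thousands of channels).
n_dim, n_per_class, k_proj = 2000, 30, 128
templates = rng.random((3, n_dim))
labels = np.repeat(np.arange(3), n_per_class)
spectra = templates[labels] + 0.25 * rng.standard_normal((len(labels), n_dim))

# Random projection to k_proj dimensions (Johnson-Lindenstrauss).
R = rng.standard_normal((k_proj, n_dim)) / np.sqrt(k_proj)
low = spectra @ R.T

# Leave-one-out nearest-neighbour classification in the projected space.
d = np.linalg.norm(low[:, None] - low[None, :], axis=-1)
np.fill_diagonal(d, np.inf)
pred = labels[d.argmin(1)]

# Cohen's kappa: observed agreement corrected for chance agreement.
po = (pred == labels).mean()
pe = sum((labels == c).mean() * (pred == c).mean() for c in range(3))
kappa = (po - pe) / (1 - pe)
print(f"Cohen kappa = {kappa:.2f}")
```

The projection step reduces each distance computation from n_dim to k_proj operations, which is what makes the method attractive as a fast similarity tool.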

  6. A comparative analysis of accident risks in fossil, hydro, and nuclear energy chains

    SciTech Connect

    Burgherr, P.; Hirschberg, S.

    2008-07-01

    This study presents a comparative assessment of severe accident risks in the energy sector, based on the historical experience of fossil (coal, oil, natural gas, and LPG (Liquefied Petroleum Gas)) and hydro chains contained in the comprehensive Energy-related Severe Accident Database (ENSAD), as well as Probabilistic Safety Assessment (PSA) for the nuclear chain. Full energy chains were considered because accidents can take place at every stage of the chain. Comparative analyses for the years 1969-2000 included a total of 1870 severe (≥ 5 fatalities) accidents, amounting to 81,258 fatalities. Although 79.1% of all accidents and 88.9% of associated fatalities occurred in less developed, non-OECD countries, industrialized OECD countries dominated insured losses (78.0%), reflecting their substantially higher insurance density and stricter safety regulations. Aggregated indicators and frequency-consequence (F-N) curves showed that energy-related accident risks in non-OECD countries are distinctly higher than in OECD countries. Hydropower in non-OECD countries and upstream stages within fossil energy chains are most accident-prone. Expected fatality rates are lowest for Western hydropower and nuclear power plants; however, the maximum credible consequences can be very large. Total economic damages due to severe accidents are substantial, but small when compared with natural disasters. Similarly, external costs associated with severe accidents are generally much smaller than monetized damages caused by air pollution.
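
A frequency-consequence (F-N) curve plots the expected annual frequency F(N) of accidents with N or more fatalities, so curves lying higher indicate riskier energy chains. A sketch with invented accident records (not ENSAD data):

```python
import numpy as np

# Illustrative severe-accident records for one energy chain:
# fatalities per accident over an observation period of `years` years.
fatalities = np.array([5, 5, 6, 8, 9, 12, 15, 21, 34, 55, 89, 144, 300])
years = 32  # e.g., 1969-2000

def fn_curve(fatalities, years, n_values):
    """F(N): expected frequency per year of accidents with >= N fatalities
    (a complementary cumulative frequency, non-increasing in N)."""
    return np.array([(fatalities >= n).sum() / years for n in n_values])

n_values = np.array([5, 10, 50, 100, 300])
fvals = fn_curve(fatalities, years, n_values)
for n, f in zip(n_values, fvals):
    print(f"F(>={n:3d} fatalities) = {f:.3f} / year")
```

F-N curves are conventionally drawn on log-log axes so that chains spanning several orders of magnitude in consequence can be compared.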

  7. Analysis of the Uniform Accident And Sickness Policy Provision Law: lessons for social work practice, policy, and research.

    PubMed

    Cochran, Gerald

    2010-01-01

    The Uniform Accident and Sickness Policy Provision Law (UPPL) is a state statute that allows insurance companies in 26 states to deny claims for accidents and injuries incurred by persons under the influence of drugs or alcohol. Serious repercussions can result for patients and health care professionals as states enforce this law. To examine differences within the laws that might facilitate amendments or reduce insurance companies' ability to deny claims, a content analysis was carried out of each state's UPPL law. Results showed no meaningful differences between each state's laws. These results indicate patients and health professionals share similar risk related to the UPPL regardless of state. PMID:20711944

  8. Defense In-Depth Accident Analysis Evaluation of Tritium Facility Bldgs. 232-H, 233-H, and 234-H

    SciTech Connect

    Blanchard, A.

    1999-05-10

    The primary purpose of this report is to document a Defense-in-Depth (DID) accident analysis evaluation for Department of Energy (DOE) Savannah River Site (SRS) Tritium Facility Buildings 232-H, 233-H, and 234-H. The purpose of a DID evaluation is to provide a more realistic view of facility radiological risks to the offsite public than the bounding deterministic analysis documented in the Safety Analysis Report, which credits only Safety Class items in the offsite dose evaluation.

  9. Application of Electromigration Techniques in Environmental Analysis

    NASA Astrophysics Data System (ADS)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentration of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made, widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, and improving the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the domain of capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is made on pre-capillary and on-capillary chromatography and electrophoresis-based concentration of analytes and detection improvement.

  10. Image computing techniques to extrapolate data for dust tracking in case of an experimental accident simulation in a nuclear fusion plant.

    PubMed

    Camplani, M; Malizia, A; Gelfusa, M; Barbato, F; Antonelli, L; Poggi, L A; Ciparisse, J F; Salgado, L; Richetta, M; Gaudio, P

    2016-01-01

    In this paper, a preliminary shadowgraph-based analysis of dust particle re-suspension due to a loss of vacuum accident (LOVA) in ITER-like nuclear fusion reactors is presented. Dust particles are produced through different mechanisms in nuclear fusion devices; one of the main issues is that these particles can be re-suspended by events such as a LOVA. Shadowgraphy is based on an expanded, collimated beam of light, emitted by a laser or a lamp transversely to the flow field direction. In the STARDUST facility, dust moving in the flow causes variations of refractive index that can be detected using a CCD camera. The STARDUST fast camera setup makes it possible to detect and track dust particles moving in the vessel and thus to obtain the velocity field of the mobilized dust. In particular, the acquired images are processed so that, in each frame, moving dust particles are detected by applying a background subtraction technique based on the mixture-of-Gaussians algorithm. The resulting foreground masks are then filtered with morphological operations. Finally, a multi-object tracking algorithm is used to track the detected particles over the experiment. A Kalman filter-based tracker is applied to each particle; the particle dynamics are described using position, velocity, and acceleration as state variables. The results demonstrate that it is possible to obtain the dust particles' velocity field during a LOVA by automatically processing the data obtained with the shadowgraph approach. PMID:26827318
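    The per-particle Kalman tracking step described above can be sketched as follows. This is a minimal one-dimensional illustration with a constant-acceleration state; the noise tuning, initialization, and function name are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def kalman_track_1d(measurements, dt=1.0, meas_var=0.25, proc_var=1e-3):
    """Track one dust particle along a single axis with a Kalman filter
    whose state is [position, velocity, acceleration], as in the paper.
    All numeric tuning values here are illustrative assumptions."""
    # Constant-acceleration state-transition model.
    F = np.array([[1.0, dt, 0.5 * dt**2],
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])
    H = np.array([[1.0, 0.0, 0.0]])      # only position is observed
    Q = proc_var * np.eye(3)             # process noise covariance
    R = np.array([[meas_var]])           # measurement noise covariance
    x = np.array([[measurements[0]], [0.0], [0.0]])  # initial state
    P = np.eye(3)                        # initial state covariance
    estimates = []
    for z in measurements:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the detected particle position z
        y = np.array([[z]]) - H @ x      # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        x = x + K @ y
        P = (np.eye(3) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates
```

    In the actual pipeline, one such filter per detected particle would run on the 2-D centroid coordinates extracted from the foreground masks.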

  11. Image computing techniques to extrapolate data for dust tracking in case of an experimental accident simulation in a nuclear fusion plant

    NASA Astrophysics Data System (ADS)

    Camplani, M.; Malizia, A.; Gelfusa, M.; Barbato, F.; Antonelli, L.; Poggi, L. A.; Ciparisse, J. F.; Salgado, L.; Richetta, M.; Gaudio, P.

    2016-01-01

    In this paper, a preliminary shadowgraph-based analysis of dust particle re-suspension due to a loss of vacuum accident (LOVA) in ITER-like nuclear fusion reactors is presented. Dust particles are produced through different mechanisms in nuclear fusion devices; one of the main issues is that these particles can be re-suspended by events such as a LOVA. Shadowgraphy is based on an expanded, collimated beam of light, emitted by a laser or a lamp transversely to the flow field direction. In the STARDUST facility, dust moving in the flow causes variations of refractive index that can be detected using a CCD camera. The STARDUST fast camera setup makes it possible to detect and track dust particles moving in the vessel and thus to obtain the velocity field of the mobilized dust. In particular, the acquired images are processed so that, in each frame, moving dust particles are detected by applying a background subtraction technique based on the mixture-of-Gaussians algorithm. The resulting foreground masks are then filtered with morphological operations. Finally, a multi-object tracking algorithm is used to track the detected particles over the experiment. A Kalman filter-based tracker is applied to each particle; the particle dynamics are described using position, velocity, and acceleration as state variables. The results demonstrate that it is possible to obtain the dust particles' velocity field during a LOVA by automatically processing the data obtained with the shadowgraph approach.

  12. A numerical comparison of sensitivity analysis techniques

    SciTech Connect

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analyses methodologies, but none as comprehensive as the current work.
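    The normalized one-at-a-time sensitivity index that such comparisons typically start from can be sketched as follows. This is a generic illustration via central finite differences; the function name, step size, and example model are assumptions, not Hamby's actual dosimetry model:

```python
import numpy as np

def relative_sensitivities(model, x0, rel_step=1e-4):
    """One-at-a-time normalized sensitivity indices
    S_i = (dY/dX_i) * (X_i / Y), estimated by central finite
    differences around the nominal parameter vector x0.
    Assumes nominal parameter values and model output are nonzero."""
    x0 = np.asarray(x0, dtype=float)
    y0 = model(x0)
    S = np.zeros_like(x0)
    for i in range(x0.size):
        h = rel_step * x0[i]
        xp, xm = x0.copy(), x0.copy()
        xp[i] += h
        xm[i] -= h
        dy_dx = (model(xp) - model(xm)) / (2.0 * h)  # central difference
        S[i] = dy_dx * x0[i] / y0
    return S
```

    For a power-law model such as Y = X1 * X2 * X3^2, the indices recover the exponents (1, 1, 2), which is one way to sanity-check a sensitivity ranking method.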

  13. Analysis of fission product revaporization in a BWR Reactor Coolant System during a station blackout accident

    SciTech Connect

    Yang, J.W.; Schmidt, E.; Cazzoli, E.; Khatib-Rahbar, M.

    1988-01-01

    This paper presents an analysis of fission product revaporization from the Reactor Coolant System (RCS) following the Reactor Pressure Vessel (RPV) failure. The station blackout accident in a BWR Mark I Power Plant was considered. The TRAPMELT3 models for vaporization, chemisorption, and the decay heating of RCS structures and gases were used and extended beyond the RPV failure in the analysis. The RCS flow models based on the density-difference or pressure-difference between the RCS and containment pedestal region were developed to estimate the RCS outflow which carries the revaporized fission product to the containment. A computer code called REVAP was developed for the analysis. The REVAP code was incorporated with the MARCH, TRAPMELT3 and NAUA codes from the Source Term Code Package (STCP) to estimate the impact of revaporization on environmental release. The results show that the thermal-hydraulic conditions between the RCS and the pedestal region are important factors in determining the magnitude of revaporization and subsequent release of the volatile fission product into the environment. 6 refs., 8 figs.

  14. TRACE/PARCS Core Modeling of a BWR/5 for Accident Analysis of ATWS Events

    SciTech Connect

    Cuadra A.; Baek J.; Cheng, L.; Aronson, A.; Diamond, D.; Yarsky, P.

    2013-11-10

    The TRACE/PARCS computational package [1, 2] is designed to be applicable to the analysis of light water reactor operational transients and accidents where the coupling between the neutron kinetics (PARCS) and the thermal-hydraulics and thermal-mechanics (TRACE) is important. TRACE/PARCS has been assessed for its applicability to anticipated transients without scram (ATWS) [3]. The challenge, addressed in this study, is to develop a sufficiently rigorous input model that would be acceptable for use in ATWS analysis. Two types of ATWS events were of interest: a turbine trip and a closure of the main steam isolation valves (MSIVs). In the first type, initiated by turbine trip, the concern is that the core will become unstable and large power oscillations will occur. In the second type, initiated by MSIV closure, the concern is the amount of energy being placed into containment and the resulting emergency depressurization. Two separate TRACE/PARCS models of a BWR/5 were developed to analyze these ATWS events at MELLLA+ (maximum extended load line limit plus) operating conditions. One model [4] was used for analysis of ATWS events leading to instability (ATWS-I); the other [5] for ATWS events leading to emergency depressurization (ATWS-ED). Both models included a large portion of the nuclear steam supply system and controls, and a detailed core model, presented henceforth.

  15. The Wheels of Misfortune: A Time Series Analysis of Bicycle Accidents on a College Campus.

    ERIC Educational Resources Information Center

    Johnson, Mark S.; And Others

    1978-01-01

    The effectiveness of engineering and policy interventions in reducing bicycle accidents on the campus of the University of California at Santa Barbara was investigated. None of the bikeway modifications were found to have been effective in reducing bicycle accidents. (Author/GDC)

  16. 3W approach to the investigation, analysis, and prevention of human-error aircraft accidents.

    PubMed

    Ricketson, D S; Brown, W R; Graham, K N

    1980-09-01

    Human error is the largest cause of U.S. Army aircraft accidents. An approach to this problem is presented which is based on a model of the human-error accident. This 3W approach identifies what task error (TE) caused or contributed to the accident, what inadequacy (I) in the aviation system caused or allowed the TE to occur, and what remedial measure (R) is required to correct the I. Eighty-two human-error accidents were analyzed to identify TEIR information. Statistically important Is were identified which could be remedied based on accident costs. Potentially cost-effective remedial actions were then ranked on a cost-benefit totem pole. The totem pole was given to the aviation system manager as a management tool to assist in determining priorities for corrective actions. PMID:7417175

  17. Emergency drinking water treatment during source water pollution accidents in China: origin analysis, framework and technologies.

    PubMed

    Zhang, Xiao-Jian; Chen, Chao; Lin, Peng-Fei; Hou, Ai-Xin; Niu, Zhang-Bin; Wang, Jun

    2011-01-01

    China has suffered frequent source water contamination accidents in the past decade, which has resulted in severe consequences to the water supply of millions of residents. The origins of typical cases of contamination are discussed in this paper as well as the emergency response to these accidents. In general, excessive pursuit of rapid industrialization and the unreasonable location of factories are responsible for the increasing frequency of accidental pollution events. Moreover, insufficient attention to environmental protection and rudimentary emergency response capability has exacerbated the consequences of such accidents. These environmental accidents triggered or accelerated the promulgation of stricter environmental protection policy and the shift from economic development mode to a more sustainable direction, which should be regarded as the turning point of environmental protection in China. To guarantee water security, China is trying to establish a rapid and effective emergency response framework, build up the capability of early accident detection, and develop efficient technologies to remove contaminants from water. PMID:21133359

  18. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. Active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling must be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication, based on both an analytical derivation and empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  19. Development of NASA's Accident Precursor Analysis Process Through Application on the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Maggio, Gaspare; Groen, Frank; Hamlin, Teri; Youngblood, Robert

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system. APA does more than simply track experience: it systematically evaluates experience, looking for under-appreciated risks that may warrant changes to design or operational practice. This paper presents the pilot application of the NASA APA process to Space Shuttle Orbiter systems. In this effort, the working sessions conducted at Johnson Space Center (JSC) piloted the APA process developed by Information Systems Laboratories (ISL) over the last two years under the auspices of NASA's Office of Safety & Mission Assurance, with the assistance of the Safety & Mission Assurance (S&MA) Shuttle & Exploration Analysis Branch. This process is built around facilitated working sessions involving diverse system experts. One important aspect of this particular APA process is its focus on understanding the physical mechanism responsible for an operational anomaly, followed by evaluation of the risk significance of the observed anomaly as well as consideration of generalizations of the underlying mechanism to other contexts. Model completeness will probably always be an issue, but this process tries to leverage operating experience to the extent possible in order to address completeness issues before a catastrophe occurs.

  20. Overview of the Aerothermodynamics Analysis Conducted in Support of the STS-107 Accident Investigation

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.

    2004-01-01

    A graphic presentation of the aerothermodynamics analysis conducted in support of the STS-107 accident investigation. Investigation efforts were conducted as part of an integrated AATS team (Aero, Aerothermal, Thermal, Stress) directed by OVEWG. Graphics presented are: STS-107 Entry trajectory and timeline (1st off-nominal event to Post-LOS); Indications from OI telemetry data; Aero/aerothermo/thermal analysis process; Selected STS-107 side fuselage/OMS pod off-nominal temperatures; Leading edge structural subsystem; Relevant forensics evidence; External aerothermal environments; STS-107 Pre-entry EOM3 heating profile; Surface heating and temperatures; Orbiter wing leading edge damage survey; Internal aerothermal environments; Orbiter wing CAD model; Aerodynamic flight reconstruction; Chronology of aerodynamic/aerothermodynamic contributions; Acreage TPS tile damage; Larger OML perturbations; Missing RCC panel(s); Localized damage to RCC panel/missing T-seal; RCC breach with flow ingestion; and Aero-aerothermal closure. NAIT served as the interface between the CAIB and NASA investigation teams, and CAIB requests for study were addressed.

  1. Analysis of offsite Emergency Planning Zones (EPZs) for the Rocky Flats Plant. Phase 3, Sitewide spectrum-of-accidents and bounding EPZ analysis

    SciTech Connect

    Petrocchi, A.J.; Zimmerman, G.A.

    1994-03-14

    During Phase 3 of the EPZ project, a sitewide analysis will be performed applying a spectrum-of-accidents approach to both radiological and nonradiological hazardous materials release scenarios. This analysis will include the MCA but will be wider in scope and will produce options for the State of Colorado for establishing a bounding EPZ that is intended to more comprehensively update the interim, preliminary EPZ developed in Phase 2. EG&G will propose use of a hazards assessment methodology that is consistent with the DOE Emergency Management Guide for Hazards Assessments and other methods required by DOE orders. This will include hazards, accident, safety, and risk analyses. Using this methodology, EG&G will develop technical analyses for a spectrum of accidents. The analyses will show the potential effects from the spectrum of accidents on the offsite population together with identification of offsite vulnerable zones and areas of concern. These analyses will incorporate state-of-the-art technology for accident analysis, atmospheric plume dispersion modeling, consequence analysis, and the application of these evaluations to the general public population at risk. The analyses will treat both radiological and nonradiological hazardous materials and mixtures of both released accidentally to the atmosphere. DOE/RFO will submit these results to the State of Colorado for the State's use in determining offsite emergency planning zones for the Rocky Flats Plant. In addition, the results will be used for internal Rocky Flats Plant emergency planning.

  2. Analysis of Surface Water Pollution Accidents in China: Characteristics and Lessons for Risk Management

    NASA Astrophysics Data System (ADS)

    Yao, Hong; Zhang, Tongzhu; Liu, Bo; Lu, Feng; Fang, Shurong; You, Zhen

    2016-04-01

    Understanding historical accidents is important for accident prevention and risk mitigation; however, there are no public databases of pollution accidents in China, and no detailed information regarding such incidents is readily available. Thus, 653 representative cases of surface water pollution accidents in China were identified and described as a function of time, location, materials involved, origin, and causes. The severity and other features of the accidents, frequency and quantities of chemicals involved, frequency and number of people poisoned, frequency and number of people affected, frequency and time for which pollution lasted, and frequency and length of pollution zone were effectively used to value and estimate the accumulated probabilities. The probabilities of occurrences of various types based on origin and causes were also summarized based on these observations. The following conclusions can be drawn from these analyses: (1) There was a high proportion of accidents involving multi-district boundary regions and drinking water crises, indicating that more attention should be paid to environmental risk prevention and the mitigation of such incidents. (2) A high proportion of accidents originated from small-sized chemical plants, indicating that these types of enterprises should be considered during policy making. (3) The most common cause (49.8% of the total) was intentional acts (illegal discharge); accordingly, efforts to increase environmental consciousness in China should be enhanced.
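    The accumulated probabilities described above are, in essence, empirical cumulative distributions over the observed cases. A minimal sketch of how such curves are estimated; the function names and sample values are hypothetical illustrations, not drawn from the 653-case dataset:

```python
import numpy as np

def empirical_cdf(values):
    """Empirical cumulative distribution of an observed severity
    measure (e.g., length of the pollution zone per accident).
    Returns the sorted values and cumulative probabilities P(X <= x)."""
    v = np.sort(np.asarray(values, dtype=float))
    p = np.arange(1, v.size + 1) / v.size
    return v, p

def exceedance_probability(values, threshold):
    """Fraction of accidents whose severity measure exceeds a threshold."""
    v = np.asarray(values, dtype=float)
    return float((v > threshold).mean())
```

    Plotting the `empirical_cdf` output for each severity measure (people affected, pollution duration, zone length) yields the accumulated-probability curves the study reports.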

  3. Analysis of Surface Water Pollution Accidents in China: Characteristics and Lessons for Risk Management.

    PubMed

    Yao, Hong; Zhang, Tongzhu; Liu, Bo; Lu, Feng; Fang, Shurong; You, Zhen

    2016-04-01

    Understanding historical accidents is important for accident prevention and risk mitigation; however, there are no public databases of pollution accidents in China, and no detailed information regarding such incidents is readily available. Thus, 653 representative cases of surface water pollution accidents in China were identified and described as a function of time, location, materials involved, origin, and causes. The severity and other features of the accidents, frequency and quantities of chemicals involved, frequency and number of people poisoned, frequency and number of people affected, frequency and time for which pollution lasted, and frequency and length of pollution zone were effectively used to value and estimate the accumulated probabilities. The probabilities of occurrences of various types based on origin and causes were also summarized based on these observations. The following conclusions can be drawn from these analyses: (1) There was a high proportion of accidents involving multi-district boundary regions and drinking water crises, indicating that more attention should be paid to environmental risk prevention and the mitigation of such incidents. (2) A high proportion of accidents originated from small-sized chemical plants, indicating that these types of enterprises should be considered during policy making. (3) The most common cause (49.8% of the total) was intentional acts (illegal discharge); accordingly, efforts to increase environmental consciousness in China should be enhanced. PMID:26739714

  4. An analysis of civil aviation propeller-to-person accidents: 1965-79.

    PubMed

    Collins, W E; Mastrullo, A R; Kirkham, W R; Taylor, D K; Grape, P M

    1982-05-01

    The interest of manufacturing, governmental, and safety personnel in using paint schemes on propeller and rotor blades is based on improving the visual conspicuity of those blades when they are rotating. While propeller and rotor paint schemes may serve to reduce the number of fatalities and injuries due to contact with a rotating blade, there is little information about the circumstances surrounding such accidents. Brief reports provided by the National Transportation Safety Board of all "propeller-to-person" accidents from 1965-79 were examined and analyzed in terms of airport lighting conditions, actions of pilots, actions of passengers and groundcrew, phase of flight operations, weather conditions, and others. Analyses based on 319 accidents showed a marked drop in the frequency of "propeller-to-person" accidents from 1975 through 1978. Several types of educational efforts directed toward pilots and groundcrew, both prior to and during that 4-year period, were examined as possible factors contributing to the accident rate decline. Accident patterns provide a basis for assessing the probable efficacy of various recommendations, including propeller conspicuity, for further reducing "propeller-to-person" accidents. PMID:7092754

  5. The effect of gamma-ray transport on afterheat calculations for accident analysis

    SciTech Connect

    Reyes, S.; Latkowski, J.F.; Sanz, J.

    2000-05-01

    Radioactive afterheat is an important source term for the release of radionuclides in fusion systems under accident conditions. Heat transfer calculations are used to determine time-temperature histories in regions of interest, but the true source term needs to be the effective afterheat, which considers the transport of penetrating gamma rays. Without consideration of photon transport, accident temperatures may be overestimated in some regions and underestimated in others. The importance of this effect is demonstrated for a simple, one-dimensional problem. The significance of this effect depends strongly on the accident scenario being analyzed.

  6. Alternative method of highway traffic safety analysis for developing countries using delphi technique and Bayesian network.

    PubMed

    Mbakwe, Anthony C; Saka, Anthony A; Choi, Keechoo; Lee, Young-Jae

    2016-08-01

    Highway traffic accidents all over the world result in more than 1.3 million fatalities annually. An alarming number of these fatalities occur in developing countries. Many risk factors are associated with frequent accidents, heavy loss of life, and property damage in developing countries. Unfortunately, poor record-keeping practices are a very difficult obstacle to overcome when striving to obtain near-accurate casualty and safety data. In light of the fact that there are numerous accident causes, any attempt to curb the escalating death and injury rates in developing countries must include the identification of the primary accident causes. This paper, therefore, seeks to show that the Delphi Technique is a suitable alternative method that can be exploited to generate highway traffic accident data through which the major accident causes can be identified. To validate the technique, Korea, a country that underwent similar problems in its early stages of development and that maintains excellent highway safety records in its database, was chosen and utilized for this purpose. Validation of the methodology confirms that the technique is suitable for application in developing countries. Furthermore, the Delphi Technique, in combination with a Bayesian Network Model, is utilized in modeling highway traffic accidents and forecasting accident rates in the countries studied. PMID:27183516
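    The core of combining Delphi-elicited cause probabilities with a Bayesian network can be sketched as a one-layer marginalization. All cause names and probability values below are hypothetical illustrations, not values from the study:

```python
def accident_probability(cause_priors, p_accident_given_cause):
    """Marginal accident probability from a one-layer Bayesian network:
    P(A) = sum over causes c of P(A | c) * P(c), where the cause priors
    P(c) would come from a Delphi panel's consensus rankings."""
    total = sum(cause_priors.values())
    assert abs(total - 1.0) < 1e-9, "cause priors must sum to 1"
    return sum(p_accident_given_cause[c] * pc for c, pc in cause_priors.items())
```

    For example, with hypothetical priors {speeding: 0.40, poor road condition: 0.35, vehicle defect: 0.25} and conditionals {0.30, 0.20, 0.10}, the marginal works out to 0.40*0.30 + 0.35*0.20 + 0.25*0.10 = 0.215. A full network would add evidence nodes and conditioning, but the marginalization step is the same.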

  7. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price. DATES....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use...

  8. Lower head creep rupture failure analysis associated with alternative accident sequences of the Three Mile Island Unit 2

    SciTech Connect

    Sang Lung, Chan

    2004-07-01

    The objective of this lower head creep rupture analysis is to assess the current version of MELCOR 1.8.5-RG against SCDAP/RELAP5 MOD 3.3kz. The purpose of this assessment is to investigate the current MELCOR in-vessel core damage progression phenomena, including the model for the formation of a molten pool. The model for stratified molten pool natural heat transfer will be included in the next MELCOR release. Presently, MELCOR excludes the gap heat-transfer model for the cooling associated with the narrow gap between the debris and the lower head vessel wall. All these phenomenological models are already treated in SCDAP/RELAP5 using the COUPLE code to model the heat transfer of the relocated debris with the lower head based on a two-dimensional finite-element method. The assessment should determine whether current MELCOR capabilities adequately cover core degradation phenomena appropriate for the consolidated MELCOR code. Inclusion of these features should bring MELCOR much closer to a state of parity with SCDAP/RELAP5 and is an element currently underway in the MELCOR code consolidation effort. This assessment deals with the following analysis of the Three Mile Island Unit 2 (TMI-2) alternative accident sequences. The TMI-2 alternative accident sequence-1 includes the continuation of the base case of the TMI-2 accident with the Reactor Coolant Pumps (RCPs) tripped and the High Pressure Injection System (HPIS) throttled after approximately 6000 s accident time, while in the TMI-2 alternative accident sequence-2, the reactor coolant pumps are tripped after 6000 s and the HPIS is activated after 12,012 s. The lower head temperature distributions calculated with SCDAP/RELAP5 are visualized and animated with the open source visualization freeware 'OpenDX'. (author)

  9. An analysis of the consequences of accidents involving shipments of multiple Type A radioactive material (RAM) packages

    SciTech Connect

    Finley, N.C.; McClure, J.D.; Reardon, P.C.; Wangler, M.

    1989-01-01

    Comparing the results of the RADTRAN III calculations with a normalized set of results, both for incident-free transport and vehicular accident cases, the calculated consequences in the current analysis are lower. Even for the High-Activity Shipment, the total expected population dose from either incident-free transport or vehicular accidents is small, and smaller than that estimated in USNRC 1977. The results of the simulation in which parameters were varied randomly and independently indicate that, even with the simultaneous occurrence of the least conservative value for each input parameter, the maximum total population dose from the High-Activity Shipment might be as high as 300 person-rem for a single shipment. The values for either of the other shipments (DOT Exemption or Common Carrier) would be significantly lower. The potential average individual radiation doses from accidents involving multiple Type A package shipments are comparable to the increase in the normal background radiation dose of 0.09 rem/person/year (90 mrem) that an individual would receive by moving from sea level to 5000 ft elevation. The maximum dose to an individual (one very near the accident scene) for the High-Activity Shipment would be approximately 0.3 rem (300 mrem) in a maximum severity accident. This is within the individual dose guidelines outlined by NCRP (0.5 rem). Even at the high levels postulated for multiple package shipments under DOT controlled exemptions, the potential risks to the public in terms of expected population dose in the current analysis are below those already found to be acceptable. 4 refs., 3 tabs.

  10. Rapid Disaster Analysis based on SAR Techniques

    NASA Astrophysics Data System (ADS)

    Yang, C. H.; Soergel, U.

    2015-03-01

    Due to its all-day and all-weather capability, spaceborne SAR is a valuable means for rapid mapping during and after a disaster. In this paper, three change detection techniques based on SAR data are discussed: (1) initial coarse change detection, (2) flooded-area detection, and (3) linear-feature change detection. The 2011 Tohoku Earthquake and Tsunami is used as a case study, where the combined earthquake and tsunami events provide a complex case. In (1), pre- and post-event TerraSAR-X images are coregistered accurately to produce a false-color image. Such an image provides a quick and rough overview of potential changes, which is useful for initial decision making and identifies areas worth analysing further in more depth. In (2), the post-event TerraSAR-X image is used to extract the flooded area by morphological approaches. In (3), we are interested in detecting changes of linear shape as indicators of modified man-made objects. Morphological approaches, e.g. thresholding, simply extract pixel-based changes in the difference image. However, in this manner many irrelevant changes are highlighted as well (e.g., farming activity, speckle). In this study, Curvelet filtering is applied to the difference image not only to suppress false alarms but also to enhance change signals of linear form (e.g. buildings) in settlements. Afterwards, thresholding is conducted to extract linear-shaped changed areas. These three techniques are designed to be simple and applicable in timely disaster analysis. They are all validated by comparison with the change map produced by the Center for Satellite Based Crisis Information, DLR.
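    The pixel-based difference-image thresholding that the Curvelet stage refines can be sketched as follows. This is a minimal illustration: the log-ratio operator and k-sigma global threshold are common SAR change-detection practice assumed here, not taken from the paper, and the Curvelet filtering step is omitted:

```python
import numpy as np

def detect_changes(pre, post, k=3.0):
    """Pixel-based change detection on two co-registered SAR amplitude
    images: form a log-ratio difference image, then threshold it at
    k standard deviations above its mean. Returns a boolean change mask."""
    eps = 1e-6                               # avoid log(0) on dark pixels
    d = np.log((post + eps) / (pre + eps))   # log-ratio difference image
    thr = d.mean() + k * d.std()
    return d > thr
```

    As the abstract notes, such a raw mask also flags irrelevant changes (speckle, farming activity); filtering the difference image `d` before thresholding is exactly where a Curvelet or morphological stage would slot in.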

  11. Analysis of labour accidents in tunnel construction and introduction of prevention measures.

    PubMed

    Kikkawa, Naotaka; Itoh, Kazuya; Hori, Tomohito; Toyosawa, Yasuo; Orense, Rolando P

    2015-01-01

    At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there was a decrease in the number of casualties during tunnel construction. However, the incidence of labour accidents during tunnel construction is still relatively high compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents with the characteristics of a rock fall event at a work site. We also introduce accident prevention measures against rock fall events. PMID:26027707

  12. Source terms for analysis of accidents at a high level waste repository

    SciTech Connect

    Mubayi, V.; Davis, R.E.; Youngblood, R.

    1989-01-01

    This paper describes an approach to identifying source terms from possible accidents during the preclosure phase of a high-level nuclear waste repository. A review of the literature on repository safety analyses indicated that source term estimation is in a preliminary stage, largely based on judgement-based scoping analyses. The approach developed here was to partition the accident space into domains defined by certain threshold values of temperature and impact energy density which may arise in potential accidents and specify release fractions of various radionuclides, present in the waste form, in each domain. Along with a more quantitative understanding of accident phenomenology, this approach should help in achieving a clearer perspective on scenarios important to preclosure safety assessments of geologic repositories. 18 refs., 3 tabs.

  13. Traffic Analysis and Road Accidents: A Case Study of Hyderabad using GIS

    NASA Astrophysics Data System (ADS)

    Bhagyaiah, M.; Shrinagesh, B.

    2014-06-01

    Globalization has impacted many developing countries across the world, and India is one country that has benefited the most. Increased economic activity raised consumption levels across the country, which created scope for increased travel and transportation. The growth in vehicles over the last 10 years has put a lot of pressure on the existing roads, ultimately resulting in road accidents. It is estimated that since 2001 there has been an increase of 202 percent in two-wheeler and 286 percent in four-wheeler vehicles, with no road expansion. Motor vehicle crashes are a common cause of death, disability and demand for emergency medical care. Globally, more than 1 million people die each year from traffic crashes and about 20-50 million are injured or permanently disabled. There has been an increasing trend in road accidents in Hyderabad over the past few years. GIS helps in locating accident hotspots and in analyzing the trend of road accidents in Hyderabad.

  14. Analysis of labour accidents in tunnel construction and introduction of prevention measures

    PubMed Central

    KIKKAWA, Naotaka; ITOH, Kazuya; HORI, Tomohito; TOYOSAWA, Yasuo; ORENSE, Rolando P.

    2015-01-01

    At present, almost all mountain tunnels in Japan are excavated and constructed utilizing the New Austrian Tunneling Method (NATM), which was advocated by Prof. Rabcewicz of Austria in 1964. In Japan, this method has been applied to tunnel construction since around 1978, after which there has been a subsequent decrease in the number of casualties during tunnel construction. However, there is still a relatively high incidence of labour accidents during tunnel construction compared to incidence rates in the construction industry in general. During tunnel construction, rock fall events at the cutting face are a particularly characteristic type of accident. In this study, we analysed labour accidents involving rock fall events at work sites. We also introduced accident prevention measures against rock fall events. PMID:26027707

  15. Sensitivity analysis of a ship accident at a deep-ocean site in the northwest Atlantic

    SciTech Connect

    Kaplan, M.F.

    1985-04-01

    This report presents the results of a sensitivity analysis for an HLW ship accident occurring in the Nares Abyssal Plain in the northwestern Atlantic. Waste form release rate, canister lifetime and sorption in the water column (partition coefficients) were varied. Also investigated were the relative importance of the dose from the food chain and from seaweed in the diet. Peak individual doses and integrated collective doses for populations were the units of comparison. In accordance with international guidelines on radiological protection, the comparisons of different options were carried out over "all time"; the study uses a million-year time frame. Partition coefficients have the most pronounced effect on collective dose of the parameters studied. Variations in partition coefficients affect the shape of the collective dose curve over the entire time frame. Peak individual doses decrease markedly when the value for the sorption of americium is increased, but show no increase when less sorption is assumed. Waste form release rates and canister lifetimes affect collective doses only in periods prior to 20,000 years. Hence, comparisons of these options need not be carried out beyond 20,000 years. Waste form release rates below 10^-3/yr (nominal value) affect individual doses in a linear manner, i.e., an order-of-magnitude reduction in release rate leads to an order-of-magnitude reduction in peak individual dose. Little reduction in peak individual doses is seen with canister lifetimes extended beyond the nominal 100 years. 32 refs., 14 figs., 16 tabs.

  16. THERMAL ANALYSIS OF A 9975 PACKAGE IN A FACILITY FIRE ACCIDENT

    SciTech Connect

    Gupta, N.

    2011-02-14

    Surplus plutonium bearing materials in the U.S. Department of Energy (DOE) complex are stored in 3013 containers that are designed to meet the requirements of the DOE standard DOE-STD-3013. The 3013 containers are in turn packaged inside 9975 packages that are designed to meet the NRC 10 CFR Part 71 regulatory requirements for transporting Type B fissile materials across the DOE complex. The design requirements for the hypothetical accident conditions (HAC) involving a fire are given in 10 CFR 71.73. The 9975 packages are stored at the DOE Savannah River Site in the K-Area Material Storage (KAMS) facility for a long term of up to 50 years. The design requirements for safe storage in the KAMS facility, which contains multiple sources of combustible materials, are far more challenging than the HAC requirements in 10 CFR 71.73. While 10 CFR 71.73 postulates an HAC fire of 1475 F and 30 minutes duration, the facility fire calls for a fire of 1500 F and 86 minutes duration. This paper describes a methodology and the analysis results that meet the design limits of the 9975 components and demonstrate the robustness of the 9975 package.

  17. Identification of Behavior Based Safety by Using Traffic Light Analysis to Reduce Accidents

    NASA Astrophysics Data System (ADS)

    Mansur, A.; Nasution, M. I.

    2016-01-01

    This work presents the safety assessment of a case study and describes an important area within field production in the oil and gas industry, namely behavior based safety (BBS). The company set up a rigorous BBS intervention program that is implemented and deployed continually. In this case, observers were requested to hold discussions and put a number of predetermined questions about work behavior to the workers during observation. Traffic Light Analysis (TLA), a risk assessment tool, was used to determine the estimated score of the BBS questionnaire. Standardization of the TLA appraisal in this study is based on the Regulation of the Minister of Labor and Occupational Safety and Health No. PER.05/MEN/1996. The results show that some points fall under 84%, placing them in the yellow category; these should be corrected immediately by the company to prevent existing bad behavior of workers. The application of BBS is expected to increase safety performance at work over time and to be effective in reducing accidents.
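
    The TLA scoring described above can be illustrated with a minimal sketch; the 84% yellow cut-off comes from the abstract, while the 60% red cut-off and the sample scores are hypothetical:

```python
def tla_category(score_pct, red_below=60.0, yellow_below=84.0):
    """Traffic Light Analysis bucketing of a BBS questionnaire score.

    The 84% yellow threshold follows the study summary; the 60% red
    threshold is an illustrative assumption.
    """
    if score_pct < red_below:
        return "red"      # unacceptable, immediate intervention needed
    if score_pct < yellow_below:
        return "yellow"   # should be corrected immediately
    return "green"        # acceptable behaviour

# Three hypothetical questionnaire scores, one per category.
categories = [tla_category(s) for s in (55.0, 78.5, 91.0)]
```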

  18. Accident investigation

    NASA Technical Reports Server (NTRS)

    Laynor, William G. Bud

    1987-01-01

    The National Transportation Safety Board (NTSB) has cited wind shear as a cause or contributing factor in 15 accidents involving transport-category airplanes since 1970. Nine of these were nonfatal, but the other six accounted for 440 lives. Five of the fatal accidents and seven of the nonfatal accidents involved encounters with convective downbursts or microbursts. Of the other accidents, two nonfatal ones were encounters with a frontal system shear, and one fatal one was the result of a terrain-induced wind shear. These accidents are discussed with reference to helping the pilot avoid the wind shear or, where avoidance is impossible, to fly through it.

  19. Function Analysis and Decomposition using Function Analysis Systems Technique

    SciTech Connect

    J. R. Wixson

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Miles's function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective, function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  20. Function Analysis and Decomposition using Function Analysis Systems Technique

    SciTech Connect

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Miles's function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective, function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.
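
    The HOW/WHY decomposition of a FAST model can be sketched as a simple tree, where moving toward a node's children answers "How?" and moving back up answers "Why?"; the flashlight functions and all names below are hypothetical illustrations, not from the report:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Function:
    """One node of a FAST model, stated as verb + noun per FAST convention."""
    name: str
    hows: List["Function"] = field(default_factory=list)  # rightward "How?" expansions

def decompose(fn, depth=0, out=None):
    """Flatten the FAST model into an indented outline ("Why?" is read upward)."""
    if out is None:
        out = []
    out.append("  " * depth + fn.name)
    for sub in fn.hows:
        decompose(sub, depth + 1, out)
    return out

# Hypothetical example: the basic function of a flashlight.
top = Function("provide light", hows=[
    Function("generate current", hows=[Function("store energy")]),
    Function("convert energy", hows=[Function("excite filament")]),
])
outline = decompose(top)
```

    Allocation to components or processes, as the abstract describes, would then attach owners to each node of this tree.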

  1. Highway accident severities and the mixed logit model: an exploratory empirical analysis.

    PubMed

    Milton, John C; Shankar, Venky N; Mannering, Fred L

    2008-01-01

    Many transportation agencies use accident frequencies, and statistical models of accident frequencies, as a basis for prioritizing highway safety improvements. However, the use of accident severities in safety programming has often been limited to the locational assessment of accident fatalities, with little or no emphasis placed on the full severity distribution of accidents (property damage only, possible injury, injury), which is needed to fully assess the benefits of competing safety-improvement projects. In this paper we demonstrate a modeling approach that can be used to better understand the injury-severity distributions of accidents on highway segments, and the effect that traffic, highway and weather characteristics have on these distributions. The approach we use allows for the possibility that estimated model parameters vary randomly across roadway segments to account for unobserved effects potentially relating to roadway characteristics, environmental factors, and driver behavior. Using highway-injury data from Washington State, a mixed (random parameters) logit model is estimated. Estimation findings indicate that volume-related variables such as average daily traffic per lane, average daily truck traffic, truck percentage, interchanges per mile, and weather effects such as snowfall are best modeled as random parameters, while roadway characteristics such as the number of horizontal curves, number of grade breaks per mile and pavement friction are best modeled as fixed parameters. Our results show that the mixed logit model has considerable promise as a methodological tool in highway safety programming. PMID:18215557
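
    The random-parameters idea can be sketched with simulated mixed-logit probabilities, in which coefficients drawn from normal distributions are averaged over simulation draws; the attribute matrix, means, and standard deviations below are made-up illustrations, not the paper's Washington State estimates:

```python
import numpy as np

def mixed_logit_probs(X, beta_mean, beta_sd, n_draws=500, seed=1):
    """Simulated mixed-logit probabilities for J severity alternatives.

    X: (J, K) attribute matrix for one roadway segment.
    Each coefficient beta_k ~ Normal(beta_mean[k], beta_sd[k]) varies
    across segments; probabilities are averaged over the draws.
    """
    rng = np.random.default_rng(seed)
    J, K = X.shape
    draws = rng.normal(beta_mean, beta_sd, size=(n_draws, K))  # (R, K) coefficient draws
    util = draws @ X.T                                         # (R, J) systematic utilities
    expu = np.exp(util - util.max(axis=1, keepdims=True))      # numerically stable softmax
    probs = expu / expu.sum(axis=1, keepdims=True)
    return probs.mean(axis=0)                                  # average over draws

# Hypothetical 3-way severity (PDO / possible injury / injury), 2 attributes.
X = np.array([[0.0, 0.0],
              [1.0, 0.2],
              [1.0, 0.8]])
p = mixed_logit_probs(X, beta_mean=[0.5, -1.0], beta_sd=[0.3, 0.5])
```

    Estimation proper would embed these simulated probabilities in a maximum simulated likelihood over observed severities.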

  2. Biomechanical analysis of occupant kinematics in rollover motor vehicle accidents: dynamic spit test.

    PubMed

    Sances, Anthony; Kumaresan, Srirangam; Clarke, Richard; Herbst, Brian; Meyer, Steve

    2005-01-01

    A better understanding of occupant kinematics in rollover accidents helps to advance biomechanical knowledge and to enhance the safety features of motor vehicles. While many rollover accident simulation studies have adopted the static approach to delineate the occupant kinematics in rollover accidents, very few studies have attempted the dynamic approach. The present work was designed to study the biomechanics of restrained occupants during rollover accidents using the steady-state dynamic spit test and to address the importance of keeping the lap belt fastened. Experimental tests were conducted using an anthropometric 50th percentile Hybrid III dummy in a vehicle. The vehicle was rotated at 180 degrees/second and the dummy was restrained using a standard three-point restraint system. The lap belt of the dummy was fastened either by using the cinching latch plate or by locking the retractor. Three configurations of shoulder belt harness were simulated: shoulder belt loose on chest with cinch plate, shoulder belt under the left arm and shoulder belt behind the chest. In all tests, the dummy stayed within the confines of the vehicle, indicating that the securely fastened lap belt holds the dummy with dynamic movement of 3 1/2" to 4". The results show that occupant movement in rollover accidents is least affected by various shoulder harness positions with a securely fastened lap belt. The present study forms a first step in delineating the biomechanics of occupants in rollover accidents. PMID:15850090

  3. Nasal continuous positive airway pressure (nCPAP) treatment for obstructive sleep apnea, road traffic accidents and driving simulator performance: a meta-analysis.

    PubMed

    Antonopoulos, Constantine N; Sergentanis, Theodoros N; Daskalopoulou, Styliani S; Petridou, Eleni Th

    2011-10-01

    We used meta-analysis to synthesize current evidence regarding the effect of nasal continuous positive airway pressure (nCPAP) on road traffic accidents in patients with obstructive sleep apnea (OSA) as well as on their performance in driving simulator. The primary outcomes were real accidents, near miss accidents, and accident-related events in the driving simulator. Pooled odds ratios (ORs), incidence rate ratios (IRRs) and standardized mean differences (SMDs) were appropriately calculated through fixed or random effects models after assessing between-study heterogeneity. Furthermore, risk differences (RDs) and numbers needed to treat (NNTs) were estimated for real and near miss accidents. Meta-regression analysis was performed to examine the effect of moderator variables and publication bias was also evaluated. Ten studies on real accidents (1221 patients), five studies on near miss accidents (769 patients) and six studies on the performance in driving simulator (110 patients) were included. A statistically significant reduction in real accidents (OR=0.21, 95% CI=0.12-0.35, random effects model; IRR=0.45, 95% CI=0.34-0.59, fixed effects model) and near miss accidents (OR=0.09, 95% CI=0.04-0.21, random effects model; IRR=0.23, 95% CI=0.08-0.67, random effects model) was observed. Likewise, a significant reduction in accident-related events was observed in the driving simulator (SMD=-1.20, 95% CI=-1.75 to -0.64, random effects). The RD for real accidents was -0.22 (95% CI=-0.32 to -0.13, random effects), with NNT equal to five patients (95% CI=3-8), whereas for near miss accidents the RD was -0.47 (95% CI=-0.69 to -0.25, random effects), with NNT equal to two patients (95% CI=1-4). For near miss accidents, meta-regression analysis suggested that nCPAP seemed more effective among patients entering the studies with higher baseline accident rates. 
In conclusion, all three meta-analyses demonstrated a sizeable protective effect of nCPAP on road traffic accidents, both
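
    The pooling arithmetic behind summary estimates like these can be sketched as inverse-variance fixed-effects pooling of log odds ratios, with NNT as the reciprocal of the absolute risk difference; the study-level log ORs and variances below are made up for illustration, while the RD of -0.22 is the value reported above:

```python
import math

def pool_fixed(log_ors, variances):
    """Inverse-variance fixed-effects pooling of study log odds ratios."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))            # standard error of the pooled log OR
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    return math.exp(pooled), (math.exp(lo), math.exp(hi))

def nnt(risk_difference):
    """Number needed to treat: reciprocal of the absolute risk difference."""
    return 1.0 / abs(risk_difference)

# Illustrative (made-up) study-level estimates, not the paper's data.
or_pooled, ci = pool_fixed([math.log(0.3), math.log(0.15), math.log(0.25)],
                           [0.10, 0.20, 0.15])
patients = nnt(-0.22)   # RD for real accidents reported in the abstract
```

    A random-effects version would add a between-study variance term (e.g. DerSimonian-Laird) to each study's weight.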

  4. Proteomic Analysis of Vitreous Biopsy Techniques

    PubMed Central

    Skeie, Jessica M.; Brown, Eric N.; Martinez, Harryl D.; Russell, Stephen R.; Birkholz, Emily S.; Folk, James C.; Boldt, H. Culver; Gehrs, Karen M.; Stone, Edwin M.; Wright, Michael E.; Mahajan, Vinit B.

    2013-01-01

    Purpose To compare vitreous biopsy methods using analysis platforms employed in proteomics biomarker discovery. Methods Vitreous biopsies from 10 eyes were collected sequentially using a 23-gauge needle and a 23-gauge vitreous cutter instrument. Paired specimens were evaluated by UV absorbance spectroscopy, SDS-PAGE, and mass-spectrometry (LC-MS/MS). Results The total protein concentration obtained with a needle and vitrectomy instrument biopsy averaged 1.10 mg/ml (SEM = 0.35) and 1.13 mg/ml (SEM = 0.25), respectively. In eight eyes with low or medium viscidity, there was a very high correlation (R2 = 0.934) between the biopsy methods. When data from two eyes with high viscidity vitreous were included, the correlation was reduced (R2 = 0.704). The molecular weight protein SDS-PAGE profiles of paired needle and vitreous cutter samples were similar, except for a minority of pairs with single band intensity variance. Using LC-MS/MS, equivalent peptides were identified with similar frequencies (R2 ≥ 0.90) in paired samples. Conclusion Proteins and peptides collected from vitreous needle biopsies are nearly equivalent to those obtained from a vitreous cutter instrument. This study suggests both techniques may be used for most proteomic and biomarker discovery studies of vitreoretinal diseases, although a minority of proteins and peptides may differ in concentration. PMID:23095728

  5. Magnetic Analysis Techniques Applied to Desert Varnish

    NASA Technical Reports Server (NTRS)

    Schmidgall, E. R.; Moskowitz, B. M.; Dahlberg, E. D.; Kuhlman, K. R.

    2003-01-01

    Desert varnish is a black or reddish coating commonly found on rock samples from arid regions. Typically, the coating is very thin, less than half a millimeter thick. Previous research has shown that the primary components of desert varnish are silicon oxide clay minerals (60%), manganese and iron oxides (20-30%), and trace amounts of other compounds [1]. Desert varnish is thought to originate when windborne particles containing iron and manganese oxides are deposited onto rock surfaces where manganese oxidizing bacteria concentrate the manganese and form the varnish [4,5]. If desert varnish is indeed biogenic, then the presence of desert varnish on rock surfaces could serve as a biomarker, indicating the presence of microorganisms. This idea has considerable appeal, especially for Martian exploration [6]. Magnetic analysis techniques have not been extensively applied to desert varnish. The only previous magnetic study reported that, based on room temperature demagnetization experiments, there were noticeable differences in magnetic properties between a sample of desert varnish and the substrate sandstone [7]. Based upon the results of the demagnetization experiments, the authors concluded that the primary magnetic component of desert varnish was either magnetite (Fe3O4) or maghemite (γ-Fe2O3).

  6. Preclosure radiological safety analysis for accident conditions of the potential Yucca Mountain Repository: Underground facilities; Yucca Mountain Site Characterization Project

    SciTech Connect

    Ma, C.W.; Sit, R.C.; Zavoshy, S.J.; Jardine, L.J.; Laub, T.W.

    1992-06-01

    This preliminary preclosure radiological safety analysis assesses the scenarios, probabilities, and potential radiological consequences associated with postulated accidents in the underground facility of the potential Yucca Mountain repository. The analysis follows a probabilistic-risk-assessment approach. Twenty-one event trees resulting in 129 accident scenarios are developed. Most of the scenarios have estimated annual probabilities ranging from 10{sup {minus}11}/yr to 10{sup {minus}5}/yr. The study identifies 33 scenarios that could result in offsite doses over 50 mrem and that have annual probabilities greater than 10{sup {minus}9}/yr. The largest offsite dose is calculated to be 220 mrem, which is less than the 500 mrem value used to define items important to safety in 10 CFR 60. The study does not address an estimate of uncertainties; therefore, conclusions or decisions made as a result of this report should be made with caution.
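
    The probability bookkeeping in such an event-tree analysis can be sketched as the product of an initiator frequency and the conditional probabilities along each branch of a path; the two-branch tree and all numbers below are hypothetical, far smaller than the study's 21 trees and 129 scenarios:

```python
from itertools import product

def scenario_probability(branch_probs):
    """Annual probability of one event-tree path: initiator frequency
    times the conditional probability of each subsequent branch."""
    p = 1.0
    for b in branch_probs:
        p *= b
    return p

# Hypothetical mini event tree: 1e-4/yr initiator, two binary branch points.
initiator = 1e-4
branches = [(0.9, 0.1), (0.99, 0.01)]  # (success, failure) conditional probabilities
scenarios = [scenario_probability((initiator,) + path)
             for path in product(*branches)]

total = sum(scenarios)  # the paths are exhaustive, so this recovers the initiator frequency
```

    Screening, as in the abstract, would then keep only the paths whose probability and consequence both exceed the chosen thresholds.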

  7. Modeling and analysis of the unprotected loss-of-flow accident in the Clinch River Breeder Reactor

    SciTech Connect

    Morris, E.E.; Dunn, F.E.; Simms, R.; Gruber, E.E.

    1985-01-01

    The influence of fission-gas-driven fuel compaction on the energetics resulting from a loss-of-flow accident was estimated with the aid of the SAS3D accident analysis code. The analysis was carried out as part of the Clinch River Breeder Reactor licensing process. The TREAT tests L6, L7, and R8 were analyzed to assist in the modeling of fuel motion and the effects of plenum fission-gas release on coolant and clad dynamics. Special, conservative modeling was introduced to evaluate the effect of fission-gas pressure on the motion of the upper fuel pin segment following disruption. For the nominal sodium-void worth, fission-gas-driven fuel compaction did not adversely affect the outcome of the transient. When uncertainties in the sodium-void worth were considered, however, it was found that if fuel compaction occurs, loss-of-flow driven transient overpower phenomenology could not be precluded.

  8. Retrospection of Chernobyl nuclear accident for decision analysis concerning remedial actions in Ukraine

    SciTech Connect

    Georgievskiy, Vladimir

    2007-07-01

    This paper considers the efficacy of decisions concerning remedial actions when off-site radiological monitoring in the early and/or intermediate phases was absent or uninformative. There are examples of such situations in the former Soviet Union where many people have been exposed: releases of radioactive materials from 'Krasnoyarsk-26' into the Enisey River, releases of radioactive materials from 'Chelabinsk-65' (the Kishtim accident), nuclear tests at the Semipalatinsk Test Site, the Chernobyl nuclear accident, etc. If monitoring in the early and/or intermediate phases is absent, decisions concerning remedial actions are usually developed on the basis of permanent monitoring. However, decisions of this kind may be essentially erroneous. For these cases it is proposed to reconstruct the radiological data of the early and intermediate phases of a nuclear accident and to develop decisions concerning remedial actions on the basis of both the retrospective data and the permanent monitoring data. In this report the problem is considered using the example of the Chernobyl accident in Ukraine, where off-site radiological monitoring in the early and intermediate phases was unsatisfactory. In particular, pasture-cow-milk monitoring had not been performed. All official dose estimates had been made on the basis of measurements of {sup 137}Cs in the body (40 measurements within 135 days and 55 measurements within 229 days after the Chernobyl accident). For the retrospection of the radiological data of the Chernobyl accident, a dynamic model has been developed with a structure similar to that of the Pathway model and the Farmland model. Parameters of the developed model have been identified for the agricultural conditions of Russia and Ukraine. By means of this model, the dynamics of 20 radionuclides in pathways and the dynamics of doses have been estimated for the early, intermediate and late phases of the Chernobyl accident. The main results are as follows
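
    A dynamic pathway model of the pasture-cow-milk type can be sketched, in drastically simplified form, as a first-order transfer chain; the single-radionuclide chain, the equilibrium milk-transfer assumption, the Euler stepping, and every parameter value below are illustrative assumptions, not the report's 20-radionuclide model:

```python
def pasture_cow_milk(deposition, lam_eff, feed_rate, f_milk, days, dt=0.1):
    """Toy first-order pasture -> cow -> milk transfer chain.

    deposition: initial activity on pasture (Bq/m^2)
    lam_eff:    effective loss rate from pasture (weathering + decay, 1/day)
    feed_rate:  pasture area grazed per day (m^2/day)
    f_milk:     transfer coefficient to milk (day/L)
    Returns a milk concentration time series (Bq/L), Euler-stepped.
    """
    pasture = deposition
    series = []
    for _ in range(int(days / dt)):
        intake = pasture * feed_rate        # Bq/day ingested by the cow
        series.append(intake * f_milk)      # quasi-equilibrium milk concentration
        pasture -= lam_eff * pasture * dt   # first-order loss from pasture
    return series

# Hypothetical deposition event followed by 60 days of grazing.
milk = pasture_cow_milk(deposition=1e4, lam_eff=0.05, feed_rate=45.0,
                        f_milk=0.003, days=60)
```

    A retrospective model of the kind described would couple many such compartments per radionuclide and fit their parameters to the sparse late-phase measurements.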

  9. Testing and analysis of structural integrity of electrosleeved tubes under severe accident transients

    SciTech Connect

    Majumdar, S.

    1999-12-10

    The structural integrity of flawed steam generator tubing with Electrosleeves{trademark} under simulated severe accident transients was analyzed by analytical models that used available material properties data and results from high-temperature tests conducted on Electrosleeved tubes. The Electrosleeve material is almost pure Ni and derives its strength and other useful properties from its nanocrystalline microstructure, which is stable at reactor operating temperatures. However, it undergoes rapid grain growth at the high temperatures expected during severe accidents, resulting in a loss of strength and a corresponding decrease in flow stress. The magnitude of this decrease depends on the time-temperature history during the accident. Failure tests were conducted at ANL and FTI on internally pressurized Electrosleeved tubes with 80% and 100% throughwall machined axial notches in the parent tubes that were subjected to simulated severe accident temperature transients. The test results, together with the analytical model, were used to estimate the unaged flow stress curve of the Electrosleeve material at high temperatures. Failure temperatures for Electrosleeved tubes with throughwall and part-throughwall axial cracks of various lengths in the parent tubes were calculated for a postulated severe accident transient.

  10. Accident analysis of large-scale technological disasters applied to an anaesthetic complication.

    PubMed

    Eagle, C J; Davies, J M; Reason, J

    1992-02-01

    The occurrence of serious accidents in complex industrial systems such as at Three Mile Island and Bhopal has prompted development of new models of causation and investigation of disasters. These analytical models have potential relevance in anaesthesia. We therefore applied one of the previously described systems to the investigation of an anaesthetic accident. The model chosen describes two kinds of failures, both of which must be sought. The first group, active failures, consists of mistakes made by practitioners in the provision of care. The second group, latent failures, represents flaws in the administrative and productive system. The model emphasizes the search for latent failures and shows that prevention of active failures alone is insufficient to avoid further accidents if latent failures persist unchanged. These key features and the utility of this model are illustrated by application to a case of aspiration of gastric contents. While four active failures were recognized, an equal number of latent failures also became apparent. The identification of both types of failures permitted the formulation of recommendations to avoid further occurrences. Thus this model of accident causation can provide a useful mechanism to investigate and possibly prevent anaesthetic accidents. PMID:1544192

  11. Analysis of 121 fatal passenger car-adult pedestrian accidents in China.

    PubMed

    Zhao, Hui; Yin, Zhiyong; Yang, Guangyu; Che, Xingping; Xie, Jingru; Huang, Wei; Wang, Zhengguo

    2014-10-01

    To study the characteristics of fatal vehicle-pedestrian accidents in China, a team was established and passenger car-pedestrian crash cases occurring between 2006 and 2011 in Beijing and Chongqing, China were collected. A total of 121 fatal passenger car-adult pedestrian collisions were sampled and analyzed. The pedestrian injuries were scored according to the Abbreviated Injury Scale (AIS) and Injury Severity Score (ISS). The demographic distributions of fatal pedestrian accidents differed from those of other pedestrian accidents. Among the victims, no significant discrepancy in the distribution of ISS and AIS in head, thorax, abdomen, and extremities by pedestrian age was found, while pedestrian behaviors prior to the crashes may affect the ISS. The distributions of AIS in head, thorax, and abdomen among the fatalities did not show any association with impact speeds or vehicle types, whereas there was a strong relationship between the ISS and impact speeds. Whether pedestrians died at the accident scene or not was not associated with the ISS or AIS. The present results may be useful not only for forensic experts but also for vehicle safety researchers. More investigations of fatal pedestrian accidents need to be conducted in greater detail. PMID:25287805

  12. WASTE-ACC: A computer model for analysis of waste management accidents

    SciTech Connect

    Nabelssi, B.K.; Folga, S.; Kohout, E.J.; Mueller, C.J.; Roglans-Ribas, J.

    1996-12-01

    In support of the U.S. Department of Energy's (DOE's) Waste Management Programmatic Environmental Impact Statement, Argonne National Laboratory has developed WASTE-ACC, a computational framework and integrated PC-based database system, to assess atmospheric releases from facility accidents. WASTE-ACC facilitates the many calculations for the accident analyses necessitated by the numerous combinations of waste types, waste management process technologies, facility locations, and site consolidation strategies in the waste management alternatives across the DOE complex. WASTE-ACC is a comprehensive tool that can effectively test future DOE waste management alternatives and assumptions. The computational framework can access several relational databases to calculate atmospheric releases. The databases contain throughput volumes, waste profiles, treatment process parameters, and accident data such as frequencies of initiators, conditional probabilities of subsequent events, and source term release parameters of the various waste forms under accident stresses. This report describes the computational framework and supporting databases used to conduct accident analyses and to develop source terms to assess potential health impacts that may affect on-site workers and off-site members of the public under various DOE waste management alternatives.
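
    Accident source terms of this kind are conventionally built from the DOE five-factor formula, ST = MAR x DR x ARF x RF x LPF; whether WASTE-ACC uses exactly this decomposition is an assumption here, and the drum-fire values below are purely illustrative:

```python
def source_term(mar, dr, arf, rf, lpf):
    """Airborne respirable source term via the standard DOE five-factor formula.

    mar: material at risk (e.g. g or Ci of a given waste form)
    dr:  damage ratio, fraction of MAR affected by the accident stress
    arf: airborne release fraction
    rf:  respirable fraction of the airborne material
    lpf: leak path factor through confinement
    """
    return mar * dr * arf * rf * lpf

# Hypothetical values for a drum fire scenario (illustrative only).
st = source_term(mar=100.0, dr=0.25, arf=5e-4, rf=0.5, lpf=0.1)
```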

  13. APT Blanket System Loss-of-Flow Accident (LOFA) Analysis Based on Initial Conceptual Design - Case 1: with Beam Shutdown and Active RHR

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report.

  14. APT Blanket System Loss-of-Coolant Accident (LOCA) Analysis Based on Initial Conceptual Design - Case 3: External HR Break at Pump Outlet without Pump Trip

    SciTech Connect

    Hamm, L.L.

    1998-10-07

    This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal (HR) system. These simulations were performed for the Preliminary Safety Analysis Report.

  15. Estimating Loss-of-Coolant Accident Frequencies for the Standardized Plant Analysis Risk Models

    SciTech Connect

    S. A. Eide; D. M. Rasmuson; C. L. Atwood

    2008-09-01

    The U.S. Nuclear Regulatory Commission maintains a set of risk models covering the U.S. commercial nuclear power plants. These standardized plant analysis risk (SPAR) models include several loss-of-coolant accident (LOCA) initiating events such as small (SLOCA), medium (MLOCA), and large (LLOCA). All of these events involve a loss of coolant inventory from the reactor coolant system. In order to maintain a level of consistency across these models, initiating event frequencies generally are based on plant-type average performance, where the plant types are boiling water reactors and pressurized water reactors. For certain risk analyses, these plant-type initiating event frequencies may be replaced by plant-specific estimates. Frequencies for SPAR LOCA initiating events previously were based on results presented in NUREG/CR-5750, but the newest models use results documented in NUREG/CR-6928. The estimates in NUREG/CR-6928 are based on historical data from the initiating events database for pressurized water reactor SLOCA or an interpretation of results presented in the draft version of NUREG-1829. The information in NUREG-1829 can be used several ways, resulting in different estimates for the various LOCA frequencies. Various ways NUREG-1829 information can be used to estimate LOCA frequencies were investigated and this paper presents two methods for the SPAR model standard inputs, which differ from the method used in NUREG/CR-6928. In addition, results obtained from NUREG-1829 are compared with actual operating experience as contained in the initiating events database.
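
    A common convention in NUREG/CR-6928-style initiating-event estimation is the Bayesian update of a Poisson occurrence rate under the Jeffreys prior, whose posterior mean is (n + 0.5)/T; the event count and exposure below are hypothetical:

```python
def jeffreys_rate(n_events, exposure_years):
    """Posterior-mean occurrence rate for a Poisson process under the
    Jeffreys prior: lambda = (n + 0.5) / T, in events per reactor-year."""
    return (n_events + 0.5) / exposure_years

# Hypothetical: 2 SLOCA events observed over 2500 reactor-critical-years.
freq = jeffreys_rate(2, 2500.0)
```

    Plant-specific estimates would apply the same update with the plant's own event count and exposure in place of the industry totals.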

  16. Analysis of hospitalization occurred due to motorcycles accidents in São Paulo city

    PubMed Central

    Gorios, Carlos; Armond, Jane de Eston; Rodrigues, Cintia Leci; Pernambuco, Henrique; Iporre, Ramiro Ortiz; Colombo-Souza, Patrícia

    2015-01-01

    OBJECTIVE: To characterize the motorcycle accidents that occurred in the city of São Paulo, SP, Brazil in 2013, with emphasis on information about hospital admissions from SIH/SUS. METHODS: This is a retrospective cross-sectional study covering 5,597 motorcyclists injured in traffic accidents in the city of São Paulo during 2013. A survey was conducted using secondary data from the Hospital Information System of the Unified Health System (SIH/SUS). RESULTS: In 2013 there were 5,597 admissions of motorcyclists injured in traffic accidents in the city of São Paulo, of which 89.8% were male. The most frequent admission diagnoses were leg fracture, femur fracture, and intracranial injury. CONCLUSION: This study confirms other preliminary studies on several points, among which the higher prevalence of young adult males stands out. Level of Evidence II, Retrospective Study. PMID:26327804

  17. A Comprehensive Analysis of the X-15 Flight 3-65 Accident

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Orr, Jeb S.; Barshi, Immanuel; Statler, Irving C.

    2014-01-01

    The November 15, 1967, loss of X-15 Flight 3-65-97 (hereafter referred to as Flight 3-65) was a unique incident in that it was the first and only aerospace flight accident involving loss of crew on a vehicle with an adaptive flight control system (AFCS). In addition, Flight 3-65 remains the only instance of a single-pilot departure from controlled flight of a manned entry vehicle in a hypersonic flight regime. To mitigate risk to emerging aerospace systems, the NASA Engineering and Safety Center (NESC) proposed a comprehensive review of this accident. The goal of the assessment was to resolve lingering questions regarding the failure modes of the aircraft systems (including the AFCS) and thoroughly analyze the interactions among the human agents and autonomous systems that contributed to the loss of the pilot and aircraft. This document contains the outcome of the accident review.

  18. Comparative analysis of social, demographic, and flight-related attributes between accident and nonaccident general aviation pilots.

    PubMed

    Urban, R F

    1984-04-01

    This investigation represents an exploratory examination of several differentiating social and demographic characteristics for a sample of calendar year 1978 Colorado-resident nonfatal accident-involved pilots and a random sample of nonaccident general aviation (i.e., nonairline) pilots. During 1979-1980, 80 currently active pilots were interviewed by the author, and information concerning the standard demographic variables, in addition to several social, psychological, and flying-related items, was obtained. The sample was generated from commercially available data files derived from U.S. Government records and consisted of 46 accident and 34 nonaccident pilots who resided within a 100-mi radius of Denver, east of the Rocky Mountains. Descriptively, the respondents represented a broad spectrum of general aviation, including corporate pilots, "crop dusters," builders of amateur experimental aircraft, and recreational fliers. Application of stepwise discriminant analysis revealed that the pilots' education, political orientation, birth order, percent of flying for business purposes, participation in nonflying aviation activities, number of years of flying experience, and an index of aviation procedural noncompliance yielded statistically significant results. Furthermore, utilization of the classification capability of discriminant analysis produced a mathematical function which correctly allocated 78.5% of the cases into the appropriate groups, thus contributing to a 56.5% proportionate reduction in error over a random effects model. No relationship was found between accident involvement and several indicators of social attachments, socioeconomic status, and a number of measures of flying exposure. PMID:6732683
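    The discriminant classification described above can be illustrated with a two-group Fisher linear discriminant. The sketch below is a generic stand-in: the data are synthetic, and the two attributes are invented in place of the study's seven significant variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-group data: rows are pilots, columns are two illustrative
# attributes (e.g. years of experience, % business flying). The real study
# used seven variables; these synthetic ones merely demonstrate the method.
accident = rng.normal([5.0, 30.0], [2.0, 10.0], size=(46, 2))
nonaccident = rng.normal([9.0, 15.0], [2.0, 10.0], size=(34, 2))

X = np.vstack([accident, nonaccident])
y = np.array([1] * 46 + [0] * 34)  # 1 = accident-involved

# Fisher's two-group linear discriminant: w = S_pooled^-1 (m1 - m0)
m1, m0 = accident.mean(axis=0), nonaccident.mean(axis=0)
S = ((accident - m1).T @ (accident - m1)
     + (nonaccident - m0).T @ (nonaccident - m0)) / (len(X) - 2)
w = np.linalg.solve(S, m1 - m0)
threshold = w @ (m1 + m0) / 2  # score midway between the group means

pred = (X @ w > threshold).astype(int)
accuracy = (pred == y).mean()
print(f"correct classification: {accuracy:.1%}")
```

    Classifying the training sample back through the fitted function, as here, is what yields the "percent correctly allocated" figure reported in such studies.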

  19. Neutron Activation Analysis: Techniques and Applications

    SciTech Connect

    MacLellan, Ryan

    2011-04-27

    The role of neutron activation analysis in low-energy, low-background experiments is discussed relative to comparable methods. Radiochemical neutron activation analysis is introduced. The procedure of instrumental neutron activation analysis is detailed, especially with respect to the measurement of trace amounts of natural radioactivity. The determination of the reactor neutron spectrum parameters required for neutron activation analysis is also presented.

  20. Environmental risk management for radiological accidents: integrating risk assessment and decision analysis for remediation at different spatial scales.

    PubMed

    Yatsalo, Boris; Sullivan, Terrence; Didenko, Vladimir; Linkov, Igor

    2011-07-01

    The consequences of the Tohoku earthquake and subsequent tsunami in March 2011 caused a loss of power at the Fukushima Daiichi nuclear power plant, in Japan, and led to the release of radioactive materials into the environment. Although the full extent of the contamination is not currently known, the highly complex nature of the environmental contamination (radionuclides in water, soil, and agricultural produce) typical of nuclear accidents requires a detailed geospatial analysis of information with the ability to extrapolate across different scales with applications to risk assessment models and decision making support. This article briefly summarizes the approach used to inform risk-based land management and remediation decision making after the Chernobyl, Soviet Ukraine, accident in 1986. PMID:21608109

  1. Development of the simulation system "IMPACT" for analysis of nuclear power plant severe accidents

    SciTech Connect

    Naitoh, Masanori; Ujita, Hiroshi; Nagumo, Hiroichi

    1997-07-01

    The Nuclear Power Engineering Corporation (NUPEC) has initiated a long-term program to develop the simulation system "IMPACT" for analysis of hypothetical severe accidents in nuclear power plants. IMPACT employs advanced methods of physical modeling and numerical computation, and can simulate a wide spectrum of scenarios ranging from normal operation to hypothetical, beyond-design-basis-accident events. Designed as a large-scale system of interconnected, hierarchical modules, IMPACT's distinguishing features include mechanistic models based on first principles and high-speed simulation on parallel processing computers. The present plan is a ten-year program starting from 1993, consisting of an initial year of preparatory work followed by three technical phases: Phase-1 for development of a prototype system; Phase-2 for completion of the simulation system, incorporating new achievements from basic studies; and Phase-3 for refinement through extensive verification and validation against test results and available real plant data.

  2. Analysis of loss-of-coolant and loss-of-flow accidents in the first wall cooling system of NET/ITER

    NASA Astrophysics Data System (ADS)

    Komen, E. M. J.; Koning, H.

    1994-03-01

    This paper presents the thermal-hydraulic analysis of potential accidents in the first wall cooling system of the Next European Torus or the International Thermonuclear Experimental Reactor. Three ex-vessel loss-of-coolant accidents, two in-vessel loss-of-coolant accidents, and three loss-of-flow accidents have been analyzed using the thermal-hydraulic system analysis code RELAP5/MOD3. The analyses deal with the transient thermal-hydraulic behavior inside the cooling systems and the temperature development inside the nuclear components during these accidents. The analysis of the different accident scenarios has been performed without operation of emergency cooling systems. The results of the analyses indicate that a loss of forced coolant flow through the first wall rapidly causes dryout in the first wall cooling pipes. Following dryout, melting in the first wall starts within about 130 s in case of ongoing plasma burning. In case of large break LOCAs and ongoing plasma burning, melting in the first wall starts about 90 s after accident initiation.

  3. What can the drivers' own description from combined sources provide in an analysis of driver distraction and low vigilance in accident situations?

    PubMed

    Tivesten, Emma; Wiberg, Henrik

    2013-03-01

    Accident data play an important role in vehicle safety development. Accident data sources are generally limited in terms of how much information is provided on driver states and behaviour prior to an accident. However, the precise limitations vary between databases, due to differences in analysis focus and data collection procedures between organisations. If information about a specific accident can be retrieved from more than one data source it should be possible to combine the available information sets to facilitate data from one source to compensate for limitations in the other(s). To investigate the viability of such compensation, this study identified a set of accidents recorded in two different data sources. The first data source investigated was an accident mail survey and the second data source insurance claims documents consisting predominantly of insurance claims completed by the involved road users. An analysis of survey variables was compared to a case analysis including word data derived from the same survey and filed insurance claims documents. For each accident, the added value of having access to more than one source of information was assessed. To limit the scope of this study, three particular topics were investigated: available information on low vigilance (e.g., being drowsy, ill); secondary task distraction (e.g., talking with passengers, mobile phone use); and distraction related to the driving task (e.g., looking for approaching vehicles). Results suggest that for low vigilance and secondary task distraction, a combination of the mail survey and insurance claims documents provide more reliable and detailed pre-crash information than survey variables alone. However, driving related distraction appears to be more difficult to capture. 
In order to gain a better understanding of the above issues and how frequently they occur in accidents, the data sources and analysis methods suggested here may be combined with other investigation methods such

  4. Cochlear implant simulator for surgical technique analysis

    NASA Astrophysics Data System (ADS)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  5. DIANA: A multi-phase, multi-component hydrodynamic model for the analysis of severe accidents in heavy water reactors with multiple-tube assemblies

    SciTech Connect

    Tentner, A.M.

    1994-03-01

    A detailed hydrodynamic fuel relocation model has been developed for the analysis of severe accidents in heavy water reactors with multiple-tube assemblies. This model describes the fuel disruption and relocation inside a nuclear fuel assembly and is designated by the acronym DIANA. DIANA solves the transient hydrodynamic equations for all the moving materials in the core and treats all the relevant flow regimes. The numerical solution techniques and some of the physical models included in DIANA were developed taking advantage of the extensive experience accumulated in the development and validation of the LEVITATE [1] fuel relocation model of SAS4A [2, 3]. The model is designed to handle fuel and cladding relocation in both voided and partially voided channels. It is able to treat a wide range of thermal/hydraulic/neutronic conditions and the presence of various flow regimes at different axial locations within the same hydrodynamic channel.

  6. Severe Accident Sequence Analysis Program: Anticipated transient without scram simulations for Browns Ferry Nuclear Plant Unit 1

    SciTech Connect

    Dallman, R J; Gottula, R C; Holcomb, E E; Jouse, W C; Wagoner, S R; Wheatley, P D

    1987-05-01

    An analysis of five anticipated transients without scram (ATWS) was conducted at the Idaho National Engineering Laboratory (INEL). The five detailed deterministic simulations of postulated ATWS sequences were initiated from a main steamline isolation valve (MSIV) closure. The subject of the analysis was the Browns Ferry Nuclear Plant Unit 1, a boiling water reactor (BWR) of the BWR/4 product line with a Mark I containment. The simulations yielded insights into the possible consequences resulting from an MSIV closure ATWS. An evaluation of the effects of plant safety systems and operator actions on accident progression and mitigation is presented.

  7. Launch Vehicle Fire Accident Preliminary Analysis of a Liquid-Metal Cooled Thermionic Nuclear Reactor: TOPAZ-II

    NASA Astrophysics Data System (ADS)

    Hu, G.; Zhao, S.; Ruan, K.

    2012-01-01

    In this paper, a launch vehicle propellant fire accident analysis of the TOPAZ-II reactor has been performed with TATRHG(A), a thermionic reactor core analysis code developed by the authors. When a rocket explodes on a launch pad, its payload, TOPAZ-II, can be subjected to a severe thermal environment from the resulting fireball. The extreme temperatures associated with propellant fires can create a destructive environment in or near the fireball. Different kinds of propellants, liquid and solid, which lead to different fire temperatures, are considered. Preliminary analysis shows that solid propellant fires can melt the whole toxic beryllium radial reflector.

  8. Analysis of general aviation accidents during operations under instrument flight rules

    NASA Technical Reports Server (NTRS)

    Bennett, C. T.; Schwirzke, Martin; Harm, C.

    1990-01-01

    A report is presented to describe some of the errors that pilots make during flight under IFR. The data indicate that there is less risk during the approach and landing phase of IFR flights than in comparable VFR operations. Single-pilot IFR accident rates continue to be higher than two-pilot IFR accident rates, reflecting the high workload of IFR operations.

  9. Traffic accident in Cuiabá-MT: an analysis through the data mining technology.

    PubMed

    Galvão, Noemi Dreyer; de Fátima Marin, Heimar

    2010-01-01

    Road traffic accidents (ATT) are non-intentional events of substantial magnitude worldwide, mainly in urban centers. This article analyzes data on ATT victims recorded by the Justice and Public Security Secretariat (SEJUSP) and in hospital morbidity and mortality records for the city of Cuiabá-MT during 2006, using data mining technology. An observational, retrospective, and exploratory study of the secondary databases was carried out. The three selected databases were linked using a probabilistic method implemented in the free software RecLink, yielding 139 matched pairs of ATT victims. Data mining was then applied to this linked database with the software WEKA, using the Apriori algorithm. The result generated ten best rules, six of which met the established parameters and indicated useful, comprehensible knowledge for characterizing accident victims in Cuiabá. Finally, the association rules revealed peculiarities of road traffic accident victims in Cuiabá and highlight the need for preventive measures against collision accidents among males. PMID:20841739
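    The Apriori-style rule search used in the study can be sketched in a few lines. The transactions below are invented attribute sets, not the SEJUSP/SIH-SUS data, and the search brute-forces candidate itemsets rather than using Apriori's full candidate pruning.

```python
from itertools import combinations

# Toy "transactions" resembling linked accident-victim attributes
# (hypothetical values; the study mined real linked records with WEKA).
transactions = [
    {"male", "collision", "hospitalized"},
    {"male", "collision", "hospitalized"},
    {"male", "fall", "discharged"},
    {"female", "collision", "hospitalized"},
    {"male", "collision", "hospitalized"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# Level-wise search for frequent itemsets (support >= 0.4), sizes 1..3.
items = sorted({i for t in transactions for i in t})
frequent = []
for k in (1, 2, 3):
    frequent += [frozenset(c) for c in combinations(items, k)
                 if support(set(c)) >= 0.4]

# Association rules X -> Y with confidence >= 0.8.
rules = []
for s in frequent:
    if len(s) < 2:
        continue
    for k in range(1, len(s)):
        for lhs in map(frozenset, combinations(s, k)):
            conf = support(s) / support(lhs)
            if conf >= 0.8:
                rules.append((set(lhs), set(s - lhs), round(conf, 2)))

for lhs, rhs, conf in rules:
    print(lhs, "->", rhs, "confidence", conf)
```

    WEKA's Apriori additionally prunes candidates whose subsets are infrequent and ranks rules by metrics such as lift; this sketch shows only the support/confidence core.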

  10. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.

    PubMed

    Goldhaber, M K; Staub, S L; Tokuhata, G K

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357
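    The life table (actuarial) method named above can be sketched as follows. The weekly counts are invented for illustration and do not reproduce the TMI cohort; the structure (conditional weekly loss probabilities compounded into a cumulative incidence) is the point.

```python
# Actuarial life-table estimate of cumulative pregnancy-loss incidence.
# Hypothetical weekly counts of (losses, withdrawals) among pregnancies
# at risk, for gestational weeks 5 through 16.
weekly = [(8, 2), (6, 4), (5, 3), (4, 6), (3, 5), (2, 8),
          (2, 10), (1, 12), (1, 15), (1, 20), (0, 25), (0, 30)]

n = 200     # pregnancies at risk entering week 5 (four completed weeks)
surv = 1.0  # cumulative probability of no loss so far
for losses, withdrawals in weekly:
    at_risk = n - withdrawals / 2   # actuarial half-interval adjustment
    surv *= 1 - losses / at_risk    # conditional survival through this week
    n -= losses + withdrawals       # cohort remaining for the next week

incidence = 1 - surv
print(f"estimated incidence of loss through week 16: {incidence:.1%}")
```

    The half-interval adjustment treats withdrawals (e.g., women lost to follow-up) as being at risk for half the week, which is what distinguishes the actuarial life table from a naive losses/total ratio.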

  11. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis

    SciTech Connect

    Goldhaber, M.K.; Staub, S.L.; Tokuhata, G.K.

    1983-07-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss.

  12. Incorporation of phenomenological uncertainties in probabilistic safety analysis - application to LMFBR core disruptive accident energetics

    SciTech Connect

    Najafi, B; Theofanous, T G; Rumble, E T; Atefi, B

    1984-08-01

    This report describes a method for quantifying the frequency and consequence uncertainty distributions associated with core disruptive accidents (CDAs). The method was developed to estimate the frequency and magnitude of energy impacting the reactor vessel head of the Clinch River Breeder Reactor Plant (CRBRP) given the occurrence of hypothetical CDAs. The methodology is illustrated using the CRBR example.

  13. Risk Analysis for Public Consumption: Media Coverage of the Ginna Nuclear Reactor Accident.

    ERIC Educational Resources Information Center

    Dunwoody, Sharon; And Others

    Researchers have determined that the lay public makes risk judgments in ways that are very different from those advocated by scientists. Noting that these differences have caused considerable concern among those who promote and regulate health and safety, a study examined media coverage of the accident at the Robert E. Ginna nuclear power plant…

  14. Spontaneous abortions after the Three Mile Island nuclear accident: a life table analysis.

    PubMed Central

    Goldhaber, M K; Staub, S L; Tokuhata, G K

    1983-01-01

    A study was conducted to determine whether the incidence of spontaneous abortion was greater than expected near the Three Mile Island (TMI) nuclear power plant during the months following the March 28, 1979 accident. All persons living within five miles of TMI were registered shortly after the accident, and information on pregnancy at the time of the accident was collected. After one year, all pregnancy cases were followed up and outcomes ascertained. Using the life table method, it was found that, given pregnancies after four completed weeks of gestation counting from the first day of the last menstrual period, the estimated incidence of spontaneous abortion (miscarriage before completion of 16 weeks of gestation) was 15.1 per cent for women pregnant at the time of the TMI accident. Combining spontaneous abortions and stillbirths (delivery of a dead fetus after 16 weeks of gestation), the estimated incidence was 16.1 per cent for pregnancies after four completed weeks of gestation. Both incidences are comparable to baseline studies of fetal loss. PMID:6859357

  15. A comparison of different texture analysis techniques

    SciTech Connect

    Wright, S.I.; Kocks, U.F.

    1996-08-01

    With the advent of automated techniques for measuring individual crystallographic orientations using electron diffraction, there has been an increase in the use of local orientation measurements for measuring textures in polycrystalline materials. Several studies have focused on the number of single orientation measurements necessary to achieve the statistics of more conventional texture measurement techniques, such as pole figure measurement using x-ray and neutron diffraction. This investigation considers this question but also extends it to the nature of the differences between textures measured using individual orientation measurements and those measured using x-ray diffraction.

  16. A Longitudinal Analysis of the Causal Factors in Major Maritime Accidents in the USA and Canada (1996-2006)

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C, M.

    2007-01-01

    Accident reports provide important insights into the causes and contributory factors leading to particular adverse events. In contrast, this paper provides an analysis that extends across the findings of ten years of investigations into maritime accidents by both the US National Transportation Safety Board (NTSB) and the Canadian Transportation Safety Board (TSB). The purpose of the study was to assess the comparative frequency of a range of causal factors in the reporting of adverse events. In order to communicate our findings, we introduce J-H graphs as a means of representing the proportion of causes and contributory factors associated with human error, equipment failure, and other high-level classifications in longitudinal studies of accident reports. Our results suggest the proportion of causal and contributory factors attributable to direct human error may be very much smaller than has been suggested elsewhere in the human factors literature. In contrast, more attention should be paid to wider systemic issues, including the managerial and regulatory context of maritime operations.

  17. Validation and verification of RELAP5 for Advanced Neutron Source accident analysis: Part I, comparisons to ANSDM and PRSDYN codes

    SciTech Connect

    Chen, N.C.J.; Ibn-Khayat, M.; March-Leuba, J.A.; Wendel, M.W.

    1993-12-01

    As part of verification and validation, the Advanced Neutron Source reactor RELAP5 system model was benchmarked by the Advanced Neutron Source dynamic model (ANSDM) and PRSDYN models. RELAP5 is a one-dimensional, two-phase transient code, developed by the Idaho National Engineering Laboratory for reactor safety analysis. Both the ANSDM and PRSDYN models use a simplified single-phase equation set to predict transient thermal-hydraulic performance. Brief descriptions of each of the codes, models, and model limitations were included. Even though comparisons were limited to single-phase conditions, a broad spectrum of accidents was benchmarked: a small loss-of-coolant-accident (LOCA), a large LOCA, a station blackout, and a reactivity insertion accident. The overall conclusion is that the three models yield similar results if the input parameters are the same. However, ANSDM does not capture pressure wave propagation through the coolant system. This difference is significant in very rapid pipe break events. Recommendations are provided for further model improvements.

  18. Fukushima Daiichi Unit 1 Accident Progression Uncertainty Analysis and Implications for Decommissioning of Fukushima Reactors - Volume I.

    SciTech Connect

    Gauntt, Randall O.; Mattie, Patrick D.

    2016-01-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) of the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progression, as postulated by the limited plant data. This work focused on evaluating uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor considering state-of-knowledge uncertainties associated with MELCOR modeling of core damage progression, and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, core damage progression variability inherent in MELCOR modeling numerics is investigated.
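    The generic shape of such a sampling-based uncertainty analysis can be sketched briefly: draw uncertain inputs, evaluate a figure of merit per draw, and summarize the spread. The surrogate "model" and input distributions below are invented stand-ins, not MELCOR or its actual parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

# Hypothetical uncertain inputs (distributions invented for illustration).
oxidation_rate = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # multiplier
relocation_temp = rng.normal(2500.0, 150.0, size=n)           # K

# Stand-in algebraic surrogate for a figure of merit ("hydrogen produced");
# a real UA would run the systems code once per sample instead.
h2_kg = 600.0 * oxidation_rate * (relocation_temp / 2500.0)

p5, p50, p95 = np.percentile(h2_kg, [5, 50, 95])
print(f"H2 production: median {p50:.0f} kg, 90% interval [{p5:.0f}, {p95:.0f}] kg")
```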

  19. Recent trends in particle size analysis techniques

    NASA Technical Reports Server (NTRS)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed in terms of three operating principles, including descriptions of particle size and shape. Significant trends in recently developed particle-size analysis equipment show that compact electronic circuitry and rapid data-processing systems have been widely adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  20. Survey of immunoassay techniques for biological analysis

    SciTech Connect

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs.

  1. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O., Jr.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
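    A minimal sketch of the fitting procedure described, assuming invented peak-discharge data and using the Wilson-Hilferty approximation for the Pearson Type III frequency factor:

```python
import math
from statistics import NormalDist, mean, stdev

# Hypothetical annual peak discharges (cfs) for one gaging station.
peaks = [1200, 980, 1500, 2100, 870, 1320, 1760, 940, 1100, 2600,
         1450, 990, 1850, 1230, 1610, 760, 1380, 2010, 1120, 1540]

# Method of moments on the logarithms of the annual peaks.
logs = [math.log10(q) for q in peaks]
m, s, n = mean(logs), stdev(logs), len(logs)
g = (n / ((n - 1) * (n - 2))) * sum(((x - m) / s) ** 3 for x in logs)  # skew

def quantile(return_period):
    """Peak discharge with the given return period in years."""
    z = NormalDist().inv_cdf(1 - 1 / return_period)
    # Wilson-Hilferty approximation to the Pearson Type III frequency factor.
    k = (2 / g) * (((1 + g * z / 6 - g ** 2 / 36) ** 3) - 1) if g else z
    return 10 ** (m + k * s)

print(f"100-year flood estimate: {quantile(100):.0f} cfs")
```

    The federal guidelines this record refers to also weight the station skew with a regional skew and treat outliers; those refinements are omitted here.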

  2. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered in the analysis of space-flight data is examined. Because only a small amount of data is available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on considerations needed to choose the most appropriate test for a given type of analysis.
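    One representative small-sample procedure is the pooled two-sample t-test, which rests on exactly the kind of assumptions (normality, equal variances) the report stresses checking. The data below are illustrative.

```python
import math

# Two small illustrative samples (hypothetical measurements).
a = [10.2, 9.8, 11.1, 10.5, 9.9, 10.8]
b = [11.4, 11.0, 12.1, 11.7, 10.9, 11.5]

def mean(x):
    return sum(x) / len(x)

def var(x):
    m = mean(x)
    return sum((v - m) ** 2 for v in x) / (len(x) - 1)  # sample variance

na, nb = len(a), len(b)
sp2 = ((na - 1) * var(a) + (nb - 1) * var(b)) / (na + nb - 2)  # pooled variance
t = (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))
df = na + nb - 2

# Two-sided critical value for alpha = 0.05, df = 10, from standard t tables.
t_crit = 2.228
print(f"t = {t:.2f}, df = {df}, reject H0: {abs(t) > t_crit}")
```

    With only six observations per group, the test's validity hinges on the normality assumption; the report's emphasis on checking assumptions before choosing a test applies directly here.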

  3. GAS CURTAIN EXPERIMENTAL TECHNIQUE AND ANALYSIS METHODOLOGIES

    SciTech Connect

    J. R. KAMM; ET AL

    2001-01-01

    The qualitative and quantitative relationship of numerical simulation to the physical phenomena being modeled is of paramount importance in computational physics. If the phenomena are dominated by irregular (i.e., nonsmooth or disordered) behavior, then pointwise comparisons cannot be made and statistical measures are required. The problem we consider is the gas curtain Richtmyer-Meshkov (RM) instability experiments of Rightley et al. (13), which exhibit complicated, disordered motion. We examine four spectral analysis methods for quantifying the experimental data and computed results: Fourier analysis, structure functions, fractal analysis, and continuous wavelet transforms. We investigate the applicability of these methods for quantifying the details of fluid mixing.
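    The first of the four measures named, the Fourier power spectrum, can be sketched on a synthetic 1-D signal; a sine plus noise stands in for a digitized interface profile, whereas the real analysis operated on experimental images.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 512
x = np.arange(n) / n                        # unit-length sample positions
# Synthetic "interface" signal: an 8-cycle mode buried in noise.
signal = np.sin(2 * np.pi * 8 * x) + 0.3 * rng.standard_normal(n)

freqs = np.fft.rfftfreq(n, d=1 / n)         # integer cycles per unit length
power = np.abs(np.fft.rfft(signal)) ** 2    # unnormalized power spectrum

dominant = freqs[np.argmax(power[1:]) + 1]  # skip the DC bin
print(f"dominant mode: {dominant:.0f} cycles")
```

    Statistical measures like this one allow simulation and experiment to be compared mode by mode even when pointwise agreement is impossible, which is the paper's motivating point.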

  4. Thermal analysis of an irradiated-fuel concrete integrated container under normal and fire-accident conditions. Report No. 89-242-K

    SciTech Connect

    Taralis, D.

    1990-01-01

    This study describes the development of a special-purpose three-dimensional heat transfer computer code for the thermal analysis of a Concrete Integrated Container (CIC) for the transportation of 10-year-cooled fuel under normal conditions and hypothetical fire accident conditions. Results are given for comparisons of theoretical predictions with existing half-scale CIC experimental results, and for representative analytical results for a full-scale CIC under normal and fire accident conditions.

  5. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.
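
The statistical classification in terms of moments mentioned above can be illustrated with a short sketch: excess kurtosis, a fourth-moment statistic, rises sharply when impulsive defect signatures appear in an otherwise Gaussian vibration signal. The signal below is synthetic and the defect model is invented for illustration; it is not SSME measurement data.

```python
import random

def kurtosis(x):
    """Sample excess kurtosis: a spiky (impulsive) signal scores well
    above 0, while a healthy Gaussian-like signal scores near 0."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / m2 ** 2 - 3.0

random.seed(1)
healthy = [random.gauss(0.0, 1.0) for _ in range(5000)]
# Inject periodic impacts to mimic an incipient bearing defect
faulty = [v + (8.0 if i % 100 == 0 else 0.0) for i, v in enumerate(healthy)]
# kurtosis(faulty) is far above kurtosis(healthy)
```

Time-domain moments like this are cheap to compute online, which is why they are attractive as incipient-failure indicators alongside full spectral methods.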

  6. The Network Protocol Analysis Technique in Snort

    NASA Astrophysics Data System (ADS)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. A sniffer intercepts packets and reassembles the binary content of the original messages. To recover the information they contain, the captured data must be decoded according to the TCP/IP protocol stack specifications, restoring the packet format and content at each protocol layer, up to the actual data transferred at the application tier.
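
Walking the stack layer by layer, as described above, starts with decoding fixed-format headers from raw bytes. The sketch below parses the fixed 20-byte IPv4 header with Python's `struct` module; the packet itself is hand-built for the example rather than captured by Snort.

```python
import struct

def parse_ipv4_header(packet: bytes):
    """Decode the fixed 20-byte IPv4 header from raw bytes, the first
    decoding step a sniffer performs when walking the protocol stack."""
    (version_ihl, tos, total_len, ident, flags_frag,
     ttl, proto, checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", packet[:20])
    return {
        "version": version_ihl >> 4,
        "header_len": (version_ihl & 0x0F) * 4,  # in bytes
        "total_len": total_len,
        "ttl": ttl,
        "protocol": proto,                       # 6 = TCP, 17 = UDP
        "src": ".".join(str(b) for b in src),
        "dst": ".".join(str(b) for b in dst),
    }

# A hand-built header: IPv4, TTL 64, TCP, 10.0.0.1 -> 10.0.0.2
raw = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                  bytes([10, 0, 0, 1]), bytes([10, 0, 0, 2]))
hdr = parse_ipv4_header(raw)
# hdr["version"] == 4, hdr["protocol"] == 6, hdr["src"] == "10.0.0.1"
```

The protocol field of each layer tells the decoder which parser to apply next (here, 6 would hand the payload to a TCP header parser), which is how the restoration proceeds layer by layer.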

  7. Analysis of potential for jet-impingement erosion from leaking steam generator tubes during severe accidents.

    SciTech Connect

    Majumdar, S.; Diercks, D. R.; Shack, W. J.; Energy Technology

    2002-05-01

    This report summarizes analytical evaluation of crack-opening areas and leak rates of superheated steam through flaws in steam generator tubes and erosion of neighboring tubes due to jet impingement of superheated steam with entrained particles from core debris created during severe accidents. An analytical model for calculating crack-opening area as a function of time and temperature was validated with tests on tubes with machined flaws. A three-dimensional computational fluid dynamics code was used to calculate the jet velocity impinging on neighboring tubes as a function of tube spacing and crack-opening area. Erosion tests were conducted in a high-temperature, high-velocity erosion rig at the University of Cincinnati, using micrometer-sized nickel particles mixed in with high-temperature gas from a burner. The erosion results, together with analytical models, were used to estimate the erosive effects of superheated steam with entrained aerosols from the core during severe accidents.

  8. Loss of DHR sequences at Browns Ferry Unit One - accident-sequence analysis

    SciTech Connect

    Cook, D.H.; Grene, S.R.; Harrington, R.M.; Hodge, S.A.

    1983-05-01

    This study describes the predicted response of Unit One at the Browns Ferry Nuclear Plant to a postulated loss of decay heat removal (DHR) capability following scram from full power with the power conversion system unavailable. In accident sequences without DHR capability, the residual heat removal (RHR) system functions of pressure suppression pool cooling and reactor vessel shutdown cooling are unavailable. Consequently, all decay heat energy is stored in the pressure suppression pool with a concomitant increase in pool temperature and primary containment pressure. With the assumption that DHR capability is not regained during the lengthy course of this accident sequence, the containment ultimately fails by overpressurization. Although unlikely, this catastrophic failure might lead to loss of the ability to inject cooling water into the reactor vessel, causing subsequent core uncovery and meltdown. The timing of these events and the effective mitigating actions that might be taken by the operator are discussed in this report.

  9. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... published a proposed rule in the Federal Register at 77 FR 40552 on July 10, 2012, to clarify and pinpoint a... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify and give a precise reference in the use of a price analysis technique in order to establish a...

  10. Soil Analysis using the semi-parametric NAA technique

    SciTech Connect

    Zamboni, C. B.; Silveira, M. A. G.; Medina, N. H.

    2007-10-26

    The semi-parametric Neutron Activation Analysis technique, using Au as a flux monitor, was applied to measure element concentrations of Br, Ca, Cl, K, Mn and Na for soil characterization. The results were compared with those obtained using the Instrumental Neutron Activation Analysis technique and were found to be compatible. The viability, advantages, and limitations of using these two analytic methodologies are discussed.

  11. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  12. 3D analysis of the reactivity insertion accident in VVER-1000

    SciTech Connect

    Abdullayev, A. M.; Zhukov, A. I.; Slyeptsov, S. M.

    2012-07-01

    Fuel parameters such as peak enthalpy and temperature during a rod ejection accident are calculated. The calculations are performed by the 3D neutron kinetics code NESTLE and the 3D thermal-hydraulic code VIPRE-W. Both hot zero power and hot full power cases were studied for an equilibrium cycle with Westinghouse hex fuel in VVER-1000. It is shown that the use of 3D methodology can significantly increase safety margins for current criteria and meet future criteria. (authors)

  13. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  14. SiC MODIFICATIONS TO MELCOR FOR SEVERE ACCIDENT ANALYSIS APPLICATIONS

    SciTech Connect

    Brad J. Merrill; Shannon M Bragg-Sitton

    2013-09-01

    The Department of Energy (DOE) Office of Nuclear Energy (NE) Light Water Reactor (LWR) Sustainability Program encompasses strategic research focused on improving reactor core economics and safety margins through the development of an advanced fuel cladding system. The Fuels Pathway within this program focuses on fuel system components outside of the fuel pellet, allowing for alteration of the existing zirconium-based clad system through coatings, addition of ceramic sleeves, or complete replacement (e.g., fully ceramic cladding). The DOE-NE Fuel Cycle Research & Development (FCRD) Advanced Fuels Campaign (AFC) is also conducting research on materials for advanced, accident tolerant fuels and cladding for application in operating LWRs. To aid in this assessment, a silicon carbide (SiC) version of the MELCOR code was developed by substituting SiC in place of Zircaloy in MELCOR's reactor core oxidation and material property routines. The purpose of this development effort is to provide a numerical capability for estimating the safety advantages of replacing Zr-alloy components in LWRs with SiC components. This modified version of the MELCOR code was applied to the Three Mile Island (TMI-2) plant accident. While the results are considered preliminary, SiC cladding showed a dramatic safety advantage over Zircaloy cladding during this accident.

  15. Analysis of Radionuclide Releases from the Fukushima Dai-ichi Nuclear Power Plant Accident Part II

    NASA Astrophysics Data System (ADS)

    Achim, Pascal; Monfort, Marguerite; Le Petit, Gilbert; Gross, Philippe; Douysset, Guilhem; Taffary, Thomas; Blanchard, Xavier; Moulin, Christophe

    2014-03-01

    The present part of the publication (Part II) deals with long-range dispersion of radionuclides emitted into the atmosphere during the Fukushima Dai-ichi accident that occurred after the March 11, 2011 tsunami. The first part (Part I) is dedicated to the accident features, relying on radionuclide detections performed by monitoring stations of the Comprehensive Nuclear-Test-Ban Treaty Organization network. In this study, the emissions of the three fission products Cs-137, I-131 and Xe-133 are investigated. Regarding Xe-133, the total release is estimated to be of the order of 6 × 10^18 Bq emitted during the explosions of units 1, 2 and 3. The total source term is estimated against a core inventory of about 8 × 10^18 Bq at the time of reactor shutdown. This result suggests that at least 80 % of the core inventory was released into the atmosphere and indicates a broad meltdown of the reactor cores. Total atmospheric releases of Cs-137 and I-131 aerosols are estimated to be 10^16 and 10^17 Bq, respectively. Neglecting gas/particulate conversion phenomena, the total release of I-131 (gas + aerosol) is estimated to be 4 × 10^17 Bq. Atmospheric transport simulations suggest that the main air emissions occurred during the events of March 14, 2011 (UTC) and that no major release occurred after March 23. The radioactivity emitted into the atmosphere could represent 10 % of the Chernobyl accident releases for I-131 and Cs-137.

  16. Visualization techniques for malware behavior analysis

    NASA Astrophysics Data System (ADS)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via the Internet is a major security threat, so studying malware behavior is important for identifying and classifying it. Using SSDT hooking, we can obtain malware behavior by running it in a controlled environment and capturing interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be compared with those of other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.
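
The conversion of a captured event chain into an activity graph can be sketched very simply: nodes are processes and resources, and directed edges are the observed actions. The trace below is an invented, hypothetical example of sandbox output, not data from the paper.

```python
from collections import defaultdict

def build_activity_graph(events):
    """Turn a captured event chain (subject, action, target) into a
    directed graph stored as an adjacency map: subject -> [(action, target)]."""
    graph = defaultdict(list)
    for subject, action, target in events:
        graph[subject].append((action, target))
    return dict(graph)

# Hypothetical trace of a sample run in a sandbox
trace = [
    ("mal.exe", "create_file", r"C:\tmp\payload.dll"),
    ("mal.exe", "write_registry",
     r"HKLM\Software\Microsoft\Windows\CurrentVersion\Run"),
    ("mal.exe", "spawn_process", "svchost.exe"),
    ("svchost.exe", "connect", "198.51.100.7:443"),
]
graph = build_activity_graph(trace)
# graph["mal.exe"] has three outgoing edges; "svchost.exe" has one
```

Such a graph can then be rendered with any graph-drawing tool, and graphs from different samples compared to group behaviorally similar malware.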

  17. Nuclear reaction techniques in materials analysis

    SciTech Connect

    Amsel, G.; Lanford, W.A.

    1984-01-01

    This article discusses nuclear reaction microanalysis (NRA). In NRA, data accumulated in the frame of low-energy nuclear physics is put to advantage for analytical purposes. Unknown targets are bombarded and known reactions are observed. For NRA, the accelerator, detectors, spectrum recording and interpretation must be reliable, simple, and fast. Other MeV ion-beam analytical techniques are described which are complementary to NRA, such as Rutherford backscattering (RBS), proton-induced x-ray emission (PIXE), and the more recent method of elastic recoil detection (ERD). Applications for NRA range from solid-state physics and electrochemistry, semiconductor technology, metallurgy, materials science, and surface science to biology and archeology.

  18. An analysis of thermionic space nuclear reactor power system: I. Effect of disassembling radial reflector, following a reactivity initiated accident

    SciTech Connect

    El-Genk, M.S.; Paramonov, D.

    1993-01-10

    An analysis is performed to determine the effect of disassembling the radial reflector of the TOPAZ-II reactor following a hypothetical severe Reactivity Initiated Accident (RIA). Such an RIA is assumed to occur during system start-up in orbit due to a malfunction of the drive mechanism of the control drums, causing the drums to rotate the full 180° outward at their maximum speed of 1.4°/s. Results indicate that disassembling only three of the twelve radial reflector panels would successfully shut down the reactor, with little overheating of the fuel and the moderator.

  19. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line...

  20. Injection Locking Techniques for Spectrum Analysis

    SciTech Connect

    Gathma, Timothy D.; Buckwalter, James F.

    2011-04-19

    Wideband spectrum analysis supports future communication systems that reconfigure and adapt to the capacity of the spectral environment. While test equipment manufacturers offer wideband spectrum analyzers with excellent sensitivity and resolution, these spectrum analyzers typically cannot offer acceptable size, weight, and power (SWAP). CMOS integrated circuits offer the potential to fully integrate spectrum analysis capability with analog front-end circuitry and digital signal processing on a single chip. Unfortunately, CMOS lacks the high-Q passives and wideband resonator tunability that are necessary for heterodyne implementations of spectrum analyzers. As an alternative to the heterodyne receiver architectures, two nonlinear methods for performing wideband, low-power spectrum analysis are presented. The first method involves injecting the spectrum of interest into an array of injection-locked oscillators. The second method employs the closed loop dynamics of both injection locking and phase locking to independently estimate the injected frequency and power.

  1. Uncertainty analysis technique for OMEGA Dante measurementsa)

    NASA Astrophysics Data System (ADS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
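
The Monte Carlo parameter variation technique described above can be sketched generically: perturb each channel voltage by its one-sigma Gaussian calibration error, re-run the unfold on each perturbed set, and take the spread of the results as the error bar. Note that the `unfold` function below is a trivial stand-in for illustration only; the real Dante unfold algorithm is far more involved, and the voltages and error levels are invented.

```python
import random
import statistics

def monte_carlo_flux(unfold, voltages, sigmas, n_trials=1000, seed=0):
    """Monte Carlo parameter variation: perturb each channel voltage by
    its one-sigma Gaussian error, re-run the unfold, and report the mean
    and standard deviation of the resulting fluxes."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        perturbed = [v + rng.gauss(0.0, s) for v, s in zip(voltages, sigmas)]
        trials.append(unfold(perturbed))
    return statistics.fmean(trials), statistics.stdev(trials)

# Stand-in unfold: total flux proportional to the summed channel voltages
unfold = lambda volts: 2.5 * sum(volts)
voltages = [1.0, 0.8, 0.6, 0.4]
sigmas = [0.05 * v for v in voltages]  # 5% calibration error per channel
mean_flux, flux_err = monte_carlo_flux(unfold, voltages, sigmas)
# mean_flux is close to 7.0, with flux_err as its one-sigma error bar
```

The attraction of the method is that it propagates calibration and unfold uncertainties together, without requiring the unfold to be differentiable.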

  2. Uncertainty Analysis Technique for OMEGA Dante Measurements

    SciTech Connect

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  3. Uncertainty analysis technique for OMEGA Dante measurements

    SciTech Connect

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-15

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  4. Automated fluid analysis apparatus and techniques

    DOEpatents

    Szecsody, James E.

    2004-03-16

    An automated device that couples a pair of differently sized sample loops with a syringe pump and a source of degassed water. A fluid sample is mounted at an inlet port and delivered to the sample loops. A selected sample from the sample loops is diluted in the syringe pump with the degassed water and fed to a flow through detector for analysis. The sample inlet is also directly connected to the syringe pump to selectively perform analysis without dilution. The device is airtight and used to detect oxygen-sensitive species, such as dithionite in groundwater following a remedial injection to treat soil contamination.

  5. Permethylation Linkage Analysis Techniques for Residual Carbohydrates

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Permethylation analysis is the classic approach to establishing the position of glycosidic linkages between sugar residues. Typically, the carbohydrate is derivatized to form acid-stable methyl ethers, hydrolyzed, peracetylated, and analyzed by gas chromatography-mass spectrometry (GC-MS). The pos...

  6. Comparison of Hydrogen Sulfide Analysis Techniques

    ERIC Educational Resources Information Center

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  7. Cognitive task analysis: Techniques applied to airborne weapons training

    SciTech Connect

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E.; Carlow Associates, Inc., Fairfax, VA; Martin Marietta Energy Systems, Inc., Oak Ridge, TN; Tennessee Univ., Knoxville, TN

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  8. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    NASA Astrophysics Data System (ADS)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI, since launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established techniques and more speculative approaches; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  9. Advanced Techniques for Root Cause Analysis

    2000-09-19

    Five items make up this package; they can also be used individually. The Chronological Safety Management Template utilizes a linear adaptation of the Integrated Safety Management System laid out in the form of a template that greatly enhances the ability of the analyst to perform the first step of any investigation, which is to gather all pertinent facts and identify causal factors. The Problem Analysis Tree is a simple three (3) level problem analysis tree which is easier for organizations outside of WSRC to use. Another part is the Systemic Root Cause Tree. One of the most basic and unique features of Expanded Root Cause Analysis is the Systemic Root Cause portion of the Expanded Root Cause Pyramid. The Systemic Root Causes are even more basic than the Programmatic Root Causes and represent Root Causes that cut across multiple (if not all) programs in an organization. The Systemic Root Cause portion contains 51 causes embedded at the bottom level of a three level Systemic Root Cause Tree that is divided into logical, organizationally based categories to assist the analyst. The Computer Aided Root Cause Analysis allows the analyst at each level of the Pyramid to a) obtain a brief description of the cause that is being considered, b) record a decision that the item is applicable, c) proceed to the next level of the Pyramid to see only those items at the next level of the tree that are relevant to the particular cause that has been chosen, and d) at the end of the process automatically print out a summary report of the incident, the causal factors as they relate to the safety management system, the probable causes, apparent causes, Programmatic Root Causes and Systemic Root Causes for each causal factor, and the associated corrective action.

  10. Measurement techniques in animal locomotion analysis.

    PubMed

    Schamhardt, H C; van den Bogert, A J; Hartman, W

    1993-01-01

    Animal performance can be determined by subjective observations or objective measurements. Numerical data are superior to the results of subjective observations only when they come from measurements carried out to test a well-defined hypothesis or to answer a clear, precisely formulated question. In the analysis of kinematics, a careful evaluation of the set-up of the measurement equipment and the resulting accuracy of the data is required. Measurements in three dimensions (3D) are theoretically better than those in 2D. In practice, however, collection, analysis, interpretation and presentation of 3D data are so much more complicated that 2D analysis frequently appears to be more useful. The minimal size of markers necessary to obtain a certain accuracy in kinematic data is usually too big for practical use; smaller markers impair accuracy. Reduction of measurement noise is obligatory when time derivatives are to be calculated. Skin movement artefacts cannot be removed by data smoothing. Forces occurring between the digits and the ground can be determined using a force plate or an instrumented shoe. A force plate is accurate, but repeated trials are necessary. With a force shoe, each ground contact yields useful data; however, the shoe itself may affect locomotion. Surface strains on long bones can be recorded relatively easily. Determination of loading forces from surface strains is complicated but can be carried out using multiple strain gauges and a post-mortem calibration test. Strain in tendons is difficult to measure due to problems in defining a 'zero' or reference length. (ABSTRACT TRUNCATED AT 250 WORDS) PMID:8470454
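
The point that noise reduction is obligatory before computing time derivatives can be demonstrated with a short sketch: central differencing of a noisy position trace amplifies measurement noise by roughly 1/dt, while a simple moving average applied first recovers a much cleaner velocity. The signal and noise level are synthetic, invented for the example.

```python
import math
import random

def central_diff(y, dt):
    """Central finite difference; differentiation amplifies
    high-frequency measurement noise by a factor ~1/dt."""
    return [(y[i + 1] - y[i - 1]) / (2 * dt) for i in range(1, len(y) - 1)]

def moving_average(y, k=5):
    """Simple k-point moving average applied before differentiating."""
    h = k // 2
    out = []
    for i in range(len(y)):
        window = y[max(0, i - h): i + h + 1]
        out.append(sum(window) / len(window))
    return out

dt = 0.01
random.seed(2)
t = [i * dt for i in range(500)]
true_pos = [math.sin(x) for x in t]
noisy = [p + random.gauss(0.0, 0.002) for p in true_pos]

raw_vel = central_diff(noisy, dt)
smooth_vel = central_diff(moving_average(noisy), dt)
true_vel = [math.cos(x) for x in t[1:-1]]

rms = lambda a, b: math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)) / len(a))
# rms(smooth_vel, true_vel) is markedly smaller than rms(raw_vel, true_vel)
```

Note that, as the abstract stresses, this only removes measurement noise: systematic skin-movement artefacts are correlated with the motion itself and survive any amount of smoothing.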

  11. The effects of aircraft certification rules on general aviation accidents

    NASA Astrophysics Data System (ADS)

    Anderson, Carolina Lenz

    The purpose of this study was to analyze the frequency of general aviation airplane accidents and accident rates on the basis of aircraft certification to determine whether or not differences in aircraft certification rules had an influence on accidents. In addition, the narrative cause descriptions contained within the accident reports were analyzed to determine whether there were differences in the qualitative data for the different certification categories. The certification categories examined were: Federal Aviation Regulations Part 23, Civil Air Regulations 3, Light Sport Aircraft, and Experimental-Amateur Built. The accident causes examined were those classified as: Loss of Control, Controlled Flight into Terrain, Engine Failure, and Structural Failure. Airworthiness certification categories represent a wide diversity of government oversight. Part 23 rules have evolved from the initial set of simpler design standards and have progressed into a comprehensive and strict set of rules to address the safety issues of the more complex airplanes within the category. Experimental-Amateur Built airplanes have the least amount of government oversight and are the fastest growing segment. The Light Sport Aircraft category is a more recent certification category that utilizes consensus standards in the approval process. Civil Air Regulations 3 airplanes were designed and manufactured under simpler rules, but modifying these airplanes has become lengthy and expensive. The study was conducted using a mixed-methods approach involving both quantitative and qualitative elements. A Chi-Square test was used for a quantitative analysis of the accident frequency among aircraft certification categories. Accident rate analysis of the accidents among aircraft certification categories involved an ANCOVA test. The qualitative component involved the use of text mining techniques for the analysis of the narrative cause descriptions contained within the accident reports. The Chi
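
The Chi-Square frequency comparison used above can be sketched with a minimal stdlib implementation of the Pearson statistic on a contingency table. The counts below are invented for illustration and are not the study's accident data.

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for a contingency table:
    rows = certification categories, columns = accident cause classes."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand  # expected under independence
            stat += (obs - exp) ** 2 / exp
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Hypothetical counts: two categories x two causes
table = [[30, 10],
         [20, 20]]
stat, df = chi_square_stat(table)
# df == 1; stat is about 5.33, exceeding the 3.84 critical value at alpha = 0.05
```

A significant statistic only says the cause distribution differs across categories; as in the study, the narrative analysis is needed to say how.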

  12. Analysis of Japanese Radionuclide Monitoring Data of Food Before and After the Fukushima Nuclear Accident

    PubMed Central

    2015-01-01

    In an unprecedented food monitoring campaign for radionuclides, the Japanese government took action to secure food safety after the Fukushima nuclear accident (Mar. 11, 2011). In this work we analyze a part of the immense data set, in particular radiocesium contaminations in food from the first year after the accident. Activity concentrations in vegetables peaked immediately after the campaign had commenced, but they decreased quickly, so that by early summer 2011 only a few samples exceeded the regulatory limits. Later, accumulating mushrooms and dried produce led to several exceedances of the limits again. Monitoring of meat started with significant delay, especially outside Fukushima prefecture. After a buildup period, contamination levels of meat peaked by July 2011 (beef). Levels then decreased quickly, but peaked again in September 2011, which was primarily due to boar meat (a known accumulator of radiocesium). Tap water was less contaminated; any restrictions for tap water were canceled by April 1, 2011. Pre-Fukushima Cs-137 and Sr-90 levels (resulting from atmospheric nuclear explosions) in food were typically lower than 0.5 Bq/kg, whereby meat was typically higher in Cs-137 and vegetarian produce was usually higher in Sr-90. The correlation of background radiostrontium and radiocesium indicated that the regulatory assumption after the Fukushima accident of a maximum activity of Sr-90 being 10% of the respective Cs-137 concentrations may soon be at risk, as the Sr-90/Cs-137 ratio increases with time. This should be taken into account for the current Japanese food policy as the current regulation will soon underestimate the Sr-90 content of Japanese foods. PMID:25621976

  13. Speckle-adaptive VISAR fringe analysis technique

    NASA Astrophysics Data System (ADS)

    Erskine, David

    2015-06-01

    A line-VISAR (velocity interferometer) is an important diagnostic in shock physics, simultaneously measuring many fringe histories of adjacent portions of a target splayed along a line on a target, with fringes recorded vs time and space by a streak camera. Due to laser illumination speckle (spatial intensity variation), target surface unevenness, or rapid spatial variation of target physics, conventional fringe analysis algorithms which do not properly model these variations can suffer from inferred velocity (fringe phase) errors. A speckle-adaptive algorithm has been developed which senses the interferometer and illumination parameters for each individual row (spatial position Y) of the 2d interferogram, so that the interferogram can be compensated for Y-dependent nonfringing intensity, fringe visibility, and nonlinear phase distribution. In numerical simulations and on actual data we have found this individual row-by-row modeling improves the accuracy of the result, compared to a conventional column-by-column analysis approach. Prepared by LLNL under Contract DE-AC52-07NA27344.

  14. A technique for human error analysis (ATHEANA)

    SciTech Connect

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed that describes the links between performance shaping factors and resulting unsafe actions.

  15. Analysis of a small break loss-of-coolant accident of pressurized water reactor by APROS

    SciTech Connect

    Al-Falahi, A.; Haennine, M.; Porkholm, K.

    1995-09-01

    The purpose of this paper is to study the capability of the APROS (Advanced PROcess Simulator) code to simulate the real-plant thermal-hydraulic transient of a Small Break Loss-Of-Coolant Accident (SBLOCA) at the Loss-Of-Fluid Test (LOFT) facility. The LOFT is a scaled model of a Pressurized Water Reactor (PWR). This work is part of a larger validation of the APROS thermal-hydraulic models. The results of the SBLOCA transient calculated by APROS showed reasonable agreement with the measured data.

  16. SACO-1: a fast-running LMFBR accident-analysis code

    SciTech Connect

    Mueller, C.J.; Cahalan, J.E.; Vaurio, J.K.

    1980-01-01

    SACO is a fast-running computer code that simulates hypothetical accidents in liquid-metal fast breeder reactors to the point of permanent subcriticality or to the initiation of a prompt-critical excursion. In the tradition of the SAS codes, each subassembly is modeled by a representative fuel pin with three distinct axial regions to simulate the blanket and core regions. However, analytic and integral models are used wherever possible to cut down the computing time and storage requirements. The physical models and basic equations are described in detail. Comparisons of SACO results to analogous SAS3D results comprise the qualifications of SACO and are illustrated and discussed.

  17. Analysis of reactivity-insertion accidents in the TREAT Upgrade reactor

    SciTech Connect

    Rudolph, R.R.; Bhattacharyya, S.K.

    1983-01-01

    The expansion of the experimental capabilities of the TREAT Upgrade (TU) reactor also tends to increase the potential risks associated with off-normal reactivity insertion incidents compared to the TREAT reactor. To provide adequate protection for the public and the facility, while meeting experimenters' requirements, a specialized Reactor Trip System (RTS) with energy-dependent scram trips on reactor power and period has been developed. With this protection strategy, the consequences of reactivity insertion accidents in the TU reactor have been analyzed using a general methodology developed earlier. Results of these analyses are presented.

  18. Resolve! Version 2.5: Flammable Gas Accident Analysis Tool Acceptance Test Plan and Test Results

    SciTech Connect

    LAVENDER, J.C.

    2000-10-17

    RESOLVE! Version 2.5 is designed to quantify the risk and uncertainty of combustion accidents in double-shell tanks (DSTs) and single-shell tanks (SSTs). The purpose of the acceptance testing is to ensure that all of the options and features of the computer code run; to verify that the calculated results are consistent with each other; and to evaluate the effects of changes to the parameter values on the frequency and consequence trends associated with flammable gas deflagrations or detonations.

  19. Investigation of electroforming techniques, literature analysis report

    NASA Technical Reports Server (NTRS)

    Malone, G. A.

    1975-01-01

    A literature analysis is presented of reports, specifications, and documented experiences with the use of electroforming to produce copper and nickel structures for aerospace and other engineering applications. The literature period covered is from 1948 to 1974. Specific effort was made to correlate mechanical property data for the electrodeposited material with known electroforming solution compositions and operating conditions. From this survey, electrolytes are suggested for selection to electroform copper and nickel outer shells on regeneratively cooled thrust chamber liners, and other devices subject to thermal and pressure exposure, based on mechanical properties obtainable, performance under various thermal environments, and ease of process control for product reproducibility. Processes of potential value in obtaining sound bonds between electrodeposited copper and nickel and copper alloy substrates are also discussed.

  20. Analysis of an AP600 intermediate-size loss-of-coolant accident

    SciTech Connect

    Boyack, B.E.; Lime, J.F.

    1995-04-01

    A postulated double-ended guillotine break of an AP600 direct-vessel-injection line has been analyzed. This event is characterized as an intermediate-break loss-of-coolant accident. Most of the insights regarding the response of the AP600 safety systems to the postulated accident are derived from calculations performed with the TRAC-PF1/MOD2 code. However, complementary insights derived from a scaled experiment conducted in the ROSA facility, as well as insights based upon calculations by other codes, are also presented. Based upon the calculated and experimental results, the AP600 will not experience a core heat up and will reach a safe shutdown state using only safety-class equipment. Only the early part of the long-term cooling period initiated by In-containment Refueling Water Storage Tank injection was evaluated. Thus, the observation that the core is continuously cooled should be verified for the later phase of the long-term cooling period when sump injection and containment cooling processes are important.

  1. Analysis of an AP600 intermediate-size loss-of-coolant accident

    SciTech Connect

    Boyack, B.E.; Lime, J.F.

    1995-09-01

    A postulated double-ended guillotine break of an AP600 direct-vessel-injection line has been analyzed. This event is characterized as an intermediate-break loss-of-coolant accident. Most of the insights regarding the response of the AP600 safety systems to the postulated accident are derived from calculations performed with the TRAC-PF1/MOD2 code. However, complementary insights derived from a scaled experiment conducted in the ROSA facility, as well as insights based upon calculations by other codes, are also presented. Based upon the calculated and experimental results, the AP600 will not experience a core heat up and will reach a safe shutdown state using only safety-class equipment. Only the early part of the long-term cooling period initiated by In-containment Refueling Water Storage Tank injection was evaluated. Thus, the observation that the core is continuously cooled should be verified for the later phase of the long-term cooling period when sump injection and containment cooling processes are important.

  2. Analysis of injuries among pilots involved in fatal general aviation airplane accidents.

    PubMed

    Wiegmann, Douglas A; Taneja, Narinder

    2003-07-01

    The purpose of this study was to analyze patterns of injuries sustained by pilots involved in fatal general aviation (GA) airplane accidents. Detailed information on the pattern and nature of injuries was retrieved from the Federal Aviation Administration's autopsy database for pilots involved in fatal GA airplane accidents from 1996 to 1999. A review of 559 autopsies revealed that blunt trauma was the primary cause of death in 86.0% (N=481) of the autopsies. The most commonly occurring bony injuries were fracture of the ribs (72.3%), skull (55.1%), facial bones (49.4%), tibia (37.9%) and pelvis (36.0%). Common organ injuries included laceration of the liver (48.1%), lung (37.6%), heart (35.6%), and spleen (30.1%), and hemorrhage of the brain (33.3%) and lung (32.9%). A fractured larynx was observed in 14.7% of the cases, a finding that has not been reported in the literature until now. It was observed that individuals who sustained brain hemorrhage were also more likely to have fractures of the facial bones rather than skull fractures. PMID:12729820

  3. Analysis and calibration techniques for superconducting resonators.

    PubMed

    Cataldo, Giuseppe; Wollack, Edward J; Barrentine, Emily M; Brown, Ari D; Moseley, S Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented. PMID:25638068

  4. Analysis and calibration techniques for superconducting resonators

    NASA Astrophysics Data System (ADS)

    Cataldo, Giuseppe; Wollack, Edward J.; Barrentine, Emily M.; Brown, Ari D.; Moseley, S. Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.
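
    The ABCD-matrix representation mentioned above can be sketched generically (a textbook formulation, not the authors' fitted model): a uniform transmission-line segment has a simple ABCD matrix, segments cascade by matrix multiplication, and the complex transmission S21 follows from the cascaded matrix. The impedance, frequency, and segment lengths below are assumed values.

```python
import numpy as np

def line_abcd(z0, gamma, length):
    # ABCD matrix of a uniform transmission-line segment with
    # characteristic impedance z0 and propagation constant gamma.
    gl = gamma * length
    return np.array([[np.cosh(gl), z0 * np.sinh(gl)],
                     [np.sinh(gl) / z0, np.cosh(gl)]])

def s21(abcd, zref=50.0):
    # Convert a 2-port ABCD matrix to S21 for reference impedance zref.
    A, B = abcd[0]
    C, D = abcd[1]
    return 2.0 / (A + B / zref + C * zref + D)

f = 5e9                                # assumed frequency, Hz
beta = 2j * np.pi * f / 3e8            # lossless line: gamma = j*beta
# Cascading two segments is just matrix multiplication:
total = line_abcd(50.0, beta, 0.01) @ line_abcd(50.0, beta, 0.02)
```

    For a matched lossless line |S21| stays at unity while the phase accumulates; resonator models add coupling elements and loss to this same cascade.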

  5. Thermal analysis of the 10-gallon and the 55-gallon DOT-6M containers with thermal boundary conditions corresponding to 10CFR71 normal transport and accident conditions

    SciTech Connect

    Sanchez, L.C.; Longenbaugh, R.S.; Moss, M.; Haseman, G.M.; Fowler, W.E.; Roth, E.P.

    1988-03-01

    This report describes the heat transfer analysis of the 10-gallon and 55-gallon 6M containers. The analysis was performed with boundary conditions corresponding to a normal transport condition and a hypothetical accident condition. Computational results indicated that the insulation material in the 6M containers will adequately protect the payload region of the 6M containers. 26 refs., 26 figs., 8 tabs.

  6. Search for the top quark using multivariate analysis techniques

    SciTech Connect

    Bhat, P.C.; D0 Collaboration

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the e{mu} channel and neural networks to the e+jets channel.

  7. Health effects models for nuclear power plant accident consequence analysis. Part 1, Introduction, integration, and summary: Revision 2

    SciTech Connect

    Evans, J.S.; Abrahmson, S.; Bender, M.A.; Boecker, B.B.; Scott, B.R.; Gilbert, E.S.

    1993-10-01

    This report is a revision of NUREG/CR-4214, Rev. 1, Part 1 (1990), Health Effects Models for Nuclear Power Plant Accident Consequence Analysis. This revision has been made to incorporate changes to the health effects models recommended in two addenda to the NUREG/CR-4214, Rev. 1, Part II, 1989 report. The first of these addenda provided recommended changes to the health effects models for low-LET radiations based on recent reports from UNSCEAR, ICRP, and NAS/NRC (BEIR V). The second addendum presented changes needed to incorporate alpha-emitting radionuclides into the accident exposure source term. As in the earlier version of this report, models are provided for early and continuing effects, cancers and thyroid nodules, and genetic effects. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. Linear and linear-quadratic models are recommended for estimating the risks of seven types of cancer in adults: leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other". For most cancers, both incidence and mortality are addressed. Five classes of genetic diseases -- dominant, x-linked, aneuploidy, unbalanced translocations, and multifactorial diseases -- are also considered. Data are provided that should enable analysts to consider the timing and severity of each type of health risk.
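
    A common parameterization of the Weibull dose-response form recommended above expresses risk through a median-effect dose D50 and a shape parameter. The exact parameterization and the parameter values below are assumptions for illustration, not values taken from the report.

```python
import math

def weibull_risk(dose, d50, shape):
    # Weibull dose-response: risk = 1 - exp(-ln(2) * (dose/d50)**shape).
    # d50 is the dose producing 50% incidence; shape controls steepness.
    # (Parameterization assumed for illustration.)
    if dose <= 0:
        return 0.0
    return 1.0 - math.exp(-math.log(2.0) * (dose / d50) ** shape)

# Hypothetical parameters for an early-effect syndrome (illustrative only)
risk_at_d50 = weibull_risk(3.0, 3.0, 6.0)   # exactly 0.5 at dose = D50
```

    By construction the risk is 0.5 at D50 and rises steeply around it for large shape values, which is why this family suits threshold-like early effects.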

  8. Analysis of Accidents at the Pakistan Research Reactor-1 Using Proposed Mixed-Fuel (HEU and LEU) Core

    SciTech Connect

    Bokhari, Ishtiaq H.

    2004-12-15

    The Pakistan Research Reactor-1 (PARR-1) was converted from highly enriched uranium (HEU) to low-enriched uranium (LEU) fuel in 1991. The reactor is running successfully, with an upgraded power level of 10 MW. To save money on the purchase of costly fresh LEU fuel elements, the use of less burnt HEU spent fuel elements along with the present LEU fuel elements is being considered. The proposal calls for the HEU fuel elements to be placed near the thermal column to gain the required excess reactivity. In the present study the safety analysis of a proposed mixed-fuel core has been carried out at a calculated steady-state power level of 9.8 MW. Standard computer codes and correlations were employed to compute various parameters. Initiating events in reactivity-induced accidents involve various modes of reactivity insertion, namely, start-up accident, accidental drop of a fuel element on the core, flooding of a beam tube with water, and removal of an in-pile experiment during reactor operation. For each of these transients, time histories of reactor power, energy released, temperature, and reactivity were determined.

  9. TRAC large-break loss-of-coolant accident analysis for the AP600 design

    SciTech Connect

    Lime, J.F.; Boyack, B.E.

    1994-02-01

    This report discusses a TRAC model of the Westinghouse AP600 advanced reactor design which has been developed for analyzing large-break loss-of-coolant accident (LBLOCA) transients. A preliminary LBLOCA calculation of an 80% cold-leg break has been performed with TRAC-PF1/MOD2. The 80% break size was calculated by Westinghouse to be the most severe large-break size. The LBLOCA transient was calculated to 92 s. Peak clad temperatures (PCT) were well below the Appendix K limit of 1478 K (2200{degrees}F). Transient event times and PCT for the TRAC calculation were in reasonable agreement with those calculated by Westinghouse using their WCOBRA/TRAC code.

  10. Radiological health effects models for nuclear power plant accident consequence analysis.

    PubMed

    Evans, J S; Moeller, D W

    1989-04-01

    Improved health effects models have been developed for assessing the early effects, late somatic effects and genetic effects that might result from low-LET radiation exposures to populations following a major accident in a nuclear power plant. All the models have been developed in such a way that the dynamics of population risks can be analyzed. Estimates of life years lost and the duration of illnesses were generated and a framework recommended for summarizing health impacts. Uncertainty is addressed by providing models for upper, central and lower estimates of most effects. The models are believed to be a significant improvement over the models used in the U.S. Nuclear Regulatory Commission's Reactor Safety Study, and they can easily be modified to reflect advances in scientific understanding of the health effects of ionizing radiation. PMID:2925380

  11. Comparison of laser transit anemometry data analysis techniques

    NASA Technical Reports Server (NTRS)

    Humphreys, William M., Jr.; Gartrell, Luther R.

    1991-01-01

    Two techniques for the extraction of two-dimensional flow information from laser transit anemometry (LTA) data sets are presented and compared via a simulation study and experimental investigation. The methods are a probability density function (PDF) estimation technique and a marginal distribution analysis technique. The simulation study builds on the results of previous work and provides a quantification of the accuracy of both techniques for various LTA data acquisition scenarios. The experimental comparison consists of using an LTA system to survey the flow downstream of a turbulence generator in a small low-speed wind tunnel. The collected data sets are analyzed and compared.

  12. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of the European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.
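
    One simple way to combine elicited expert distributions of the kind aggregated in Appendix G is an equal-weight linear opinion pool, i.e., a pointwise average of the experts' CDFs. The pooling rule and the normal summaries below are assumptions for illustration; the project's actual aggregation procedure may differ.

```python
import math

def normal_cdf(x, mu, sigma):
    # Standard closed form via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pooled_cdf(x, experts):
    # Equal-weight linear opinion pool: average the experts' CDFs pointwise.
    # Each expert is summarized here as a normal (mu, sigma) pair -- an
    # assumed form; real elicitations usually supply quantiles instead.
    return sum(normal_cdf(x, mu, s) for mu, s in experts) / len(experts)

experts = [(0.0, 1.0), (1.0, 0.5)]   # hypothetical elicited summaries
```

    The pooled function is itself a valid CDF (monotone, 0 to 1), so pooled quantiles can be read off directly for use as code input distributions.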

  13. TRANSIENT ACCIDENT ANALYSIS OF THE GLOVEBOX SYSTEM IN A LARGE PROCESS ROOM

    SciTech Connect

    Lee, S

    2008-01-11

    Local transient hydrogen concentrations were evaluated inside a large process room when hydrogen gas was released by three postulated accident scenarios associated with process tank leakage and fire leading to a loss of gas confinement. The three cases considered in this work were fire in a room, loss of confinement from a process tank, and loss of confinement coupled with fire. Based on these accident scenarios in a large, unventilated process room, modeling calculations of the hydrogen migration were performed to estimate local transient concentrations of hydrogen due to the sudden leakage and release from a glovebox system associated with the process tank. The modeling domain represented the major features of the process room, including the principal release or leakage source of the gas storage system. The model was benchmarked against literature results for key phenomena such as natural convection, turbulent behavior, gas mixing due to jet entrainment, and radiation cooling, because these phenomena are closely related to the gas driving mechanisms within the large air space of the process room. The modeling results showed that at the corner of the process room, the gas concentrations resulting from the Case 2 and Case 3 scenarios reached the set-point value of the high-activity alarm in about 13 seconds, while the Case 1 scenario took about 90 seconds to reach that concentration. The modeling results were used to estimate transient radioactive gas migrations in an enclosed process room equipped with a high-activity alarm monitor when the postulated leakage scenarios are initiated without room ventilation.

  14. Hyphenated techniques and their applications in natural products analysis.

    PubMed

    Sarker, Satyajit D; Nahar, Lutfun

    2012-01-01

    A technique where a separation technique is coupled with an online spectroscopic detection technology is known as hyphenated technique, e.g., GC-MS, LC-PDA, LC-MS, LC-FTIR, LC-NMR, LC-NMR-MS, and CE-MS. Recent advances in hyphenated analytical techniques have remarkably widened their applications to the analysis of complex biomaterials, especially natural products. This chapter focuses on the applications of hyphenated techniques to pre-isolation and isolation of natural products, dereplication, online partial identification of compounds, chemotaxonomic studies, chemical finger-printing, quality control of herbal products, and metabolomic studies, and presents specific examples. However, a particular emphasis has been given on the hyphenated techniques that involve an LC as the separation tool. PMID:22367902

  15. An aftermath analysis of the 2014 coal mine accident in Soma, Turkey: Use of risk performance indicators based on historical experience.

    PubMed

    Spada, Matteo; Burgherr, Peter

    2016-02-01

    On 13 May 2014, a fire-related incident in the Soma coal mine in Turkey caused 301 fatalities and more than 80 injuries. To date, this is the largest coal mine accident in Turkey and in the OECD country group. This study investigated whether such a disastrous event should be expected, in a statistical sense, based on historical observations. For this purpose, PSI's ENSAD database is used to extract accident data for the period 1970-2014. Four different cases are analyzed, i.e., OECD, OECD w/o Turkey, Turkey, and USA. Analysis of temporal trends in annual numbers of accidents and fatalities indicated a non-significant decreasing tendency for OECD and OECD w/o Turkey and a significant one for the USA, whereas for Turkey both measures showed an increase over time. The expectation analysis revealed clearly that an event with the consequences of the Soma accident is rather unlikely for OECD, OECD w/o Turkey, and the USA. In contrast, such a severe accident has a substantially higher expectation for Turkey, i.e., it cannot be considered an extremely rare event based on historical experience. This indicates a need for improved safety measures and stricter regulations in the Turkish coal mining sector in order to get closer to the rest of the OECD. PMID:26687539
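
    The expectation analysis above can be caricatured with a homogeneous Poisson model: given a historical rate of severe accidents above a fatality threshold, the probability of observing at least one in a given window is 1 - exp(-rate * years). The rate and window below are hypothetical illustrations, not values from ENSAD.

```python
import math

def prob_at_least_one(rate_per_year, years):
    # Poisson occurrence model: P(N >= 1) = 1 - exp(-lambda * t)
    return 1.0 - math.exp(-rate_per_year * years)

# Hypothetical: if accidents with >= 300 fatalities occurred at a rate of
# 1 per 500 years in a country group (an illustrative number, not from
# ENSAD), the chance of seeing at least one in a 45-year observation
# window (1970-2014) would be:
p = prob_at_least_one(1.0 / 500.0, 45.0)   # roughly 0.086
```

    Comparing such an exceedance probability across country groups is one way a "rather unlikely for OECD, higher for Turkey" conclusion can be quantified.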

  16. Analysis of containment performance and radiological consequences under severe accident conditions for the Advanced Neutron Source Reactor at the Oak Ridge National Laboratory

    SciTech Connect

    Kim, S.H.; Taleyarkhan, R.P.

    1994-01-01

    A severe accident study was conducted to evaluate conservatively scoped source terms and radiological consequences to support the Advanced Neutron Source (ANS) Conceptual Safety Analysis Report (CSAR). Three different types of severe accident scenarios were postulated with a view toward evaluating conservatively scoped source terms. The first scenario evaluates maximum possible steaming loads and associated radionuclide transport, whereas the next scenario is geared towards evaluating conservative containment loads from releases of radionuclide vapors and aerosols with associated generation of combustible gases. The third scenario follows the prescriptions given by the 10 CFR 100 guidelines. It was included in the CSAR for demonstrating site-suitability characteristics of the ANS. Various containment configurations are considered for the study of the thermal-hydraulic and radiological behavior of the ANS containment. Severe accident mitigative design features, such as the use of rupture disks, were accounted for. This report describes the postulated severe accident scenarios, the methodology for analysis, modeling assumptions, modeling of several severe accident phenomena, and evaluation of the resulting source term and radiological consequences.

  17. Basic Sequence Analysis Techniques for Use with Audit Trail Data

    ERIC Educational Resources Information Center

    Judd, Terry; Kennedy, Gregor

    2008-01-01

    Audit trail analysis can provide valuable insights to researchers and evaluators interested in comparing and contrasting designers' expectations of use and students' actual patterns of use of educational technology environments (ETEs). Sequence analysis techniques are particularly effective but have been neglected to some extent because of real…

  18. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  19. New Flutter Analysis Technique for CFD-based Unsteady Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Jutte, Christine V.

    2009-01-01

    This paper presents a flutter analysis technique for the transonic flight regime. The technique uses an iterative approach to determine the critical dynamic pressure for a given Mach number. Unlike other CFD-based flutter analysis methods, each iteration solves for the critical dynamic pressure and uses this value in subsequent iterations until the value converges. This process reduces the iterations required to determine the critical dynamic pressure. To improve the accuracy of the analysis, the technique employs a known structural model, leaving only the aerodynamic model as the unknown. The aerodynamic model is estimated using unsteady aeroelastic CFD analysis combined with a parameter estimation routine. The technique executes as follows. The known structural model is represented as a finite element model. Modal analysis determines the frequencies and mode shapes for the structural model. At a given Mach number and dynamic pressure, the unsteady CFD analysis is performed. The output time history of the surface pressure is converted to a nodal aerodynamic force vector. The forces are then normalized by the given dynamic pressure. ERA, a multi-input multi-output parameter estimation program, estimates the aerodynamic model from time histories of nodal aerodynamic forces and structural deformations. The critical dynamic pressure is then calculated using the known structural model and the estimated aerodynamic model. This output is used as the dynamic pressure in subsequent iterations until the critical dynamic pressure is determined. This technique is demonstrated on the Aerostructures Test Wing-2 model at NASA's Dryden Flight Research Center.
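
    The idea of iterating on dynamic pressure until a stability boundary is found can be sketched with a deliberately simplified 2-DOF model (quasi-steady aerodynamic stiffness only; toy matrices, not the paper's CFD/ERA models): flutter onset is taken as the dynamic pressure q at which the eigenvalues of M^-1 (K - q A) coalesce into a complex pair, located here by bisection.

```python
import numpy as np

def is_unstable(q, M, K, A):
    # Quasi-steady sketch: instability onset when eigenvalues of
    # M^-1 (K - q A) coalesce into a complex pair (frequency coalescence).
    eig = np.linalg.eigvals(np.linalg.solve(M, K - q * A))
    return np.max(np.abs(eig.imag)) > 1e-9

def critical_q(M, K, A, q_hi=10.0, tol=1e-6):
    # Bisect on dynamic pressure to locate the stability boundary.
    lo, hi = 0.0, q_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if is_unstable(mid, M, K, A):
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

M = np.eye(2)                              # toy mass matrix
K = np.diag([1.0, 4.0])                    # toy stiffness matrix
A = np.array([[0.0, 1.0], [-1.0, 0.0]])    # hypothetical aero coupling
qc = critical_q(M, K, A)                   # ~1.5 for these toy matrices
```

    In the paper's method the aerodynamic operator comes from CFD/ERA estimation rather than a fixed matrix, but the outer solve-for-critical-q iteration plays the same role as this bisection.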

  20. Fault tree analysis of fire and explosion accidents for dual fuel (diesel/natural gas) ship engine rooms

    NASA Astrophysics Data System (ADS)

    Guan, Yifeng; Zhao, Jie; Shi, Tengfei; Zhu, Peipei

    2016-07-01

    In recent years, China's increased interest in environmental protection has led to a promotion of energy-efficient dual fuel (diesel/natural gas) ships on Chinese inland rivers. Natural gas as a ship fuel poses dangers of fire and explosion if a gas leak occurs. If explosions or fires occur in the engine rooms of a ship, heavy damage and losses will be incurred. In this paper, a fault tree model is presented that considers both fires and explosions in a dual fuel ship; in this model, fire and explosion in the dual fuel engine rooms are the top events. All the basic events along with the minimum cut sets are obtained through the analysis. The primary factors that affect accidents involving fires and explosions are determined by calculating the degree of structure importance of the basic events. According to these results, corresponding measures are proposed to ensure and improve the safety and reliability of Chinese inland dual fuel ships.
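
    For a small fault tree, the structure importance used above can be computed by direct enumeration: a basic event's structural importance is the fraction of states of the remaining events in which toggling that event toggles the top event. The three-event tree below is a hypothetical illustration, not the paper's engine-room tree.

```python
from itertools import product

# Hypothetical fault tree: TOP = GasLeak AND (IgnitionSpark OR HotSurface)
# Its minimal cut sets are {GasLeak, IgnitionSpark} and {GasLeak, HotSurface}.
EVENTS = ["GasLeak", "IgnitionSpark", "HotSurface"]

def top(state):
    return state["GasLeak"] and (state["IgnitionSpark"] or state["HotSurface"])

def structure_importance(event):
    # Fraction of states of the *other* events in which this event is
    # critical, i.e., flipping it changes the top event.
    others = [e for e in EVENTS if e != event]
    critical = 0
    for bits in product([False, True], repeat=len(others)):
        state = dict(zip(others, bits))
        state[event] = False
        off = top(state)
        state[event] = True
        if top(state) != off:
            critical += 1
    return critical / 2 ** len(others)
```

    Here GasLeak scores 0.75 while each ignition source scores 0.25, so the leak dominates, which is the kind of ranking the paper uses to prioritize safety measures.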

  1. Tools for improving safety management in the Norwegian Fishing Fleet occupational accidents analysis period of 1998-2006.

    PubMed

    Aasjord, Halvard L

    2006-01-01

    Reporting of human accidents in the Norwegian Fishing Fleet has always been very difficult because there has been no tradition of reporting all types of working accidents among fishermen if the accident does not seem to be very serious or there is no economic incentive to report. Therefore, reports are only written when the accidents are serious or if the fisherman is reported sick. Reports about an accident are sent to the insurance company, but another report should also be sent to the Norwegian Maritime Directorate (NMD). A comparison of data from one former insurance company and the NMD shows that the real number of injuries or serious accidents among Norwegian fishermen could be up to two times the number reported to the NMD. Special analyses of 1690 accidents from the so-called PUS database (NMD) for the period 1998-2002 show that the calculated risk was 23.6 accidents per 1000 man-years. This is quite a high risk level, and most of the accidents in the fishing fleet were rather serious. The calculated risks are highest for fishermen on board the deep sea fleet of trawlers (28.6 accidents per 1000 man-years) and the deep sea fleet of purse seiners (28.9 accidents per 1000 man-years). Fatal accidents over a longer period of 51.5 years, from 1955 to 2006, are also roughly analysed. These data from SINTEF's own database show that the number of fatal accidents has been decreasing over this long period, except for the two periods 1980-84 and 1990-94, in which there were some casualties with total losses of larger vessels and most of their crews, but also many other typical work accidents on smaller vessels. The total number of registered Norwegian fishermen and the number of man-years have been drastically reduced over the 51.5 years from 1955 to 2006. The risk of fatal accidents has been very steady over time at a high level, although there has been a marked risk reduction since 1990-94. 
For the last 8.5-year period of January 1998

  2. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Formal expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions for consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions for measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
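
    The validation step described above (sampling from elicited distributions and propagating the samples through a Gaussian plume model) can be sketched as follows. The lognormal distributions, power-law sigma curves, and all parameter values are illustrative assumptions, not the study's elicited values.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_plume(q, u, x, y, z, h, a_y, a_z):
    """Ground-level concentration from a Gaussian plume.
    Dispersion: sigma_y = a_y * x**0.9, sigma_z = a_z * x**0.85
    (illustrative power laws, not a specific stability-class fit)."""
    sy = a_y * x**0.9
    sz = a_z * x**0.85
    return (q / (2 * np.pi * u * sy * sz)
            * np.exp(-y**2 / (2 * sy**2))
            * (np.exp(-(z - h)**2 / (2 * sz**2)) + np.exp(-(z + h)**2 / (2 * sz**2))))

# Sample the uncertain dispersion coefficients from hypothetical elicited
# lognormal distributions and push each sample through the plume model.
n = 10_000
a_y = rng.lognormal(mean=np.log(0.22), sigma=0.4, size=n)
a_z = rng.lognormal(mean=np.log(0.20), sigma=0.5, size=n)
conc = gaussian_plume(q=1.0, u=4.0, x=1000.0, y=0.0, z=0.0, h=50.0, a_y=a_y, a_z=a_z)

# The resulting concentration distribution summarizes the propagated uncertainty.
p5, p50, p95 = np.percentile(conc, [5, 50, 95])
print(f"median {p50:.3e}, 90% band [{p5:.3e}, {p95:.3e}]")
```

    Comparing such propagated distributions against the directly elicited ones is exactly the consistency check the abstract reports.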

  3. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and the SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special, unresolved-object processing techniques were disappointing, due at least in part to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques proved useful in spite of the difficulties.

  4. Episode analysis of deposition of radiocesium from the Fukushima Daiichi nuclear power plant accident.

    PubMed

    Morino, Yu; Ohara, Toshimasa; Watanabe, Mirai; Hayashi, Seiji; Nishizawa, Masato

    2013-03-01

    Chemical transport models played key roles in understanding the atmospheric behaviors and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. However, model results could not be sufficiently evaluated because of limited observational data. We assess the model performance to simulate the deposition patterns of radiocesium (¹³⁷Cs) by making use of airborne monitoring survey data for the first time. We conducted ten sensitivity simulations to evaluate the atmospheric model uncertainties associated with key model settings, including emission data and wet deposition modules. We found that simulation using emissions estimated with a regional-scale (∼500 km) model better reproduced the observed ¹³⁷Cs deposition pattern in eastern Japan than simulation using emissions estimated with local-scale (∼50 km) or global-scale models. In addition, simulation using a process-based wet deposition module reproduced the observations well, whereas simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed ¹³⁷Cs deposition rates in high-deposition areas (≥10 kBq m⁻²) within 1 order of magnitude and showed that deposition of radiocesium over land occurred predominantly during 15-16, 20-23, and 30-31 March 2011. PMID:23391028
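
    The scavenging-coefficient approach whose parameter sensitivity the abstract notes can be sketched in a few lines. The power-law coefficients a and b below are placeholders; published fits vary widely, which is precisely the uncertainty at issue.

```python
import math

def scavenging_coefficient(precip_mm_h, a=8.4e-5, b=0.79):
    """Empirical below-cloud scavenging coefficient Lambda [1/s] as a
    power law of rainfall intensity. a and b are illustrative fit
    parameters, not values from any particular model."""
    return a * precip_mm_h**b

def wet_depletion(c0, precip_mm_h, dt_s):
    """Exponential depletion of an airborne concentration over one
    model time step: c(t + dt) = c(t) * exp(-Lambda * dt)."""
    lam = scavenging_coefficient(precip_mm_h)
    return c0 * math.exp(-lam * dt_s)

# One hour of 5 mm/h rain depletes a unit concentration noticeably.
c = wet_depletion(c0=1.0, precip_mm_h=5.0, dt_s=3600.0)
print(c)
```

    A process-based wet deposition module replaces the single empirical Lambda with explicit in-cloud and below-cloud microphysics, which is why the abstract finds it more robust.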

  5. Health effects models for nuclear power plant accident consequence analysis: Low LET radiation

    SciTech Connect

    Evans, J.S. (School of Public Health)

    1990-01-01

    This report describes dose-response models intended to be used in estimating the radiological health effects of nuclear power plant accidents. Models of early and continuing effects, cancers and thyroid nodules, and genetic effects are provided. Weibull dose-response functions are recommended for evaluating the risks of early and continuing health effects. Three potentially lethal early effects -- the hematopoietic, pulmonary, and gastrointestinal syndromes -- are considered. In addition, models are included for assessing the risks of several nonlethal early and continuing effects -- including prodromal vomiting and diarrhea, hypothyroidism and radiation thyroiditis, skin burns, reproductive effects, and pregnancy losses. Linear and linear-quadratic models are recommended for estimating cancer risks. Parameters are given for analyzing the risks of seven types of cancer in adults -- leukemia, bone, lung, breast, gastrointestinal, thyroid, and "other." The "other" cancers category is intended to reflect the combined risks of multiple myeloma, lymphoma, and cancers of the bladder, kidney, brain, ovary, uterus, and cervix. Models of childhood cancers due to in utero exposure are also developed. For most cancers, both incidence and mortality are addressed. The models of cancer risk are derived largely from information summarized in BEIR III -- with some adjustment to reflect more recent studies. 64 refs., 18 figs., 46 tabs.
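
    The Weibull ("hazard") dose-response form recommended for early effects can be sketched as follows; risk is 50% at the median effective dose D50 by construction, and the D50 and shape values used here are illustrative placeholders, not the report's parameters.

```python
import math

def weibull_risk(dose, d50, shape):
    """Weibull (hazard) dose-response: risk = 1 - exp(-ln2 * (D/D50)^V).
    At D = D50 the risk is 50% by construction; the shape factor V
    controls the steepness of the response."""
    if dose <= 0:
        return 0.0
    hazard = math.log(2.0) * (dose / d50) ** shape
    return 1.0 - math.exp(-hazard)

# Illustrative early-effect parameters (placeholders): D50 = 3.8 Gy, V = 5.
r = weibull_risk(dose=3.8, d50=3.8, shape=5.0)
print(round(r, 3))  # 0.5 at the median dose
```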

  6. Analysis techniques used on field degraded photovoltaic modules

    SciTech Connect

    Hund, T.D.; King, D.L.

    1995-09-01

    Sandia National Laboratories' PV System Components Department performs comprehensive failure analysis of photovoltaic modules after extended field exposure at various sites around the world. A full spectrum of analytical techniques is used to help identify the causes of degradation. The techniques are used to make solder fatigue life predictions for PV concentrator modules, identify cell damage or current mismatch, and measure the adhesive strength of the module encapsulant.

  7. An integrated technique for the analysis of skin bite marks.

    PubMed

    Bernitz, Herman; Owen, Johanna H; van Heerden, Willie F P; Solheim, Tore

    2008-01-01

    The high number of murder, rape, and child abuse cases in South Africa has led to increased numbers of bite mark cases being heard in the high courts. Objective analysis matching perpetrators to bite marks at crime scenes must be able to withstand vigorous cross-examination to be of value in convicting perpetrators. An analysis technique is described in four stages, namely determination of the mark as a human bite mark, pattern association analysis, metric analysis, and comparison with population data, and is illustrated by a real case study. New and accepted techniques are combined to determine the likelihood ratio of guilt, expressed as one of a range of conclusions described in the paper. Each stage of the analysis adds to the confirmation (or rejection) of concordance between the dental features present on the victim and the dentition of the suspect. The results illustrate identification to a high degree of certainty. PMID:18279256

  8. Analysis of accident sequences and source terms at treatment and storage facilities for waste generated by US Department of Energy waste management operations

    SciTech Connect

    Mueller, C.; Nabelssi, B.; Roglans-Ribas, J.; Folga, S.; Policastro, A.; Freeman, W.; Jackson, R.; Mishima, J.; Turner, S.

    1996-12-01

    This report documents the methodology, computational framework, and results of facility accident analyses performed for the US Department of Energy (DOE) Waste Management Programmatic Environmental Impact Statement (WM PEIS). The accident sequences potentially important to human health risk are specified, their frequencies assessed, and the resultant radiological and chemical source terms evaluated. A personal-computer-based computational framework and database have been developed that provide these results as input to the WM PEIS for the calculation of human health risk impacts. The WM PEIS addresses management of five waste streams in the DOE complex: low-level waste (LLW), hazardous waste (HW), high-level waste (HLW), low-level mixed waste (LLMW), and transuranic waste (TRUW). Currently projected waste generation rates, storage inventories, and treatment process throughputs have been calculated for each of the waste streams. This report summarizes the accident analyses and aggregates the key results for each of the waste streams. Source terms are estimated, and results are presented for each of the major DOE sites and facilities by WM PEIS alternative for each waste stream. Key assumptions in the development of the source terms are identified. The appendices identify the potential atmospheric release of each toxic chemical or radionuclide for each accident scenario studied. They also discuss specific accident analysis data and guidance used or consulted in this report.

  9. Dynamic analysis of large structures by modal synthesis techniques.

    NASA Technical Reports Server (NTRS)

    Hurty, W. C.; Hart, G. C.; Collins, J. D.

    1971-01-01

    Several criteria that may be used to evaluate the merits of some of the existing techniques for the dynamic analysis of large structures by division into substructures or components are examined. These techniques make use of component displacement modes to synthesize global systems of generalized coordinates and, for that reason, have come to be known as modal synthesis or component mode methods. Two techniques have been found to be particularly useful: the modal synthesis method with fixed attachment modes and the modal synthesis method with free attachment modes. These two methods are treated in detail, and general flow charts are presented for guidance in computer programming.
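
    A minimal numerical sketch of the fixed-attachment-mode method (Craig-Bampton reduction) on a spring-mass chain split into two substructures; the chain, the substructure split, and the number of retained modes are all invented for illustration.

```python
import numpy as np

def chain_K(n, k=1.0):
    """Stiffness of a spring-mass chain fixed at the left end, free at the right."""
    K = 2.0 * k * np.eye(n)
    K[-1, -1] = k
    for j in range(n - 1):
        K[j, j + 1] = K[j + 1, j] = -k
    return K

def frequencies(K, M):
    """Natural frequencies of K x = w^2 M x via Cholesky reduction."""
    Linv = np.linalg.inv(np.linalg.cholesky(M))
    return np.sqrt(np.linalg.eigvalsh(Linv @ K @ Linv.T))

def fixed_attachment_reduce(K, M, i_idx, b_idx, n_keep):
    """Modal synthesis with fixed attachment modes: keep a few fixed-interface
    normal modes plus static constraint modes; reduced DOF order is
    [modal coordinates, boundary DOFs]."""
    Kii, Kib = K[np.ix_(i_idx, i_idx)], K[np.ix_(i_idx, b_idx)]
    Psi = -np.linalg.solve(Kii, Kib)                  # constraint modes
    Linv = np.linalg.inv(np.linalg.cholesky(M[np.ix_(i_idx, i_idx)]))
    _, V = np.linalg.eigh(Linv @ Kii @ Linv.T)
    Phi = Linv.T @ V[:, :n_keep]                      # fixed-interface modes
    ni, nb = len(i_idx), len(b_idx)
    T = np.zeros((ni + nb, n_keep + nb))
    T[:ni, :n_keep], T[:ni, n_keep:] = Phi, Psi
    T[ni:, n_keep:] = np.eye(nb)
    idx = list(i_idx) + list(b_idx)
    Ks, Ms = K[np.ix_(idx, idx)], M[np.ix_(idx, idx)]
    return T.T @ Ks @ T, T.T @ Ms @ T

# Full model: 10 unit masses on unit springs, fixed-free.
n, mb, keep = 10, 5, 3
f_full = frequencies(chain_K(n), np.eye(n))

# Substructure A holds nodes 1..5, B holds nodes 5..10; node 5 is the shared
# boundary, so its mass is split evenly between the two substructures.
KA, MA = chain_K(mb), np.diag([1.0] * (mb - 1) + [0.5])
nB = n - mb + 1
KB = chain_K(nB)
KB[0, 0] = 1.0                                        # no ground spring at the interface
MB = np.diag([0.5] + [1.0] * (nB - 1))
KrA, MrA = fixed_attachment_reduce(KA, MA, list(range(mb - 1)), [mb - 1], keep)
KrB, MrB = fixed_attachment_reduce(KB, MB, list(range(1, nB)), [0], keep)

# Assemble the reduced substructures on the shared boundary DOF (last index).
ng = 2 * keep + 1
Kg, Mg = np.zeros((ng, ng)), np.zeros((ng, ng))
mapA = list(range(keep)) + [ng - 1]
mapB = list(range(keep, 2 * keep)) + [ng - 1]
Kg[np.ix_(mapA, mapA)] += KrA; Mg[np.ix_(mapA, mapA)] += MrA
Kg[np.ix_(mapB, mapB)] += KrB; Mg[np.ix_(mapB, mapB)] += MrB
f_reduced = frequencies(Kg, Mg)
print(f_full[0], f_reduced[0])  # fundamental frequency: full vs. reduced
```

    Keeping all interior modes makes the reduction exact; retaining only a few fixed-interface modes per substructure still reproduces the fundamental frequency closely, which is why these methods suit large structures.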

  10. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine, and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, comprising statistical, geometric, energetic, informational, and invariant domains. We discuss the process of calculation, which often necessitates a mathematical transform of the time-series. Our aims are to summarize a broad literature, to promote a shared vocabulary that would improve the exchange of ideas, and to ease comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357
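
    Two of the domains named in the review, statistical and geometric, can be illustrated on a single series; the synthetic "RR interval" data below are invented stand-ins for a clinical recording.

```python
import numpy as np

def variability_summary(x):
    """Statistical descriptors (SD, coefficient of variation) and geometric
    Poincare-plot descriptors SD1/SD2 computed from successive differences."""
    x = np.asarray(x, dtype=float)
    sd = x.std(ddof=1)
    cv = sd / x.mean()
    d = np.diff(x)
    sd1 = np.sqrt(0.5) * d.std(ddof=1)            # short-term variability
    sd2 = np.sqrt(max(2 * sd**2 - sd1**2, 0.0))   # long-term variability
    return {"SD": sd, "CV": cv, "SD1": sd1, "SD2": sd2}

# Synthetic series: slow drift (long-term) plus beat-to-beat noise (short-term).
rng = np.random.default_rng(1)
rr = 800 + 50 * np.sin(np.linspace(0, 6, 300)) + rng.normal(0, 10, 300)
s = variability_summary(rr)
print(s)
```

    For this drifting series the long-term descriptor SD2 dominates SD1, the kind of qualitative distinction between variability domains the review formalizes.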

  11. Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Methods developed for the determination of proteins and peptides, which are composed of amino acids and are basic components of food products, are also described. Other substances, such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and contaminants including pesticides and antibiotics, are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  12. Spatiotemporal Analysis for Wildlife-Vehicle Based on Accident Statistics of the County Straubing-Bogen in Lower Bavaria

    NASA Astrophysics Data System (ADS)

    Pagany, R.; Dorner, W.

    2016-06-01

    During the last years the number of wildlife-vehicle collisions (WVC) in Bavaria increased considerably. Despite the statistical registration of WVC and preventive measures at areas of risk along the roads, the number of such accidents could not be contained. Using geospatial analysis of WVC data of the last five years for the county of Straubing-Bogen, Bavaria, a small-scale methodology was developed to analyse the risk of WVC along the roads in the investigated area. Various indicators that may be related to WVC were examined. The risk depends on the time of day and year, which in turn correlates with traffic density and wildlife population. Additionally, the location of a collision depends on the species and on different environmental parameters. Accidents seem to correlate with the land use to the left and right of the road. Land use data and current vegetation were derived from remote sensing data, providing information on the general land use while also considering the vegetation period. For this, a number of hot spots was selected to identify potential dependencies between land use, vegetation and season. First results from these hot spots show that WVCs do not only depend on land use, but may show a correlation with the vegetation period. With regard to agriculture and seasonal as well as annual changes, this indicates that warnings will fail due to their static character, in contrast to the dynamic situation of land use and the resulting risk for WVCs. This shows that there is a demand for remote sensing data with a high spatial and temporal resolution, as well as a methodology to derive WVC warnings considering land use and vegetation. With remote sensing data, it could become possible to classify land use and calculate risk levels for WVC. Additional parameters derived from remotely sensed data that could be considered are relief and crops, as well as other parameters such as ponds and natural and infrastructural barriers that could be related to animal behaviour and

  13. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.
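
    The two-color thermographic-phosphor step can be sketched as a calibration-curve lookup: the ratio of intensities in two wavelength-filtered images varies monotonically with surface temperature, so temperature is recovered by interpolation. The calibration values below are invented for illustration.

```python
import numpy as np

def temperature_from_ratio(ratio, calib_ratio, calib_temp):
    """Recover surface temperature from a two-color intensity ratio by
    interpolating a (hypothetical) monotonic calibration curve."""
    return np.interp(ratio, calib_ratio, calib_temp)

# Hypothetical calibration: intensity ratio rises with temperature [K].
calib_ratio = np.array([0.2, 0.5, 1.0, 1.8, 3.0])
calib_temp = np.array([300.0, 350.0, 400.0, 450.0, 500.0])

temps = temperature_from_ratio(np.array([0.35, 1.4]), calib_ratio, calib_temp)
print(temps)
```

    Applied pixel-by-pixel to the ratio of the two filtered images, this lookup yields the quantitative surface temperature map the abstract describes.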

  14. Grid techniques in the analysis of gaseous pollutant propagation

    NASA Astrophysics Data System (ADS)

    Pisarek, Jerzy; Blaszczuk, A.

    2003-10-01

    The article describes trends in the development of gradient techniques used for the analysis of the spatial distribution of the breakdown coefficient of a gas different from the surrounding atmosphere. Depending on the modification made to the Schlieren technique, it was possible to measure the mass distribution in real time with differing range and ratio. In the optical system, periodic patterns (rasters) as well as arrangements of Rife prisms were used. Carbon dioxide and propane were used as gaseous pollutants of the air. The new solution proposed by the authors has turned out to be an effective tool for the analysis of gaseous pollutant distribution processes.

  15. [Accidents and acts of violence in Brazil: I--Analysis of mortality data].

    PubMed

    Jorge, M H; Gawryszewski, V P; Latorre, M do R

    1997-08-01

    External causes are an important cause of death in almost all countries. They always rank second or third in mortality, but their distribution by type varies from country to country. Mortality due to external causes by type, gender and age, for Brazil as a whole and for the state capitals specifically, is analysed. Mortality rates and the proportional mortality from 1977 to 1994 were calculated. The results showed that the number of deaths due to external causes almost doubled from 1977 to 1994, and external causes are now the second cause of death in Brazil. The mortality rate in 1991 was 69.8 per 100,000 inhabitants, and the highest increase was in the male rates, which are almost 4.5 times greater than the female ones. External causes are the first cause of death among people aged 5 to 39 years, and the majority of these deaths occur between 15 and 19 years of age (65% of the deaths by external causes). Besides this growth, a shift of deaths to lower ages also seems to be occurring. Both mortality from traffic accidents and mortality from homicide increased over the period from 1977 to 1994. Suicides have been stable, and "other external causes" have increased slowly, especially due to falls and drowning. The mortality rates for external causes in the state capitals are higher than the average for Brazil as a whole, except for some northeastern capitals. The rates for the capitals in the northern region are the highest in Brazil. In the northeastern region, only Recife, Maceió and Salvador have high rates. In the southeast, Vitória, Rio de Janeiro and S. Paulo have the highest rates in the country, but Belo Horizonte's rates are declining. The capitals in the southern region, as well as those in the West-central region, all showed growth in the rates. The growth of mortality due to external causes differs by type of external cause in these capitals. Suicide is not a public health problem in Brazil nor in the state capitals. Traffic

  16. Modeling and sensitivity analysis of transport and deposition of radionuclides from the Fukushima Daiichi accident

    NASA Astrophysics Data System (ADS)

    Hu, X.; Li, D.; Huang, H.; Shen, S.; Bou-Zeid, E.

    2014-01-01

    The atmospheric transport and ground deposition of radioactive isotopes 131I and 137Cs during and after the Fukushima Daiichi Nuclear Power Plant (FDNPP) accident (March 2011) are investigated using the Weather Research and Forecasting/Chemistry (WRF/Chem) model. The aim is to assess the skill of WRF in simulating these processes and the sensitivity of the model's performance to various parameterizations of unresolved physics. The WRF/Chem model is first upgraded by implementing a radioactive decay term into the advection-diffusion solver and adding three parameterizations for dry deposition and two parameterizations for wet deposition. Different microphysics and horizontal turbulent diffusion schemes are then tested for their ability to reproduce observed meteorological conditions. Subsequently, the influence on the simulated transport and deposition of the characteristics of the emission source, including the emission rate, the gas partitioning of 131I and the size distribution of 137Cs, is examined. The results show that the model can predict the wind fields and rainfall realistically. The ground deposition of the radionuclides can also potentially be captured well but it is very sensitive to the emission characterization. It is found that the total deposition is most influenced by the emission rate for both 131I and 137Cs; while it is less sensitive to the dry deposition parameterizations. Moreover, for 131I, the deposition is also sensitive to the microphysics schemes, the horizontal diffusion schemes, gas partitioning and wet deposition parameterizations; while for 137Cs, the deposition is very sensitive to the microphysics schemes and wet deposition parameterizations, and it is also sensitive to the horizontal diffusion schemes and the size distribution.
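
    The radioactive decay term added to the advection-diffusion solver can be sketched as an operator-split step applied to each grid cell after transport. The half-lives are well-established values (I-131: 8.02 days, Cs-137: 30.1 years); the step sizes are illustrative.

```python
import math

# Half-lives in seconds: I-131 = 8.02 days, Cs-137 = 30.1 years.
HALF_LIFE_S = {"I131": 8.02 * 86400.0, "Cs137": 30.1 * 365.25 * 86400.0}

def decay_step(conc, species, dt):
    """Operator-split radioactive decay applied after each
    advection-diffusion step: c <- c * exp(-lambda * dt)."""
    lam = math.log(2.0) / HALF_LIFE_S[species]
    return conc * math.exp(-lam * dt)

# Over a 10-day run, I-131 decays noticeably while Cs-137 barely changes,
# which is why decay matters for the iodine simulations in particular.
c_i = c_cs = 1.0
for _ in range(240):                 # 240 one-hour steps = 10 days
    c_i = decay_step(c_i, "I131", 3600.0)
    c_cs = decay_step(c_cs, "Cs137", 3600.0)
print(c_i, c_cs)
```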

  17. ADAPT (Analysis of Dynamic Accident Progression Trees) Beta Version 0.9

    2010-01-07

    The purpose of the ADAPT code is to generate Dynamic Event Trees (DET) using a user-specified simulator. ADAPT can utilize any simulation tool that meets a minimal set of requirements. ADAPT is based on the concept of DETs, which use explicit modeling of the deterministic dynamic processes that take place during a nuclear reactor plant system evolution along with stochastic modeling. When DETs are used to model different aspects of Probabilistic Risk Assessment (PRA), all accident progression scenarios starting from an initiating event are considered simultaneously. DET branching occurs at user-specified times and/or when an action is required by the system and/or the operator. These outcomes then decide how the dynamic system variables will evolve in time for each DET branch. Since two different outcomes at a DET branching may lead to completely different paths for system evolution, the next branching for these paths may occur not only at different times, but can be based on different branching criteria. The computational infrastructure allows for flexibility in ADAPT to link with different system simulation codes, parallel processing of the scenarios under consideration, on-line scenario management (initiation as well as termination) and user-friendly graphical capabilities. The ADAPT system is designed for a distributed computing environment; the scheduler can track multiple concurrent branches simultaneously. The scheduler is modularized so that the DET branching strategy can be modified (e.g., biasing towards the worst-case scenario/event). Independent database systems store data from the simulation tasks and the DET structure so that the event tree can be constructed and analyzed later. ADAPT is provided with a user-friendly client which can easily sort through and display the results of an experiment, precluding the need for the user to manually inspect individual simulator runs.
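
    The branching idea can be sketched with a toy driver: each queued branch advances until its next branching time, where it splits into children with conditional probabilities. The branching rules below are hypothetical; real ADAPT delegates the dynamics to an external simulation code and runs branches in parallel.

```python
from dataclasses import dataclass, field
from heapq import heappush, heappop

@dataclass(order=True)
class Branch:
    time: float
    prob: float = field(compare=False)
    state: str = field(compare=False)

def run_det(branch_rules, t_end):
    """Minimal dynamic-event-tree driver: pop the earliest branch, apply its
    branching rule (time to next demand plus outcome probabilities), and queue
    the children; branches with no further rule become scenario end points."""
    queue = [Branch(0.0, 1.0, "init")]
    leaves = []
    while queue:
        b = heappop(queue)
        rule = branch_rules.get(b.state)
        if rule is None or b.time >= t_end:
            leaves.append(b)
            continue
        dt, outcomes = rule
        for state, p in outcomes:
            heappush(queue, Branch(b.time + dt, b.prob * p, state))
    return leaves

# Hypothetical rules: a valve demand at t=10 (fails 1%), then, on success,
# a pump demand at t=50 (fails 5%).
rules = {"init": (10.0, [("valve_ok", 0.99), ("valve_fail", 0.01)]),
         "valve_ok": (50.0, [("pump_ok", 0.95), ("pump_fail", 0.05)])}
leaves = run_det(rules, t_end=1000.0)
print(sorted((l.state, round(l.prob, 4)) for l in leaves))
```

    The leaf probabilities sum to one, and a biased scheduler (as the abstract mentions) would simply reorder or prune this queue rather than change the tree semantics.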

  18. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  19. Calculation notes that support accident scenario and consequence development for the subsurface leak remaining subsurface accident

    SciTech Connect

    Ryan, G.W., Westinghouse Hanford

    1996-07-12

    This document supports the development and presentation of the following accident scenario in the TWRS Final Safety Analysis Report: Subsurface Leak Remaining Subsurface. The calculations needed to quantify the risk associated with this accident scenario are included within.

  20. Calculation notes that support accident scenario and consequence development for the subsurface leak remaining subsurface accident

    SciTech Connect

    Ryan, G.W., Westinghouse Hanford

    1996-09-19

    This document supports the development and presentation of the following accident scenario in the TWRS Final Safety Analysis Report: Subsurface Leak Remaining Subsurface. The calculations needed to quantify the risk associated with this accident scenario are included within.

  1. Accident analysis for transuranic waste management alternatives in the U.S. Department of Energy waste management program

    SciTech Connect

    Nabelssi, B.; Mueller, C.; Roglans-Ribas, J.; Folga, S.; Tompkins, M.; Jackson, R.

    1995-03-01

    Preliminary accident analyses and radiological source term evaluations have been conducted for transuranic waste (TRUW) as part of the US Department of Energy (DOE) effort to manage storage, treatment, and disposal of radioactive wastes at its various sites. The approach to assessing radiological releases from facility accidents was developed in support of the Office of Environmental Management Programmatic Environmental Impact Statement (EM PEIS). The methodology developed in this work is in accordance with the latest DOE guidelines, which consider the spectrum of possible accident scenarios in the implementation of the various actions evaluated in an EIS. The radiological releases from potential risk-dominant accidents in storage and treatment facilities considered in the EM PEIS TRUW alternatives are described in this paper. The results show that significant releases are predicted only for the most severe and extremely improbable accident sequences.

  2. Supplemental analysis of accident sequences and source terms for waste treatment and storage operations and related facilities for the US Department of Energy waste management programmatic environmental impact statement

    SciTech Connect

    Folga, S.; Mueller, C.; Nabelssi, B.; Kohout, E.; Mishima, J.

    1996-12-01

    This report presents supplemental information for the document Analysis of Accident Sequences and Source Terms at Waste Treatment, Storage, and Disposal Facilities for Waste Generated by US Department of Energy Waste Management Operations. Additional technical support information is supplied concerning the treatment of transuranic waste by incineration and the Alternative Organic Treatment option for low-level mixed waste. The latest respirable airborne release fraction values published by the US Department of Energy for use in accident analysis have been used and are included as Appendix D; the respirable airborne release fraction is defined as the fraction of material exposed to accident stresses that could become airborne as a result of the accident. A set of dominant waste treatment processes and accident scenarios was selected for a screening-process analysis. A subset of results (release source terms) from this analysis is presented.
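
    Source terms of the kind tabulated in these analyses are conventionally built from the DOE five-factor formula; a minimal sketch follows, with placeholder values rather than handbook entries for any specific scenario.

```python
def source_term(mar, dr, arf, rf, lpf=1.0):
    """DOE five-factor airborne source term:
    ST = MAR * DR * ARF * RF * LPF, where MAR is the material at risk,
    DR the damage ratio, ARF the airborne release fraction, RF the
    respirable fraction, and LPF the leak-path factor."""
    return mar * dr * arf * rf * lpf

# Illustrative spill of 100 g of powder; the ARF and RF values here are
# placeholders, not handbook entries for any particular accident stress.
st = source_term(mar=100.0, dr=1.0, arf=2e-3, rf=0.3, lpf=1.0)
print(st)  # respirable grams released
```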

  3. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
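
    The effect the abstract describes (neglecting transverse-shear deformation overpredicts the global buckling load) can be seen in a simple Engesser-type correction; all numbers below are invented, not the test article's properties.

```python
import math

def euler_buckling(E, I, L):
    """Classical (no transverse shear) global buckling load of a
    pinned-pinned column: P_e = pi^2 * E * I / L^2."""
    return math.pi**2 * E * I / L**2

def shear_corrected_buckling(E, I, L, GA_eff):
    """Engesser-type correction: P = P_e / (1 + P_e / GA_eff). For sandwich
    construction with a compliant core, the effective shear stiffness GA_eff
    is low, so the shear-free prediction can be far too optimistic."""
    pe = euler_buckling(E, I, L)
    return pe / (1.0 + pe / GA_eff)

# Illustrative stiffness, geometry, and shear rigidity:
pe = euler_buckling(E=70e9, I=2e-4, L=5.0)
ps = shear_corrected_buckling(E=70e9, I=2e-4, L=5.0, GA_eff=5e6)
print(f"Euler: {pe:.3e} N, shear-corrected: {ps:.3e} N")
```

    This is only the one-dimensional analogue of the cylinder problem, but it shows why an analytical preliminary-design method for fluted-core cylinders must retain the transverse-shear term.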

  4. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  5. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.
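
    PCA itself is a small computation; the sketch below separates two synthetic composition profiles along the first principal component. The data are invented stand-ins, not the meteorite measurements.

```python
import numpy as np

def pca(X, n_components=2):
    """Principal Component Analysis via SVD of the mean-centered data.
    Rows are samples (composition profiles), columns are variables
    (relative abundances of individual amino acids)."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * S[:n_components]
    explained = S**2 / np.sum(S**2)
    return scores, explained[:n_components]

# Toy data: two groups of "amino acid compositions" over 4 variables,
# standing in for, e.g., meteoritic vs. terrestrial-protein profiles.
rng = np.random.default_rng(2)
group_a = rng.normal([0.4, 0.3, 0.2, 0.1], 0.02, size=(10, 4))
group_b = rng.normal([0.1, 0.2, 0.3, 0.4], 0.02, size=(10, 4))
scores, explained = pca(np.vstack([group_a, group_b]))
print(scores[:3, 0], scores[-3:, 0], explained)
```

    The two groups land on opposite sides of zero along PC1, which is the kind of statistical distinctness the study reports for the meteoritic profiles.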

  6. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
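
Commonality analysis partitions a regression R² into parts unique to each predictor and a part they share, using only the R² values of the full and reduced models. A self-contained two-predictor sketch with invented data:

```python
def r_squared(y, X):
    """R^2 of an OLS fit of y on the columns of X (with intercept),
    solved via the normal equations with Gauss-Jordan elimination."""
    n = len(y)
    cols = [[1.0] * n] + [list(c) for c in X]
    k = len(cols)
    A = [[sum(cols[i][t] * cols[j][t] for t in range(n)) for j in range(k)]
         + [sum(cols[i][t] * y[t] for t in range(n))] for i in range(k)]
    for i in range(k):
        p = A[i][i]
        A[i] = [v / p for v in A[i]]
        for j in range(k):
            if j != i:
                A[j] = [a - A[j][i] * b for a, b in zip(A[j], A[i])]
    beta = [row[k] for row in A]
    yhat = [sum(b * c[t] for b, c in zip(beta, cols)) for t in range(n)]
    ybar = sum(y) / n
    return 1 - sum((yi - yh) ** 2 for yi, yh in zip(y, yhat)) \
             / sum((yi - ybar) ** 2 for yi in y)

# hypothetical data: x1 and x2 are correlated predictors of y
x1 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x2 = [1.2, 1.9, 3.2, 3.9, 5.1, 6.2]
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2]
r12 = r_squared(y, [x1, x2])
r1, r2 = r_squared(y, [x1]), r_squared(y, [x2])
unique1, unique2 = r12 - r2, r12 - r1        # variance unique to each predictor
common = r1 + r2 - r12                       # variance they share
assert abs(unique1 + unique2 + common - r12) < 1e-9  # partition reproduces R^2
```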

  7. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (pUC19 plasmid from E. coli). DN...

  8. Analysis of leaching data using asymptotic expansion techniques

    SciTech Connect

    Simonson, S.A.; Machiels, A.J.

    1983-01-01

    Asymptotic analysis constitutes a useful technique to determine the adjustable parameters appearing in mathematical models attempting to reproduce some experimental data. In particular, asymptotic expansions of a leach model proposed by A.J. Machiels and C. Pescatore are used to interpret leaching data from PNL 76-68 glass in terms of corrosion velocities and diffusion coefficients.
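
The general idea — using a short-time asymptotic expansion of a leach model, whose terms scale as √t for diffusion and as t for corrosion, to extract a diffusion coefficient and a corrosion velocity from data — can be sketched as follows. The model form and all numbers are illustrative, not those of the Machiels-Pescatore model:

```python
import math

def leach_asymptote(t, d_coeff, v_corr):
    """Hedged sketch of a short-time leach asymptote: a diffusion term
    ~ 2*sqrt(D*t/pi) plus a linear corrosion term v*t."""
    return 2.0 * math.sqrt(d_coeff * t / math.pi) + v_corr * t

# fit the two adjustable parameters to (synthetic) leach data by linear
# least squares in the basis {sqrt(t), t}: release = a*sqrt(t) + b*t
times = [1.0, 4.0, 9.0, 16.0, 25.0]
data = [leach_asymptote(t, 2.0e-3, 1.0e-3) for t in times]
s_ss = sum(times)                      # sum of sqrt(t)^2
s_st = sum(t ** 1.5 for t in times)    # sum of sqrt(t)*t
s_tt = sum(t * t for t in times)
y_s = sum(y * math.sqrt(t) for y, t in zip(data, times))
y_t = sum(y * t for y, t in zip(data, times))
det = s_ss * s_tt - s_st * s_st
a = (y_s * s_tt - y_t * s_st) / det    # a = 2*sqrt(D/pi)
b = (s_ss * y_t - s_st * y_s) / det    # b = corrosion velocity
d_fit = math.pi * (a / 2.0) ** 2
assert abs(d_fit - 2.0e-3) < 1e-9      # recovers the diffusion coefficient
assert abs(b - 1.0e-3) < 1e-9          # recovers the corrosion velocity
```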

  9. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  10. Laser-based flow cytometric analysis of genotoxicity of humans exposed to ionizing radiation during the Chernobyl accident

    SciTech Connect

    Jensen, R.H.; Bigbee, W.L.; Langlois, R.G.; Grant, S.G.; Pleshanov, P.G.; Chirkov, A.A.; Pilinskaya, M.A.

    1990-09-12

    An analytical technique has been developed that allows laser-based flow cytometric measurement of the frequency of red blood cells that have lost allele-specific expression of a cell surface antigen due to genetic toxicity in bone marrow precursor cells. Previous studies demonstrated a correlation of such effects with the exposure of each individual to mutagenic phenomena, such as ionizing radiation, and the effects can persist for the lifetime of each individual. During the emergency response to the nuclear power plant accident at Chernobyl, Ukraine, USSR, a number of people were exposed to whole body doses of ionizing radiation. Some of these individuals were tested with this laser-based assay and found to express a dose-dependent increase in the frequency of variant red blood cells that appears to be a persistent biological effect. All data indicate that this assay might well be used as a biodosimeter to estimate radiation dose and also as an element to be used for estimating the risk of each individual to develop cancer due to radiation exposure. 17 refs., 5 figs.

  11. Weather types and traffic accidents.

    PubMed

    Klaić, Z B

    2001-06-01

    Traffic accident data for the Zagreb area for the 1981-1982 period were analyzed to investigate possible relationships between the daily number of accidents and the weather conditions that occurred over 5 consecutive days, starting two days before the particular day. In the statistical analysis of low-accident days, the weather-type classification developed by Poje was used. For the high-accident days, detailed analyses of surface and radiosonde data were performed in order to identify possible frontal passages. A test for independence by contingency table confirmed that the conditional probability of a day with a small number of accidents is highest provided that "N" or "NW" weather types occur one day after it, while it is smallest for the "N1" and "Bc" types. For the remaining 4 days of the examined periods, dependence was not statistically confirmed. However, northern ("N", "NE" and "NW") and anticyclonic ("Vc", "V4", "V3", "V2" and "mv") weather types predominated during the 5-day intervals related to days with a small number of accidents. On the contrary, the weather types with cyclonic characteristics ("N1", "N2", "N3", "Bc", "Dol1" and "Dol"), which are generally accompanied by fronts, were the rarest. For 85% of the days with a large number of accidents that had not been caused by objective circumstances (such as poor visibility or a damaged or slippery road), at least one frontal passage was recorded during the 3-day period starting one day before the day with a large number of accidents. PMID:11787547
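
A contingency-table test of independence like the one described can be sketched as follows; the table counts are invented for illustration:

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for a test of independence on an
    r x c contingency table (here: accident-day type vs. weather type)."""
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total  # expected under independence
            stat += (obs - exp) ** 2 / exp
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof

# hypothetical counts: low/high accident days vs. two weather-type groups
table = [[40, 10],
         [15, 35]]
stat, dof = chi_square_statistic(table)
assert dof == 1
assert stat > 3.84  # exceeds the 5% critical value for 1 dof -> dependence
```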

  12. [Safety assessment of a Brazilian company based on analysis of work accidents by the causal tree method].

    PubMed

    Binder, M C; Pham, D; de Almeida, I M

    1998-01-01

    We present here the results of a study of 21 work-related accidents that occurred in a Brazilian manufacturing company. The aim was to assess the safety level of the company to improve its work accident prevention policy. In the last 6 months of 1992 and 1993, all accidents resulting in 15 days' absence from work, reported for social security purposes, were analyzed using the INRS causal tree method (ADC) and a questionnaire completed on site. Potential risk factors for accidents were identified based on the specific factors highlighted by the ADC. More universal trees were also compiled for the safety assessment. Three hundred and thirty specific accident factors were recorded (mean of 15.71 per accident). This is consistent with there being multiple causes of accidents rather than the assertion of Brazilian business safety departments that accidents are due to "dangerous" or "unsafe" behavior. Introducing the idea of culpability into accidents prevents the implementation of an appropriate information feedback process, essential for effective prevention. However, the large number of accidents related to "material" (78%) and "environment" (70%) indicates that working conditions are poor. This shows that the technical risks, mostly due to unsafe machinery and equipment are not being dealt with. Seventy-five potential accident factors were identified. Of these, 35% were "organizational", a high proportion for the company studied. Improvisation occurs at all levels, particularly at the organizational level. This is thus a major determinant for entire series of, if not most, accident situations. The poor condition of equipment also plays a major role in accidents. The effects of poor equipment on safety exacerbate the organizational shortcomings. The company's safety intervention policy should improve the management of human resources (rules designating particular workers for particular workstations; instructions for the safe operation of machines and equipment

  13. Health effects model for nuclear power plant accident consequence analysis. Part I. Introduction, integration, and summary. Part II. Scientific basis for health effects models

    SciTech Connect

    Evans, J.S.; Moeller, D.W.; Cooper, D.W.

    1985-07-01

    Analysis of the radiological health effects of nuclear power plant accidents requires models for predicting early health effects, cancers and benign thyroid nodules, and genetic effects. Since the publication of the Reactor Safety Study, additional information on radiological health effects has become available. This report summarizes the efforts of a program designed to provide revised health effects models for nuclear power plant accident consequence modeling. The new models for early effects address four causes of mortality and nine categories of morbidity. The models for early effects are based upon two-parameter Weibull functions. They permit evaluation of the influence of dose protraction and address the issue of variation in radiosensitivity among the population. The piecewise-linear dose-response models used in the Reactor Safety Study to predict cancers and thyroid nodules have been replaced by linear and linear-quadratic models. The new models reflect the most recently reported results of the follow-up of the survivors of the bombings of Hiroshima and Nagasaki and permit analysis of both morbidity and mortality. The new models for genetic effects allow prediction of genetic risks in each of the first five generations after an accident and include information on the relative severity of various classes of genetic effects. The uncertainty in modeling radiological health risks is addressed by providing central, upper, and lower estimates of risks. An approach is outlined for summarizing the health consequences of nuclear power plant accidents. 298 refs., 9 figs., 49 tabs.
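
A two-parameter Weibull dose-response function of the kind described for early effects can be written with a median-effect dose D50 and a shape parameter controlling steepness; the parameter values below are purely illustrative, not those of the report:

```python
import math

def weibull_risk(dose, d50, shape):
    """Two-parameter Weibull dose-response: probability of an early effect
    at absorbed dose `dose`, with median-effect dose d50 and shape factor."""
    if dose <= 0:
        return 0.0
    return 1.0 - math.exp(-math.log(2.0) * (dose / d50) ** shape)

# hypothetical parameters for illustration only
assert abs(weibull_risk(3.0, d50=3.0, shape=5.0) - 0.5) < 1e-12  # D50 -> 50%
assert weibull_risk(1.0, 3.0, 5.0) < 0.01   # well below D50: small risk
assert weibull_risk(6.0, 3.0, 5.0) > 0.99   # well above D50: near-certain
```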

  14. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    NASA Astrophysics Data System (ADS)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation of protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. The analysis of these phosphorylation cascades will provide new insights into their physiological functions in many biological processes. Unfortunately, the existing methods are of limited use for analyzing cascade activity. We therefore suggest a microfluidic isoelectric focusing technique (μIEF) for the analysis of cascade activity. Using this technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. In addition, an inhibition assay for kinase activity and an analysis on a real sample have also been conducted. The results indicate that μIEF is an excellent means for studies of phosphorylation cascade activity.

  15. Intestinal Preparation Techniques for Histological Analysis in the Mouse.

    PubMed

    Williams, Jonathan M; Duckworth, Carrie A; Vowell, Kate; Burkitt, Michael D; Pritchard, D Mark

    2016-01-01

    The murine intestinal tract represents a difficult organ system to study due to its long convoluted tubular structure, narrow diameter, and delicate mucosa which undergoes rapid changes after sampling prior to fixation. These features do not make for easy histological analysis as rapid fixation in situ, or after simple removal without careful dissection, results in poor postfixation tissue handling and limited options for high quality histological sections. Collecting meaningful quantitative data by analysis of this tissue is further complicated by the anatomical changes in structure along its length. This article describes two methods of intestinal sampling at necropsy that allow systematic histological analysis of the entire intestinal tract, either through examination of cross sections (circumferences) by the gut bundling technique or longitudinal sections by the adapted Swiss roll technique, together with basic methods for data collection. © 2016 by John Wiley & Sons, Inc. PMID:27248432

  16. Hyphenated techniques for the analysis of heparin and heparan sulfate

    PubMed Central

    Yang, Bo; Solakyildirim, Kemal; Chang, Yuqing

    2011-01-01

    The elucidation of the structure of glycosaminoglycan has proven to be challenging for analytical chemists. Molecules of glycosaminoglycan have a high negative charge and are polydisperse and microheterogeneous, thus requiring the application of multiple analytical techniques and methods. Heparin and heparan sulfate are the most structurally complex of the glycosaminoglycans and are widely distributed in nature. They play critical roles in physiological and pathophysiological processes through their interaction with heparin-binding proteins. Moreover, heparin and low-molecular weight heparin are currently used as pharmaceutical drugs to control blood coagulation. In 2008, the health crisis resulting from the contamination of pharmaceutical heparin led to considerable attention regarding their analysis and structural characterization. Modern analytical techniques, including high-performance liquid chromatography, capillary electrophoresis, mass spectrometry, and nuclear magnetic resonance spectroscopy, played critical roles in this effort. A successful combination of separation and spectral techniques will clearly provide a critical advantage in the future analysis of heparin and heparan sulfate. This review focuses on recent efforts to develop hyphenated techniques for the analysis of heparin and heparan sulfate. PMID:20853165

  17. Fire accident analysis modeling in support of non-reactor nuclear facilities at Sandia National Laboratories

    SciTech Connect

    Restrepo, L.F.

    1993-06-01

    The Department of Energy (DOE) requires that fire hazard analyses (FHAs) be conducted for all nuclear and new facilities, with results for the latter incorporated into Title I design. For those facilities requiring safety analysis documentation, the FHA shall be documented in the Safety Analysis Reports (SARs). This paper provides an overview of the methodologies and codes being used to support FHAs at Sandia facilities, with emphasis on SARs.

  18. Architectural stability analysis of the rotary-laser scanning technique

    NASA Astrophysics Data System (ADS)

    Xue, Bin; Yang, Xiaoxia; Zhu, Jigui

    2016-03-01

    The rotary-laser scanning technique is an important method in scale measurements due to its high accuracy and large measurement range. This paper first introduces a newly designed measurement station which is able to provide two-dimensional measurement information including the azimuth and elevation by using the rotary-laser scanning technique, then presents the architectural stability analysis of this technique by detailed theoretical derivations. Based on the designed station, a validation using both experiment and simulation is presented in order to verify the analytic conclusion. The results show that the architectural stability of the rotary-laser scanning technique is only affected by the two scanning angles' difference. And the difference which brings the best architectural stability can be calculated by using pre-calibrated parameters of the two laser planes. This research gives us an insight into the rotary-laser scanning technique. Moreover, the measurement accuracy of the rotary-laser scanning technique can be further improved based on the results of the study.

  19. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
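
The core of any such probabilistic analysis is propagating input uncertainties through a deterministic performance model and reading capability off the output distribution. A minimal Monte Carlo sketch with an invented solar-array model (this is not SPACE, and NASA's fast probabilistic methods avoid brute-force sampling; everything here is illustrative):

```python
import random

def solar_array_power(efficiency, area, flux):
    """Hypothetical deterministic performance model: delivered power in W."""
    return efficiency * area * flux

def monte_carlo_power(n, seed=42):
    """Sample each uncertain input, push the samples through the
    deterministic model, and summarize the output distribution."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        eff = rng.gauss(0.14, 0.01)    # cell efficiency, mean +/- sigma
        area = rng.gauss(100.0, 2.0)   # array area, m^2
        flux = rng.gauss(1367.0, 5.0)  # solar flux, W/m^2
        samples.append(solar_array_power(eff, area, flux))
    samples.sort()
    mean = sum(samples) / n
    p05 = samples[int(0.05 * n)]       # conservative 5th-percentile capability
    return mean, p05

mean, p05 = monte_carlo_power(20000)
assert p05 < mean              # the conservative percentile sits below the mean
assert 15000 < mean < 23000    # near the deterministic 0.14*100*1367 ~ 19.1 kW
```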

  20. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    SciTech Connect

    Charlton, William S

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
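
The Bayesian comparison step — weighting each database entry by the likelihood of the measured ratio and normalizing into a posterior — can be sketched as follows; the database values, the choice of ratio, and the measurement uncertainty are all invented for illustration:

```python
import math

def gaussian_likelihood(measured, predicted, sigma):
    """Likelihood of a measured isotopic ratio given a predicted ratio
    and a Gaussian measurement uncertainty (unnormalized)."""
    return math.exp(-0.5 * ((measured - predicted) / sigma) ** 2)

def posterior_over_burnup(measured_ratio, database, sigma):
    """Bayesian update with a flat prior over the database entries;
    returns a normalized posterior over the candidate burnups."""
    weights = {b: gaussian_likelihood(measured_ratio, r, sigma)
               for b, r in database.items()}
    z = sum(weights.values())
    return {b: w / z for b, w in weights.items()}

# hypothetical database: burnup (MWd/kgU) -> calculated noble-gas ratio
database = {10: 0.20, 20: 0.35, 30: 0.48, 40: 0.60}
post = posterior_over_burnup(measured_ratio=0.47, database=database, sigma=0.03)
best = max(post, key=post.get)
assert best == 30  # the measurement is closest to the 30 MWd/kgU prediction
assert abs(sum(post.values()) - 1.0) < 1e-12
```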

  1. Large areas elemental mapping by ion beam analysis techniques

    NASA Astrophysics Data System (ADS)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-rays Emission (PIXE) measurements, a Gamma-ray detector for Particle Induced Gamma- ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results, using a large (60-cm range) XYZ computer controlled sample positioning system, completely developed and build in our laboratory. The XYZ stage was installed at the external beam line and its high spacial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sort of samples. Due to its particular characteristics, this is a unique device in the sense of multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5) competing with the traditional in-vacuum ion-beam-analysis with the advantage of automatic rastering.

  2. Radiation accidents.

    PubMed

    Saenger, E L

    1986-09-01

    It is essential that emergency physicians understand ways to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome. Prophylactic antibodies are desirable. For severely exposed patients treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries. The contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome. More than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity. PMID:3526994

  3. Radiation accidents

    SciTech Connect

    Saenger, E.L.

    1986-09-01

    It is essential that emergency physicians understand ways to manage patients contaminated by radioactive materials and/or exposed to external radiation sources. Contamination accidents require careful surveys to identify the metabolic pathway of the radionuclides to guide prognosis and treatment. The level of treatment required will depend on careful surveys and meticulous decontamination. There is no specific therapy for the acute radiation syndrome. Prophylactic antibodies are desirable. For severely exposed patients treatment is similar to the supportive care given to patients undergoing organ transplantation. For high-dose extremity injury, no methods have been developed to reverse the fibrosing endarteritis that eventually leads to tissue death so frequently found with this type of injury. Although the Three Mile Island episode of March 1979 created tremendous public concern, there were no radiation injuries. The contamination outside the reactor building and the release of radioiodine were negligible. The accidental fuel element meltdown at Chernobyl, USSR, resulted in many cases of acute radiation syndrome. More than 100,000 people were exposed to high levels of radioactive fallout. The general principles outlined here are applicable to accidents of that degree of severity.

  4. Oranges and Peaches: Understanding Communication Accidents in the Reference Interview.

    ERIC Educational Resources Information Center

    Dewdney, Patricia; Michell, Gillian

    1996-01-01

    Librarians often have communication "accidents" with reference questions as initially presented. This article presents linguistic analysis of query categories, including: simple failures of hearing, accidents involving pronunciation or homophones, accidents where users repeat earlier misinterpretations to librarians, and accidents where users…

  5. Small area analysis using micro-diffraction techniques

    SciTech Connect

    Goehner, Raymond P.; Tissot Jr., Ralph G.; Michael, Joseph R.

    2000-02-11

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction primarily is used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, thereby allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30-μm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in a SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has

  6. Preliminary Accident Analysis for Construction and Operation of the Chornobyl New Safety Confinement

    SciTech Connect

    Batiy, Valeriy; Rubezhansky, Yruiy; Rudko, Vladimir; Shcherbin, Vladimir; Yegorov, V.; Schmieman, Eric A.; Timmins, Douglas C.

    2005-08-08

    Analysis of the potential exposure of personnel and the population during construction and operation of the New Safe Confinement was performed. Scenarios of hazardous event development were ranked. It is shown that, as a whole, construction and operation of the NSC are in accordance with current radiation safety norms of Ukraine.

  7. Temporal uncertainty analysis of human errors based on interrelationships among multiple factors: a case of Minuteman III missile accident.

    PubMed

    Rong, Hao; Tian, Jin; Zhao, Tingdi

    2016-01-01

    In traditional approaches to human reliability assessment (HRA), the definition of the error producing conditions (EPCs) and the supporting guidance are such that some of the conditions (especially organizational or managerial conditions) can hardly be included, and thus the analysis is burdened with incomprehensiveness and does not reflect the temporal trend of human reliability. A method based on system dynamics (SD), which highlights interrelationships among technical and organizational aspects that may contribute to human errors, is presented to facilitate quantitative estimation of the human error probability (HEP) and its related variables as they change over a long period of time. Taking the Minuteman III missile accident of 2008 as a case, the proposed HRA method is applied to assess HEP during missile operations over 50 years by analyzing the interactions among the variables involved in human-related risks; the critical factors are also determined in terms of the impact that the variables have on risks in different time periods. It is indicated that both technical and organizational aspects should be focused on to minimize human errors over the long run. PMID:26360211
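
A system-dynamics treatment of HEP couples organizational stocks to the error probability and integrates them forward in time. A deliberately minimal sketch — the stocks, rates, and coefficients below are invented for illustration, not taken from the paper:

```python
def simulate_hep(years, dt=1.0):
    """Minimal system-dynamics sketch: one organizational stock
    ('vigilance') and a derived human error probability (HEP),
    integrated with explicit Euler steps. All rates are illustrative."""
    vigilance = 1.0
    history = []
    t = 0.0
    while t < years:
        decay = 0.03 * vigilance    # complacency erodes vigilance over time
        renewal = 0.02              # training/oversight restores it
        vigilance += dt * (renewal - decay)
        # lower vigilance drives the error probability up
        hep = 0.001 + 0.004 * (1.0 - min(vigilance, 1.0))
        history.append(hep)
        t += dt
    return history

traj = simulate_hep(50)
assert len(traj) == 50
assert traj[-1] > traj[0]  # HEP drifts upward as vigilance settles lower
```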

  8. TRUMP-BD: A computer code for the analysis of nuclear fuel assemblies under severe accident conditions

    SciTech Connect

    Lombardo, N.J.; Marseille, T.J.; White, M.D.; Lowery, P.S.

    1990-06-01

    TRUMP-BD (Boil Down) is an extension of the TRUMP (Edwards 1972) computer program for the analysis of nuclear fuel assemblies under severe accident conditions. This extension allows prediction of the heat transfer rates, metal-water oxidation rates, fission product release rates, steam generation and consumption rates, and temperature distributions for nuclear fuel assemblies under core uncovery conditions. The heat transfer processes include conduction in solid structures, convection across fluid-solid boundaries, and radiation between interacting surfaces. Metal-water reaction kinetics are modeled with empirical relationships to predict the oxidation rates of steam-exposed Zircaloy and uranium metal. The metal-water oxidation models are parabolic in form with an Arrhenius temperature dependence. Uranium oxidation begins when fuel cladding failure occurs; Zircaloy oxidation occurs continuously at temperatures above 1300°F when metal and steam are available. From the metal-water reactions, the hydrogen generation rate, total hydrogen release, and temporal and spatial distribution of oxide formations are computed. Consumption of steam by the oxidation reactions and the effect of hydrogen on the coolant properties are modeled for independent coolant flow channels. Fission product release from exposed uranium metal and Zircaloy-clad fuel is modeled using empirical time and temperature relationships that consider the release to be subject to oxidation and volatilization/diffusion ("bake-out") release mechanisms. Release of the volatile species iodine (I), tellurium (Te), cesium (Cs), ruthenium (Ru), strontium (Sr), zirconium (Zr), cerium (Ce), and barium (Ba) from uranium metal fuel may be modeled.
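
Parabolic oxidation kinetics with an Arrhenius rate constant, the form the abstract describes, look like this in code. The pre-exponential factor and activation energy below are placeholders, not the correlations used in TRUMP-BD:

```python
import math

def parabolic_rate_constant(A, Q, T):
    """Arrhenius temperature dependence of the parabolic rate constant:
    K = A * exp(-Q / (R*T)). A and Q are illustrative placeholder values."""
    R = 8.314  # J/(mol K)
    return A * math.exp(-Q / (R * T))

def oxide_mass_gain(A, Q, T, t):
    """Parabolic kinetics: (mass gain)^2 = K * t, so growth slows as the
    oxide layer thickens."""
    return math.sqrt(parabolic_rate_constant(A, Q, T) * t)

w1 = oxide_mass_gain(A=3.3e2, Q=1.4e5, T=1500.0, t=60.0)
w2 = oxide_mass_gain(A=3.3e2, Q=1.4e5, T=1500.0, t=240.0)
assert abs(w2 / w1 - 2.0) < 1e-9  # 4x the time -> only 2x the mass gain
assert oxide_mass_gain(3.3e2, 1.4e5, 1800.0, 60.0) > w1  # hotter -> faster
```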

  9. Analysis of Sodium Fire in the Containment Building of Prototype Fast Breeder Reactor Under the Scenario of Core Disruptive Accident

    SciTech Connect

    Rao, P.M.; Kasinathan, N.; Kannan, S.E.

    2006-07-01

    The potential for sodium release to the reactor containment building from the reactor assembly during a Core Disruptive Accident (CDA) in Fast Breeder Reactors (FBR) is an important safety issue with reference to the structural integrity of the Reactor Containment Building (RCB). For the Prototype Fast Breeder Reactor (PFBR), the estimated sodium release under a CDA of 100 MJ energy release is 350 kg. The ejected sodium reacts readily with air in the RCB and causes temperature and pressure rise in the RCB. For estimating the severe thermal consequences in the RCB, different modes of sodium fires, such as pool and spray fires, were analyzed using the SOFIRE-II and NACOM sodium fire computer codes. The effects of important parameters such as the amount of sodium, area of the pool, containment air volume and oxygen concentration have been investigated. A peak pressure rise of 7.32 kPa is predicted by the SOFIRE-II code for a 350 kg sodium pool fire in the 86,000 m³ RCB volume. For sodium released as a spray followed by a pool fire of the unburnt sodium, the estimated pressure rise in the RCB is 5.85 kPa. For instantaneous combustion of the sodium, the estimated peak pressure rise is 13 kPa. (authors)
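
A bounding hand-check of such results treats the containment as a constant-volume ideal gas receiving all of the combustion heat, so ΔP = (γ − 1)·Q/V. With the abstract's sodium mass and containment volume and an assumed heat of combustion (an illustrative value, not from the paper), this lands in the same ballpark as the code predictions:

```python
def pressure_rise(mass_na, heat_of_combustion, volume, gamma=1.4):
    """Bounding constant-volume estimate of the containment pressure rise
    from a sodium fire: all combustion heat goes into the gas, so
    dP = (gamma - 1) * Q / V for an ideal gas. Numbers are illustrative."""
    q = mass_na * heat_of_combustion   # total heat released, J
    return (gamma - 1.0) * q / volume  # pressure rise, Pa

# 350 kg of sodium, an assumed ~10.5 MJ/kg heat of combustion,
# and the 86,000 m^3 containment volume from the abstract
dp = pressure_rise(350.0, 10.5e6, 86000.0)
assert 10e3 < dp < 20e3  # order 10 kPa, comparable to the code results
```

Being a no-heat-loss bound, this estimate naturally sits above the pool-fire predictions and near the instantaneous-combustion case.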

  10. Kinetics Parameters of VVER-1000 Core with 3 MOX Lead Test Assemblies To Be Used for Accident Analysis Codes

    SciTech Connect

    Pavlovitchev, A.M.

    2000-03-08

    The present work is part of the Joint U.S./Russian Project on Weapons-Grade Plutonium Disposition in VVER Reactors and presents the neutronics calculations of the kinetics parameters of a VVER-1000 core with 3 introduced MOX lead test assemblies (LTAs). The MOX LTA design was studied in [1] for two options: 100% plutonium and the "island" type. As a result, the zoning, i.e., the fissile plutonium enrichments in the different plutonium zones, was defined. The VVER-1000 core with 3 introduced MOX LTAs of the chosen design was calculated in [2]. In the present work, the neutronics data for transient analysis codes (RELAP [3]) have been obtained using the code chain of the RRC "Kurchatov Institute" [5], which is used for operational neutronics calculations of VVER reactors. Currently the 3D assembly-by-assembly code BIPR-7A and the 2D pin-by-pin code PERMAK-A, both with neutronics constants prepared by the cell code TVS-M, are the base elements of this chain. It should be recalled that in [6] TVS-M was used only for the constants calculations of MOX FAs. In the current calculations, the code TVS-M has been used for both UOX and MOX fuel constants. In addition, the volume of presented information has been increased and additional explanations have been included. The results for the reference uranium core [4] are presented in Chapter 2. The results for the core with 3 MOX LTAs are presented in Chapter 3. The conservatism connected with the neutronics parameters, which must be taken into account in transient analysis calculations, is discussed in Chapter 4. The conservative parameter values are intended to be used in 1-point core kinetics models of accident analysis codes.

  11. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
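
    The coexistence of the two model types can be sketched as follows: combinatorial gates for independent static modules, and a small Markov chain for a sequence-dependent module such as a cold spare. The structure and rates here are hypothetical, not the fault-tolerant parallel processor model from the paper.

```python
import numpy as np

# Combinatorial gates for independent static modules.
def g_and(*p):
    out = 1.0
    for q in p:
        out *= q
    return out

def g_or(*p):
    out = 1.0
    for q in p:
        out *= 1.0 - q
    return 1.0 - out

# Dynamic module: primary with a cold spare, as a 3-state Markov chain.
# States: 0 = primary up, 1 = spare up (primary failed), 2 = module failed.
lam = 1e-3                                   # failure rate, per hour (assumed)
Q = np.array([[-lam, lam, 0.0],
              [0.0, -lam, lam],
              [0.0, 0.0, 0.0]])
t, dt = 1000.0, 0.1
p = np.array([1.0, 0.0, 0.0])
for _ in range(int(t / dt)):                 # forward Euler on dp/dt = p Q
    p = p + dt * (p @ Q)
p_dynamic = p[2]                             # module failure probability at t

# Static components with exponential lifetimes, evaluated at the same t.
p_a = 1.0 - np.exp(-2e-4 * t)
p_b = 1.0 - np.exp(-5e-4 * t)

# Top event: (A AND B) OR dynamic spare module, modules independent.
p_top = g_or(g_and(p_a, p_b), p_dynamic)
print(f"P(top event by {t:.0f} h) = {p_top:.4f}")
```

    Because only the spare module is translated to a Markov chain, the chain stays tiny while the rest of the tree is solved combinatorially, which is the point of the modular approach.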

  12. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from -125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to -170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
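
    The combination step mentioned above is a single multiplication, lambda = a * rho * cp. A minimal sketch with typical room-temperature PTFE values (assumed literature-style figures, not the paper's measurements):

```python
# Thermal conductivity from laser-flash diffusivity, DSC specific heat,
# and dilatometry density: lambda(T) = a(T) * rho(T) * cp(T).
# The values below are typical room-temperature PTFE figures (assumed).
a = 0.11e-6      # thermal diffusivity, m^2/s
rho = 2170.0     # density, kg/m^3
cp = 1000.0      # specific heat, J/(kg K)

lam = a * rho * cp      # thermal conductivity, W/(m K)
print(f"lambda = {lam:.3f} W/(m K)")
```

    In practice the three quantities are tabulated against temperature and multiplied pointwise, giving lambda(T) over the whole measurement range.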

  13. Mathematical analysis techniques for modeling the space network activities

    NASA Technical Reports Server (NTRS)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.

  14. Refinement of Techniques of Metallographic Analysis of Highly Dispersed Structures

    NASA Astrophysics Data System (ADS)

    Khammatov, A.; Belkin, D.; Barbina, N.

    2016-01-01

    Flaws are regularly introduced during the development of standards and technical specifications. They may appear as minor misprints or as an insufficient description of a technique. Although such flaws are well known, it rarely comes to the stage of introducing changes to the standards. This paper shows that the normative documents need to clarify the requirements for the metallurgical microscopes used in the analysis of finely dispersed structures.

  15. Scalable Analysis Techniques for Microprocessor Performance Counter Metrics

    SciTech Connect

    Ahn, D H; Vetter, J S

    2002-07-24

    Contemporary microprocessors provide a rich set of integrated performance counters that allow application developers and system architects alike the opportunity to gather important information about workload behaviors. These counters can capture instruction, memory, and operating system behaviors. Current techniques for analyzing data produced from these counters use raw counts, ratios, and visualization techniques to help users make decisions about their application source code. While these techniques are appropriate for analyzing data from one process, they do not scale easily to new levels demanded by contemporary computing systems. Indeed, the amount of data generated by these experiments is on the order of tens of thousands of data points. Furthermore, if users execute multiple experiments, then we add yet another dimension to this already knotty picture. This flood of multidimensional data can swamp efforts to harvest important ideas from these valuable counters. Very simply, this paper addresses these concerns by evaluating several multivariate statistical techniques on these datasets. We find that several techniques, such as statistical clustering, can automatically extract important features from this data. These derived results can, in turn, be fed directly back to an application developer, or used as input to a more comprehensive performance analysis environment, such as a visualization or an expert system.
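
    As a sketch of the clustering idea, here is a minimal k-means run on a synthetic counter matrix; the counter values and the two workload groups are invented for illustration, not real counter data.

```python
import numpy as np

rng = np.random.default_rng(6)
# Synthetic performance-counter matrix: rows are processes, columns are
# normalized counter metrics (illustrative stand-in for real counters).
grp1 = rng.normal([0.9, 0.1, 0.2], 0.05, (50, 3))   # compute-bound mix
grp2 = rng.normal([0.2, 0.8, 0.7], 0.05, (50, 3))   # memory-bound mix
X = np.vstack([grp1, grp2])

# Minimal k-means with deterministic farthest-point initialization: one
# of the multivariate clustering techniques the paper evaluates.
k = 2
centers = np.array([X[0], X[np.argmax(np.linalg.norm(X - X[0], axis=1))]])
for _ in range(20):
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = np.argmin(dists, axis=1)                 # assign to nearest center
    centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
```

    The cluster labels automatically separate the two workload behaviors, which is the kind of feature extraction the paper reports for counter datasets.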

  16. Analysis of diagnostic calorimeter data by the transfer function technique

    NASA Astrophysics Data System (ADS)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
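
    The transfer-function technique can be sketched as FFT-based deconvolution with mild regularization against noise. The first-order thermal lag used below as the calorimeter transfer function is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt = 1024, 1e-3
t = np.arange(n) * dt

# Assumed calorimeter transfer function: a first-order thermal lag
# (illustrative; the real transfer function comes from the instrument).
tau = 0.05
h = np.exp(-t / tau) / tau * dt              # discretized impulse response

# Synthetic energy-flux pulse and the simulated rear-side measurement.
flux = np.exp(-((t - 0.2) / 0.02) ** 2)
meas = np.convolve(flux, h)[:n] + 1e-3 * rng.standard_normal(n)

# Deconvolution by spectral division, with a small regularization term
# so that noisy high frequencies are suppressed rather than amplified.
H = np.fft.rfft(h)
M = np.fft.rfft(meas)
eps = 1e-3
rec = np.fft.irfft(M * np.conj(H) / (np.abs(H) ** 2 + eps), n)
```

    The fast Fourier transform makes the whole reconstruction a few vector operations, which is why the method supports fast analysis of many measured profiles.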

  17. Analysis of diagnostic calorimeter data by the transfer function technique.

    PubMed

    Delogu, R S; Poggi, C; Pimazzoni, A; Rossi, G; Serianni, G

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing. PMID:26932104

  18. Preliminary assessment of aerial photography techniques for canvasback population analysis

    USGS Publications Warehouse

    Munro, R.E.; Trauger, D.L.

    1976-01-01

    Recent intensive research on the canvasback has focused attention on the need for more precise estimates of population parameters. During the 1972-75 period, various types of aerial photographing equipment were evaluated to determine the problems and potentials for employing these techniques in appraisals of canvasback populations. The equipment and procedures available for automated analysis of aerial photographic imagery were also investigated. Serious technical problems remain to be resolved, but some promising results were obtained. Final conclusions about the feasibility of operational implementation await a more rigorous analysis of the data collected.

  19. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    NASA Astrophysics Data System (ADS)

    Singh Duksh, Yograj; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-05-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE.
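
    A minimal 1-D FDTD update for a single lossy RLC line illustrates the leapfrog scheme and the Courant restriction mentioned above; the line constants and geometry are illustrative, not the paper's 32 nm interconnect parameters.

```python
import numpy as np

# FDTD solution of the telegrapher's equations for one lossy RLC line.
R, L, C = 10.0, 2.5e-7, 1.0e-10     # per-unit-length line constants (assumed)
nx, dx = 100, 1e-3                   # 0.1 m line discretized into 1 mm cells
dt = 0.95 * dx * np.sqrt(L * C)      # time step under the Courant condition
v = np.zeros(nx + 1)                 # node voltages
i = np.zeros(nx)                     # branch currents (staggered half-grid)

for _ in range(200):
    v[0] = 1.0                                       # hard 1 V step source
    # current update, with the series-resistance term time-averaged
    i = ((L / dt - R / 2) * i - np.diff(v) / dx) / (L / dt + R / 2)
    # voltage update; the far end is open (no current leaves last node)
    v[1:-1] -= dt / C * np.diff(i) / dx
    v[-1] -= dt / C * (0.0 - i[-1]) / dx
```

    After roughly 105 steps the step front reaches the open end and the voltage there doubles, the classic open-circuit reflection; halving dx without halving dt would violate the Courant condition and make the scheme blow up.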

  20. Comparing the Identification of Recommendations by Different Accident Investigators Using a Common Methodology

    NASA Technical Reports Server (NTRS)

    Johnson, Chris W.; Oltedal, H. A.; Holloway, C. M.

    2012-01-01

    Accident reports play a key role in the safety of complex systems. These reports present the recommendations that are intended to help avoid any recurrence of past failures. However, the value of these findings depends upon the causal analysis that helps to identify the reasons why an accident occurred. Various techniques have been developed to help investigators distinguish root causes from contributory factors and contextual information. This paper presents the results from a study into the individual differences that can arise when a group of investigators independently apply the same technique to identify the causes of an accident. This work is important if we are to increase the consistency and coherence of investigations following major accidents.

  1. A working man`s analysis of incidents and accidents with explosives at the Los Alamos National Laboratory, 1946--1997

    SciTech Connect

    Ramsay, J.B.; Goldie, R.H.

    1998-12-31

    At the inception of the Laboratory, hectic and intense work was the norm during the development of the atomic bombs. After the war, the development of other weapons for the Cold War again contributed to an intense work environment. Formal Standard Operating Procedures (SOPs) were not required at that time. However, the occurrence of six fatalities in 1959 during the development of a new high-energy plastic-bonded explosive (94% HMX) forced the introduction of SOPs. After an accident at the Department of Energy (DOE) plant at Amarillo, TX in 1977, the DOE promulgated the Department-wide DOE Explosives Safety Manual. Table 1 outlines the history of the introduction of SOPs and the DOE Explosives Safety Manual. Many of the rules and guidelines presented in these documents were developed and introduced as the result of an incident or accident. However, many of the current staff are not familiar with this background. To preserve as much of this knowledge as possible, the authors are collecting documentation on incidents and accidents involving energetic materials at Los Alamos. Formal investigations of serious accidents elucidate the multiple causes that contributed to each accident. These reports are generally buried in a file and are not read by more recent workers. Reports involving fatalities at Los Alamos before 1974 were withheld from general employees. Also, these documents contain much detail and analysis that is not of interest to the field worker. The authors have collected documents describing 116 incidents and have analyzed the contributing factors as viewed from the standpoint of the individual operator. All the incidents occurred at the Los Alamos National Laboratory and involved energetic materials in some manner, though not all occurred within the explosive handling groups. Most accidents are caused by multiple contributing factors. 
They have attempted to select the one or two factors that they consider as the most important relative to the

  2. Source term and radiological consequences of the Chernobyl accident

    SciTech Connect

    Mourad, R.; Snell, V.

    1987-01-01

    The objective of this work is to assess the source term and to evaluate the maximum hypothetical individual doses in European countries (including the Soviet Union) from the Chernobyl accident through the analyses of measurements of meteorological data, radiation fields, and airborne and deposited activity in these countries. Applying this information to deduce the source term involves a reversal of the techniques of nuclear accident analysis, which estimate the off-site consequences of postulated accidents. In this study the authors predict the quantities of radionuclides that, if released at Chernobyl and following the calculated trajectories, would explain and unify the observed radiation levels and radionuclide concentrations as measured by European countries and the Soviet Union. The simulation uses the PEAR microcomputer program following the methodology described in Canadian Standards Association standard N288.2. The study was performed before the Soviets published their estimate of the source term and the two results are compared.

  3. Analysis of avalanche risk factors in backcountry terrain based on usage frequency and accident data in Switzerland

    NASA Astrophysics Data System (ADS)

    Techel, F.; Zweifel, B.; Winkler, K.

    2015-09-01

    Recreational activities in snow-covered mountainous terrain in the backcountry account for the vast majority of avalanche accidents. Studies analyzing avalanche risk mostly rely on accident statistics without considering exposure (or the elements at risk), i.e., how many, when and where people are recreating, as data on recreational activity in the winter mountains are scarce. To fill this gap, we explored volunteered geographic information on two social media mountaineering websites - bergportal.ch and camptocamp.org. Based on these data, we present a spatiotemporal pattern of winter backcountry touring activity in the Swiss Alps and compare this with accident statistics. Geographically, activity was concentrated in Alpine regions relatively close to the main Swiss population centers in the west and north. In contrast, accidents occurred equally often in the less-frequented inner-alpine regions. Weekends, weather and avalanche conditions influenced the number of recreationists, while the odds of being involved in a severe avalanche accident did not depend on weekends or weather conditions. However, the likelihood of being involved in an accident increased with increasing avalanche danger level, but also with a more unfavorable snowpack containing persistent weak layers (also referred to as an old snow problem). In fact, the most critical situation for backcountry recreationists and professionals occurred on days and in regions where both the avalanche danger was critical and the snowpack contained persistent weak layers. The frequently occurring geographical pattern of a more unfavorable snowpack structure also explains the relatively high proportion of accidents in the less-frequented inner-alpine regions. These results have practical implications: avalanche forecasters should clearly communicate the avalanche danger and the avalanche problem to the backcountry user, particularly if persistent weak layers are of concern. 
Professionals and recreationists, on the

  4. Manifold learning techniques for the analysis of hyperspectral ocean data

    NASA Astrophysics Data System (ADS)

    Gillis, David; Bowles, Jeffrey; Lamela, Gia M.; Rhea, William J.; Bachmann, Charles M.; Montes, Marcos; Ainsworth, Tom

    2005-06-01

    A useful technique in hyperspectral data analysis is dimensionality reduction, which replaces the original high dimensional data with low dimensional representations. Usually this is done with linear techniques such as linear mixing or principal components (PCA). While often useful, there is no a priori reason for believing that the data is actually linear. Lately there has been renewed interest in modeling high dimensional data using nonlinear techniques such as manifold learning (ML). In ML, the data is assumed to lie on a low dimensional, possibly curved surface (or manifold). The goal is to discover this manifold and therefore find the best low dimensional representation of the data. Recently, researchers at the Naval Research Lab have begun to model hyperspectral data using ML. We continue this work by applying ML techniques to hyperspectral ocean water data. We focus on water since there are underlying physical reasons for believing that the data lies on a certain type of nonlinear manifold. In particular, ocean data is influenced by three factors: the water parameters, the bottom type, and the depth. For fixed water and bottom types, the spectra that arise by varying the depth will lie on a nonlinear, one dimensional manifold (i.e. a curve). Generally, water scenes will contain a number of different water and bottom types, each combination of which leads to a distinct curve. In this way, the scene may be modeled as a union of one dimensional curves. In this paper, we investigate the use of manifold learning techniques to separate the various curves, thus partitioning the scene into homogeneous areas. We also discuss ways in which these techniques may be able to derive various scene characteristics such as bathymetry.
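
    The idea of recovering a one-dimensional depth curve can be sketched with a bare-bones Isomap-style pipeline (k-nearest-neighbour graph, geodesic distances, classical MDS). The synthetic "spectra" below are invented for illustration, not ocean data.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic spectra for one water/bottom type: depth varies along a
# nonlinear 1-D curve embedded in a 10-band space (illustrative only).
depth = np.sort(rng.uniform(0, 1, 60))
bands = np.arange(10)
X = np.exp(-np.outer(depth, 0.3 + 0.5 * bands))   # 60 spectra x 10 bands

# Step 1: k-nearest-neighbour graph of Euclidean distances.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
k = 5
G = np.full_like(D, np.inf)
for r in range(len(X)):
    nn = np.argsort(D[r])[:k + 1]
    G[r, nn] = D[r, nn]
G = np.minimum(G, G.T)

# Step 2: geodesic (along-manifold) distances by Floyd-Warshall.
for m in range(len(X)):
    G = np.minimum(G, G[:, m:m + 1] + G[m:m + 1, :])

# Step 3: classical MDS on the geodesics -> 1-D embedding coordinate.
n = len(X)
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (G ** 2) @ J
w, V = np.linalg.eigh(B)
coord = V[:, -1] * np.sqrt(w[-1])

# The recovered coordinate should track depth (up to sign).
rho = np.corrcoef(coord, depth)[0, 1]
```

    A scene with several water/bottom combinations would produce several such curves, and separating them is what partitions the scene into homogeneous areas.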

  5. A probabilistic analysis of a catastrophic transuranic waste hoist accident at the WIPP

    SciTech Connect

    Greenfield, M.A. |; Sargent, T.J. |

    1993-06-01

    This report builds upon the extensive and careful analyses made by the DOE of the probability of failure of the waste hoist, and more particularly on the probability of failure of a major component, the hydraulic brake system. The extensive fault tree analysis prepared by the DOE was the starting point of the present report. A key element of this work is the use of probability distributions rather than so-called point estimates to describe the probability of failure of an element. One of the authors (MAG) developed the expressions for the probability of failure of the brake system. The second author (TJS) executed the calculations of the final expressions for failure probabilities. The authors hope that this work will be of use to the DOE in its evaluation of the safety of the waste hoist, a key element at the WIPP.
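
    The distribution-based approach can be sketched as Monte Carlo propagation of lognormal component distributions through a simple system logic. The brake-system structure and all parameters below are invented for illustration, not the DOE fault tree.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# Hypothetical logic: two redundant brake channels (both must fail) in
# series with a common-cause failure term. Each per-demand failure
# probability is a lognormal distribution, not a point estimate; the
# medians and error factors are illustrative assumptions.
def lognormal(median, error_factor, size):
    sigma = np.log(error_factor) / 1.645     # EF = 95th percentile / median
    return rng.lognormal(np.log(median), sigma, size)

p_ch = lognormal(1e-3, 3.0, (N, 2))          # two independent channels
p_ccf = lognormal(1e-5, 10.0, N)             # common-cause contribution

p_sys = p_ch[:, 0] * p_ch[:, 1] + p_ccf      # rare-event approximation

print(f"mean = {p_sys.mean():.2e}, 95th = {np.percentile(p_sys, 95):.2e}")
```

    Propagating distributions rather than point estimates yields a full distribution for the system failure probability, so tail percentiles can be reported alongside the mean.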

  6. Factors Associated with Fatal Occupational Accidents among Mexican Workers: A National Analysis

    PubMed Central

    Gonzalez-Delgado, Mery; Gómez-Dantés, Héctor; Fernández-Niño, Julián Alfredo; Robles, Eduardo; Borja, Víctor H.; Aguilar, Miriam

    2015-01-01

    Objective To identify the factors associated with fatal occupational injuries in Mexico in 2012 among workers affiliated with the Mexican Social Security Institute. Methods Analysis of secondary data using information from the National Occupational Risk Information System, with the consequence of the occupational injury (fatal versus non-fatal) as the response variable. The analysis included 406,222 non-fatal and 1,140 fatal injuries from 2012. The factors associated with the lethality of the injury were identified using a logistic regression model with the Firth approach. Results Being male (OR=5.86; CI95%: 4.22-8.14), age (OR=1.04; CI95%: 1.03-1.06), employed in the position for 1 to 10 years (versus less than 1 year) (OR=1.37; CI95%: 1.15-1.63), working as a facilities or machine operator or assembler (OR: 3.28; CI95%: 2.12- 5.07) and being a worker without qualifications (OR=1.96; CI95%: 1.18-3.24) (versus an office worker) were associated with fatality in the event of an injury. Additionally, companies classified as maximum risk (OR=1.90; CI 95%: 1.38-2.62), workplace conditions (OR=7.15; CI95%: 3.63-14.10) and factors related to the work environment (OR=9.18; CI95%:4.36-19.33) were identified as risk factors for fatality in the event of an occupational injury. Conclusions Fatality in the event of an occupational injury is associated with factors related to sociodemographics (age, sex and occupation), the work environment and workplace conditions. Worker protection policies should be created for groups with a higher risk of fatal occupational injuries in Mexico. PMID:25790063
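
    An odds ratio with its Wald 95% confidence interval can be computed from a 2x2 exposure-outcome table as below. The counts are hypothetical, and the study itself used Firth-penalized logistic regression (which stabilizes estimates for rare outcomes); that model is not reproduced here.

```python
import math

# Odds ratio and Wald 95% CI from a 2x2 table (hypothetical counts).
#                 fatal   non-fatal
a, b = 90, 10_000       # exposed group (e.g., machine operators)
c, d = 30, 11_000       # reference group

or_ = (a * d) / (b * c)                      # cross-product odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    In a logistic regression the same quantity appears as OR = exp(beta) for the coefficient of the exposure indicator.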

  7. The analysis of unsteady wind turbine data using wavelet techniques

    SciTech Connect

    Slepski, J.E.; Kirchhoff, R.H.

    1995-09-01

    Wavelet analysis is a relatively new technique that decomposes a signal into wavelets of finite length. A wavelet map is generated showing the distribution of signal variance in both the time and frequency domains. The first section of this paper begins with an introduction to wavelet theory, contrasting it with standard Fourier analysis. Some simple applications to the processing of harmonic signals are then given. Since wind turbines operate under unsteady stochastic loads, the time series of most machine parameters are non-stationary; wavelet analysis can be applied to this problem. In the second section of this paper, wavelet methods are used to examine data from Phase 2 of the NREL Combined Experiment. The data analyzed include airfoil surface pressure and low-speed shaft torque. In each case the wavelet map offers valuable insight that could not be obtained without it.
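
    A minimal Haar decomposition shows how a wavelet map distributes variance over both time and scale, here on a synthetic non-stationary burst (invented test signal, not Combined Experiment data).

```python
import numpy as np

# One level of the orthonormal Haar transform: pairwise averages
# (approximation) and differences (detail) of adjacent samples.
def haar_level(x):
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2), (even - odd) / np.sqrt(2)

rng = np.random.default_rng(3)
n = 512
t = np.arange(n)
# Non-stationary test signal: a burst of oscillation mid-record, like a
# transient load event on a turbine shaft, plus noise.
x = np.where((t > 200) & (t < 300), np.sin(2 * np.pi * t / 16), 0.0)
x = x + 0.05 * rng.standard_normal(n)

levels = []
a = x
for _ in range(5):
    a, d = haar_level(a)
    levels.append(d)          # detail coefficients, one row per scale

# Orthonormality: total variance/energy is preserved, so the map shows
# where in time and at which scale the signal variance lives.
energy = sum(float(np.sum(d ** 2)) for d in levels) + float(np.sum(a ** 2))
```

    Unlike a global Fourier spectrum, the detail rows localize the burst in time, which is exactly what makes the wavelet map useful for non-stationary records.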

  8. Detection of arterial disorders by spectral analysis techniques.

    PubMed

    Ubeyli, Elif Derya

    2007-01-01

    This paper presents an integrated view of spectral analysis techniques for the detection of arterial disorders. The paper includes illustrative information about feature extraction from signals recorded from arteries. The short-time Fourier transform (STFT) and the wavelet transform (WT) were used for spectral analysis of ophthalmic arterial (OA) Doppler signals. Using these spectral analysis methods, the variations in the shape of the Doppler spectra as a function of time were presented in the form of sonograms in order to obtain medical information. These sonograms were then used to compare the applied methods in terms of their frequency resolution and their effectiveness in determining OA stenosis. The author suggests that the paper will assist readers in gaining a better understanding of the STFT and WT in the detection of arterial disorders. PMID:17502695
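
    The sonogram construction can be sketched as a windowed FFT; the synthetic chirp below stands in for a Doppler signal, and window/hop sizes are illustrative choices.

```python
import numpy as np

# Minimal short-time Fourier transform: the sonogram is |STFT|^2 as a
# function of time (frame index) and frequency.
fs, n = 1000, 2000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * (50 + 100 * t) * t)   # chirp, f rises 50 -> 450 Hz

win, hop = 128, 64
w = np.hanning(win)                          # taper to reduce leakage
frames = [x[i:i + win] * w for i in range(0, n - win, hop)]
sono = np.abs(np.fft.rfft(frames, axis=1)) ** 2   # time x frequency map

# Peak frequency per frame tracks the chirp upward over time.
peak = np.argmax(sono, axis=1) * fs / win
```

    The window length fixes the trade-off the paper discusses: longer windows sharpen frequency resolution but blur time resolution, which is what motivates comparing the STFT against the wavelet transform.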

  9. Analysis techniques for two-dimensional infrared data

    NASA Technical Reports Server (NTRS)

    Winter, E. M.; Smith, M. C.

    1978-01-01

    In order to evaluate infrared detection and remote sensing systems, it is necessary to know the characteristics of the observational environment. For both scanning and staring sensors, the spatial characteristics of the background may be more of a limitation to the performance of a remote sensor than system noise. This limitation is the so-called spatial clutter limit and may be important for systems design of many earth application and surveillance sensors. The data used in this study is two dimensional radiometric data obtained as part of the continuing NASA remote sensing programs. Typical data sources are the Landsat multi-spectral scanner (1.1 micrometers), the airborne heat capacity mapping radiometer (10.5 - 12.5 micrometers) and various infrared data sets acquired by low altitude aircraft. Techniques used for the statistical analysis of one dimensional infrared data, such as power spectral density (PSD), exceedance statistics, etc. are investigated for two dimensional applicability. Also treated are two dimensional extensions of these techniques (2D PSD, etc.), and special techniques developed for the analysis of 2D data.
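
    One of the 2-D extensions mentioned, a two-dimensional PSD with radial averaging, can be sketched on a synthetic clutter-plus-noise image (invented data, not the Landsat or HCMM sets).

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic 2-D scene: smooth low-frequency clutter plus white sensor
# noise, a stand-in for radiometric background data.
n = 128
fx = np.fft.fftfreq(n)[:, None]
fy = np.fft.fftfreq(n)[None, :]
r = np.hypot(fx, fy)                       # spatial-frequency magnitude
f = r.copy()
f[0, 0] = 1.0 / n                          # avoid division by zero at DC
clutter = np.real(np.fft.ifft2(np.fft.fft2(rng.standard_normal((n, n))) / f))
img = clutter / clutter.std() + 0.1 * rng.standard_normal((n, n))

# Two-dimensional power spectral density, then a radial average so the
# result can be compared with familiar 1-D PSD clutter models.
F = np.fft.fft2(img - img.mean())
psd2d = (np.abs(F) ** 2) / img.size
bins = np.linspace(0, 0.5, 20)
idx = np.digitize(r.ravel(), bins)
radial = np.array([psd2d.ravel()[idx == i].mean() for i in range(1, len(bins))])
```

    The steep fall-off of the radial PSD with spatial frequency is the signature of spatial clutter; where it crosses the flat sensor-noise floor marks the transition from clutter-limited to noise-limited performance.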

  10. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.
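
    Hand-execution of a behavioral (state-transition) model can itself be sketched in code; the states and events below are hypothetical, not the actual control-system model from the project.

```python
# Executable sketch of a state-transition (behavioral) model: the table
# maps (state, event) pairs to next states, mirroring the diagram that
# testers hand-execute against the as-built software.
transitions = {
    ("Idle", "start"): "Running",
    ("Running", "fault"): "SafeShutdown",
    ("Running", "stop"): "Idle",
    ("SafeShutdown", "reset"): "Idle",
}

def run(events, state="Idle"):
    trace = [state]
    for ev in events:
        state = transitions.get((state, ev), state)  # ignore invalid events
        trace.append(state)
    return trace

print(run(["start", "fault", "reset"]))
```

    Walking a test scenario through such a table is exactly the "hand execution" described above; discrepancies between the traced states and the system's observed behavior become test findings.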

  11. New acquisition techniques and statistical analysis of bubble size distributions

    NASA Astrophysics Data System (ADS)

    Proussevitch, A.; Sahagian, D.

    2005-12-01

    Various approaches have been taken to solve the long-standing problem of determining size distributions of objects embedded in an opaque medium. In the case of vesicles in volcanic rocks, the most reliable technique is 3-D imagery by computed X-ray tomography. However, this method is expensive, requires intensive computational resources, and is thus not always available to an investigator. As a cheaper alternative, 2-D cross-sectional data is commonly available, but it requires stereological analysis for 3-D conversion. Stereology for spherical bubbles is quite robust, but elongated non-spherical bubbles require complicated conversion approaches and large observed populations. We have revised the computational schemes for applying non-spherical stereology to practical analysis of bubble size distributions. The basic idea of this new approach is to exclude from the conversion those classes (bins) of non-spherical bubbles that provide a larger cross-section probability distribution than a maximum value which depends on the mean aspect ratio. Thus, in contrast to traditional stereological techniques, larger bubbles are "predicted" from the rest of the population. As a proof of principle, we have compared distributions so obtained with direct 3-D imagery (X-ray tomography) for non-spherical bubbles from the same samples of vesicular basalts collected from the Colorado Plateau. The results of the comparison demonstrate that in cases where X-ray tomography is impractical, stereology can be used with reasonable reliability, even for non-spherical vesicles.

  12. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    SciTech Connect

    Keselman, Dmitry; Tompkins, George H; Leishman, Deborah A

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e. the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
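
    Handling uncertain evidence via node augmentation amounts to adding a virtual child whose conditional probability table encodes the user-specified likelihood, then conditioning on it. A tiny two-node sketch with invented numbers:

```python
# Soft ("uncertain") evidence on node B via the node-augmentation trick:
# add a virtual child E of B whose CPT encodes the user-specified
# likelihood, then condition on E = 1. Network: A -> B, both binary.
p_a = [0.3, 0.7]                      # P(A)
p_b_given_a = [[0.9, 0.1],            # P(B | A=0)
               [0.2, 0.8]]            # P(B | A=1)

# User-specified uncertain evidence on B, expressed as a likelihood
# lam[b] = P(E=1 | B=b) for the augmenting virtual node E.
lam = [0.8, 0.3]

# Posterior P(A | E=1) by direct enumeration of the augmented joint.
joint = [[p_a[a] * p_b_given_a[a][b] * lam[b] for b in range(2)]
         for a in range(2)]
z = sum(sum(row) for row in joint)
post_a = [sum(joint[a]) / z for a in range(2)]
print(f"P(A | soft evidence on B) = {post_a}")
```

    Hard evidence is the special case lam = [1, 0] (or [0, 1]), so the same augmented network covers both situations.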

  13. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample so that near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  14. Emulation and Sobol' sensitivity analysis of an atmospheric dispersion model applied to the Fukushima nuclear accident

    NASA Astrophysics Data System (ADS)

    Girard, Sylvain; Mallet, Vivien; Korsakissok, Irène; Mathieu, Anne

    2016-04-01

    Simulations of the atmospheric dispersion of radionuclides involve large uncertainties originating from the limited knowledge of meteorological input data, composition, amount and timing of emissions, and some model parameters. The estimation of these uncertainties is an essential complement to modeling for decision making in case of an accidental release. We have studied the relative influence of a set of uncertain inputs on several outputs from the Eulerian model Polyphemus/Polair3D on the Fukushima case. We chose to use the variance-based sensitivity analysis method of Sobol'. This method requires a large number of model evaluations which was not achievable directly due to the high computational cost of Polyphemus/Polair3D. To circumvent this issue, we built a mathematical approximation of the model using Gaussian process emulation. We observed that aggregated outputs are mainly driven by the amount of emitted radionuclides, while local outputs are mostly sensitive to wind perturbations. The release height is notably influential, but only in the vicinity of the source. Finally, averaging either spatially or temporally tends to cancel out interactions between uncertain inputs.
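
    First-order Sobol' indices can be estimated with Saltelli's sampling scheme, sketched below on a toy model that merely mimics the qualitative finding (the emitted amount dominating the output). It is not the Polyphemus/Polair3D emulator; inputs and coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
d, N = 3, 20_000

# Stand-in for the emulated dispersion model: output driven mostly by
# the emitted amount (x0), weakly by a wind perturbation (x1) and the
# release height (x2), with a small interaction term.
def model(x):
    return 4.0 * x[:, 0] + 1.0 * x[:, 1] + 0.5 * x[:, 2] * x[:, 0]

# Saltelli's scheme: two independent input matrices A and B, plus
# hybrid matrices AB_i that take column i from B.
A = rng.uniform(size=(N, d))
B = rng.uniform(size=(N, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

S = np.empty(d)
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    S[i] = np.mean(fB * (model(ABi) - fA)) / var   # first-order index
print("first-order Sobol indices:", S.round(3))
```

    The expensive part in practice is the many model evaluations the scheme requires, which is exactly why the study substitutes a Gaussian-process emulator for the full dispersion model.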

  15. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    SciTech Connect

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  16. Organisational accidents investigation methodology and lessons learned.

    PubMed

    Dien, Yves; Llory, Michel; Montmayeul, René

    2004-07-26

    The purpose of this paper is to reflect on accident analysis methods. As the understanding of industrial accidents and incidents has evolved, they are no longer considered as the sole product of human and/or technical failures but also as originating in an unfavourable organisational context. After presenting some theoretical developments which are responsible for this evolution, we will propose two examples of organisational accidents and incidents. We will then present some properties of organisational accidents, and we will focus on some "accident-generating" organisational factors. The definition of these factors comes from an empirical approach to event analysis. Finally, we will briefly present their implications for accident and incident analysis. PMID:15231360

  17. Recording and analysis techniques for high-frequency oscillations

    PubMed Central

    Worrell, G.A.; Jerbi, K.; Kobayashi, K.; Lina, J.M.; Zelmann, R.; Le Van Quyen, M.

    2013-01-01

    In recent years, new recording technologies have advanced such that, at high temporal and spatial resolutions, high-frequency oscillations (HFO) can be recorded in human partial epilepsy. However, because of the deluge of multichannel data generated by these experiments, achieving the full potential of parallel neuronal recordings depends on the development of new data mining techniques to extract meaningful information relating to time, frequency and space. Here, we aim to bridge this gap by focusing on up-to-date recording techniques for measurement of HFO and new analysis tools for their quantitative assessment. In particular, we emphasize how these methods can be applied, what property might be inferred from neuronal signals, and potentially productive future directions. PMID:22420981

  18. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  19. Accidents in the aluminium smelting industry.

    PubMed

    Das, B C; Chaudhury, S

    1995-01-01

    Analysis of the accident records of an aluminium smelting plant, covering about 2,100 employees over a period of three years, showed a total of 465 accidents among male employees. Of these, 5 were fatal; 40.86% resulted from contact with extreme temperatures, and 42.58% caused burn injuries. Hot materials were the agents in 44.52% of the burn injuries, with molten aluminium accounting for 43.96% of the hot materials. Injuries to the lower limbs constituted 38.71% and to the upper limbs 36.99%. Accidents to employees in the 26-33 year age group amounted to 61.72% of the total. The average number of man-days lost per year was 11,153. The average frequency rate was 30.75 accidents per million man-hours worked, the severity rate was 2.196 per million man-hours worked, and the incidence rate per thousand employees was 73.81. The average number of days lost per accident was 71.95, and the average interval between accidents was 32,516 man-hours. The mean age of the employees involved in accidents was 29.53 years. The share of accidents in the second half of each shift was consistently higher than in the first half, averaging 66.66%. PMID:8557540
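The headline rates in this abstract follow from standard exposure formulas, and they can be reconstructed from its own figures. A minimal sketch (the annualization of the incidence rate is an assumption on our part; total man-hours are recovered from the reported mean interval between accidents):

```python
# Figures reported in the abstract (three-year period):
accidents = 465
employees = 2_100
years = 3
hours_between = 32_516  # average man-hours between accidents

man_hours = accidents * hours_between  # total exposure, ~15.1 million man-hours

frequency_rate = accidents * 1_000_000 / man_hours        # per million man-hours
incidence_rate = accidents * 1_000 / (employees * years)  # per 1,000 employees/year

print(round(frequency_rate, 2))  # 30.75 -- matches the abstract
print(round(incidence_rate, 2))  # 73.81 -- matches the abstract
```

Both computed values agree with the published figures, which is a useful consistency check on the record.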

  20. Nuclear and radiochemical techniques in chemical analysis. Final report

    SciTech Connect

    Finston, H.L.; Williams, E.T.

    1981-06-01

    The areas studied during the period of the contract included determinations of cross sections for nuclear reactions, determination of neutron capture cross sections of radionuclides, application of special activation techniques and x-ray counting, elucidation of synergic solvent extraction mechanisms and development of new solvent extraction techniques, and the development of a PIXE analytical facility. The thermal neutron capture cross section of ²²Na was determined, and cross sections and energy levels were determined for ²⁰Ne(n,α)¹⁷O, ²⁰Ne(n,p)²⁰F, and ⁴⁰Ar(n,α)³⁷S. Inelastic scattering with 2 to 3 MeV neutrons followed by counting of the metastable states permits analysis of the following elements: In, Sr, Cd, Hg, and Pb. Bromine can be detected in the presence of a 500-fold excess of Na and/or K by thermal neutron activation and x-ray counting, and as little as 0.3 × 10⁻⁹ g of Hg can be detected by this technique. Medium-energy neutrons (10 to 160 MeV) have been used to determine Tl, Pb, and Bi by (n,xn) and (n,pxn) reactions. The reaction ¹⁹F(p,α)¹⁶O has been used to determine as little as 50 µmol of Freon-14. Mechanisms for synergic solvent extractions have been elucidated, and a new technique of homogeneous liquid-liquid solvent extraction has been developed in which the neutral complex is rapidly extracted into propylene carbonate by raising and lowering the temperature of the system. An external-beam PIXE system has been developed for trace element analyses of a variety of sample types. Various sample preparation techniques have been applied to a diverse range of samples including marine sediment, coral, coal, and blood.

  1. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan periods) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. Analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high-energy techniques.

  2. Reduction and analysis techniques for infrared imaging data

    NASA Technical Reports Server (NTRS)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. Turning pictures into papers, or equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages; acquisition, reduction, and analysis, concentrating in particular on several of the questions most relevant to the techniques currently applied to near infrared imaging.

  3. Development of solution techniques for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Andrews, J. S.

    1974-01-01

    Nonlinear structural solution methods in the current research literature are classified according to order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.

  4. Techniques for Improving Filters in Power Grid Contingency Analysis

    SciTech Connect

    Adolf, Robert D.; Haglin, David J.; Halappanavar, Mahantesh; Chen, Yousu; Huang, Zhenyu

    2011-12-31

    In large-scale power transmission systems, predicting faults and preemptively taking corrective action to avoid them is essential to preventing rolling blackouts. The computational study of the constantly-shifting state of the power grid and its weaknesses is called contingency analysis. Multiple-contingency planning in the electrical grid is one example of a complex monitoring system where a full computational solution is operationally infeasible. We present a general framework for building and evaluating resource-aware models of filtering techniques for this type of monitoring.

  5. Prompt gamma activation analysis: An old technique made new

    SciTech Connect

    English, Jerry; Firestone, Richard; Perry, Dale; Leung, Ka-Ngo; Reijonen, Jani; Garabedian, Glenn; Bandong, Bryan; Molnar, Gabor; Revay, Zsolt

    2002-12-01

    The long list of contributors to the prompt gamma activation analysis (PGAA) project is important because it highlights the broad cast of active PGAA researchers from various facilities and backgrounds. PGAA is simple in principle but has traditionally been difficult in application. It is an old technique that has for years been tied to, and associated exclusively with, nuclear reactor facilities, which has limited its acceptance as a general analytical tool for identifying and quantifying elements or, more precisely, isotopes, whether radioactive or nonradioactive. Field use was not a viable option.

  6. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  7. Video detection and analysis techniques of transient astronomical phenomena

    NASA Technical Reports Server (NTRS)

    Clifton, K. S.; Reese, R., Jr.; Davis, C. W.

    1979-01-01

    Low-light-level television systems have been utilized to gain information on meteors, aurorae, and other faint, transient astronomical phenomena. Such phenomena change not only their position as a function of time, but also their photometric and spectral characteristics in as little as 1/60 second, thus requiring unique methods of analysis. Data observed with television systems and recorded on video tape have been analyzed with a system utilizing both analog and digital techniques. Both off-the-shelf equipment and inhouse developments are used to isolate sequences of moving images and to store them in a form suitable for photometric and spectral reduction. Current emphasis of the analysis effort is directed at the measurement of the first-order emission lines of meteor spectra, the results of which will yield important compositional information concerning the nature of the impinging meteoroid.

  8. Sensitivity analysis techniques for models of human behavior.

    SciTech Connect

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.

  9. Validation of Design and Analysis Techniques of Tailored Composite Structures

    NASA Technical Reports Server (NTRS)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed-form analysis based on a theoretical model of a single-cell tailored box beam, and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed-form code is consistently able to predict the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed-form analysis. Differences between the closed-form code representation and the wing box specimen caused large errors in the twist prediction. The closed-form analysis prediction of twist has not been validated by this test.

  10. Analysis of signal processing techniques in pulsed thermography

    NASA Astrophysics Data System (ADS)

    Lopez, Fernando; Ibarra-Castanedo, Clemente; Maldague, Xavier; de Paulo Nicolau, Vicente

    2013-05-01

    Pulsed Thermography (PT) is one of the most widely used approaches for the inspection of composite materials, its main attraction being its deployment in the transient regime. However, due to the physical phenomena involved during the inspection, the signals acquired by the infrared camera are nearly always affected by external reflections and local emissivity variations. Furthermore, non-uniform heating at the surface and thermal losses at the edges of the material also constrain the detection capability. For this reason, the thermographic signals should be processed in order to improve, both qualitatively and quantitatively, the quality of the thermal images. Signal processing constitutes an important step in the chain of thermal image analysis, especially when defect characterization is required. Several of the signal processing techniques employed nowadays are based on the one-dimensional solution of Fourier's law of heat conduction. This investigation brings into discussion the three most widely used techniques based on the 1D Fourier's law: Thermographic Signal Reconstruction (TSR), Differential Absolute Contrast (DAC) and Pulsed Phase Thermography (PPT), applied to carbon fiber laminated composites. It is of special interest to determine the detection capabilities of each technique, allowing in this way more reliable results when performing an inspection by PT.
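Of the three techniques, PPT is the most compact to sketch: each pixel's cooling curve is Fourier-transformed and the phase of a low-frequency bin is imaged, which suppresses non-uniform heating and emissivity effects. The decay model and defect signature below are hypothetical illustrations, not the paper's data.

```python
import cmath
import math

def phase_at_bin(signal, k=1):
    """Phase of the k-th DFT bin of a pixel's cooling curve -- the
    quantity mapped into a 'phasegram' in pulsed phase thermography."""
    n = len(signal)
    fk = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
    return cmath.phase(fk)

# Synthetic cooling curves after a flash pulse (hypothetical 1/sqrt(t)
# surface-temperature decay; a subsurface defect traps heat, lifting the
# curve from some time onward).
dt = 0.05
sound = [1.0 / math.sqrt(dt * (t + 1)) for t in range(128)]
defect = [s + (0.15 if t > 40 else 0.0) for t, s in enumerate(sound)]

contrast = abs(phase_at_bin(defect) - phase_at_bin(sound))
print(contrast > 0.0)  # the phase image separates defective from sound pixels
```

In practice an FFT over the full image stack replaces this per-pixel DFT, and the bin `k` is chosen from the blind frequency of the expected defect depth.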

  11. Envelopment technique and topographic overlays in bite mark analysis

    PubMed Central

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    Aims and Objectives: The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Materials and Methods: Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. Results: Scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare the four sequential overlays, and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Conclusion: Through our study, we conclude that the third and fourth cuts were the best among the four cuts, and that inter- and intraoperator reliability were statistically significant at the 5% level, that is, at the 95% confidence interval (P < 0.05). PMID:26816458
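Spearman's rank correlation, used here to gauge operator reliability, reduces to a short formula when scores contain no ties. A sketch with hypothetical overlay scores (the study's actual score data are not reproduced):

```python
def spearman_rho(x, y):
    """Spearman's rank correlation via the no-ties formula
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    n = len(x)
    rank = lambda v: {val: i + 1 for i, val in enumerate(sorted(v))}
    rx, ry = rank(x), rank(y)
    d2 = sum((rx[a] - ry[b]) ** 2 for a, b in zip(x, y))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# Hypothetical matching-accuracy scores given by one operator to six
# overlays on two occasions (intraoperator repeatability):
first_pass = [4, 7, 9, 5, 8, 6]
second_pass = [5, 6, 9, 4, 8, 7]
print(round(spearman_rho(first_pass, second_pass), 2))  # 0.89: strong agreement
```

With tied scores, which ordinal matching scales often produce, the tie-corrected form (or a library implementation) should be used instead.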

  12. Underreporting of maritime accidents to vessel accident databases.

    PubMed

    Hassel, Martin; Asbjørnslett, Bjørn Egil; Hole, Lars Petter

    2011-11-01

    Underreporting of maritime accidents is a problem not only for authorities trying to improve maritime safety through legislation, but also for risk management companies and other entities using maritime casualty statistics in risk and accident analysis. This study collected and compared casualty data from 01.01.2005 to 31.12.2009 from IHS Fairplay and the maritime authorities of a set of nations. The data were compared to find common records, and the true number of occurred accidents was estimated using conditional probability given positive dependency between data sources, several variations of the capture-recapture method, calculation of a best-case scenario assuming perfect reporting, and scaling up a subset of casualty information from a marine insurance statistics database. The estimated upper-limit reporting performance for the selected flag states ranged from 14% to 74%, while the corresponding estimated coverage of IHS Fairplay ranged from 4% to 62%. On average, the study results document that unreported accidents make up roughly 50% of all occurred accidents. Even in a best-case scenario, only a few flag states come close to perfect reporting (94%). The considerable scope of underreporting uncovered in the study indicates that users of statistical vessel accident data should assume a certain degree of underreporting and adjust their analyses accordingly. Whether to use correction factors, a safety margin, or expert judgment should be decided on a case-by-case basis. PMID:21819835
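The simplest of the capture-recapture variants mentioned is the two-source Lincoln-Petersen estimator. A sketch with hypothetical counts; note that the positive dependency between sources the paper accounts for inflates the overlap, so this independence-based estimate is a lower bound on the true total:

```python
def lincoln_petersen(n1, n2, m):
    """Two-source capture-recapture (Lincoln-Petersen) estimate of the
    true number of events, assuming the sources report independently.
    n1, n2: records in each database; m: records matched in both."""
    if m == 0:
        raise ValueError("no overlapping records; estimator undefined")
    return n1 * n2 / m

# Hypothetical counts: 300 accidents in a flag-state registry, 260 in
# IHS Fairplay, 130 matched in both data sets.
n_true = lincoln_petersen(300, 260, 130)
coverage = 300 / n_true
print(int(n_true), round(coverage, 2))  # 600 0.5 -> roughly half go unreported
```

The Chapman correction, `(n1 + 1) * (n2 + 1) / (m + 1) - 1`, reduces small-sample bias and is a common drop-in replacement.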

  13. Analysis of filter tuning techniques for sequential orbit determination

    NASA Technical Reports Server (NTRS)

    Lee, T.; Yee, C.; Oza, D.

    1995-01-01

    This paper examines filter tuning techniques for a sequential orbit determination (OD) covariance analysis. Recently, there has been a renewed interest in sequential OD, primarily due to the successful flight qualification of the Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) using Doppler data extracted onboard the Extreme Ultraviolet Explorer (EUVE) spacecraft. TONS computes highly accurate orbit solutions onboard the spacecraft in real time using a sequential filter. As a result of the successful TONS-EUVE flight qualification experiment, the Earth Observing System (EOS) AM-1 Project has selected TONS as the prime navigation system. In addition, sequential OD methods can be used successfully for ground OD. Whether data are processed onboard or on the ground, a sequential OD procedure is generally favored over a batch technique when a real-time automated OD system is desired. Recently, OD covariance analyses were performed for the TONS-EUVE and TONS-EOS missions using the sequential processing options of the Orbit Determination Error Analysis System (ODEAS). ODEAS is the primary covariance analysis system used by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). The results of these analyses revealed a high sensitivity of the OD solutions to the state process noise filter tuning parameters. The covariance analysis results show that the state estimate error contributions from measurement-related error sources, especially those due to the random noise and satellite-to-satellite ionospheric refraction correction errors, increase rapidly as the state process noise increases. These results prompted an in-depth investigation of the role of the filter tuning parameters in sequential OD covariance analysis. This paper analyzes how the spacecraft state estimate errors due to dynamic and measurement-related error sources are affected by the process noise level used. This information is then used to establish
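The reported sensitivity to the process noise tuning parameter can be illustrated on the simplest sequential filter: a scalar Kalman filter on a random-walk state, whose steady-state error variance grows with the process-noise setting. This is a generic sketch of the mechanism, unrelated to ODEAS internals:

```python
def steady_state_variance(q, r, n_iter=500):
    """Iterate the scalar Kalman (Riccati) recursion for a random-walk
    state with process-noise variance q and measurement-noise variance r,
    returning the converged posterior error variance."""
    p = r  # arbitrary positive starting covariance
    for _ in range(n_iter):
        p_pred = p + q                # predict: process noise inflates P
        k = p_pred / (p_pred + r)     # Kalman gain
        p = (1.0 - k) * p_pred        # measurement update
    return p

r = 1.0
for q in (0.01, 0.1, 1.0):
    print(round(steady_state_variance(q, r), 3))  # variance grows with q
```

Raising `q` makes the filter weight measurements more heavily, so measurement-related error sources (random noise, refraction-correction errors) propagate more strongly into the state estimate, which is the trend the covariance analyses observed.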

  14. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    PubMed

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded speckled intensity distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their cores' structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture from the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography over an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable. PMID:23089799

  15. The Fourier analysis technique and epsilon-pseudo-eigenvalues

    SciTech Connect

    Donato, J.M.

    1993-07-01

    The spectral radii of iteration matrices and the spectra and condition numbers of preconditioned systems are important in forecasting the convergence rates of iterative methods. Unfortunately, the spectra of iteration matrices or preconditioned systems are rarely easily available. The Fourier analysis technique has been shown to be a useful tool in studying the effectiveness of iterative methods by determining approximate expressions for the eigenvalues or condition numbers of matrix systems. For non-symmetric matrices the eigenvalues may be highly sensitive to perturbations, and the spectral radii of nonsymmetric iteration matrices may not give a numerically realistic indication of the convergence of the iterative method. Trefethen and others have presented a theory on the use of ε-pseudo-eigenvalues in the study of matrix equations. For Toeplitz matrices, we show that the theory of ε-pseudo-eigenvalues includes the Fourier analysis technique as a limiting case. For non-Toeplitz matrices, the relationship is not clear. We shall examine this relationship for non-Toeplitz matrices that arise when studying preconditioned systems for methods applied to a two-dimensional discretized elliptic differential equation.
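For the symmetric model problem the Toeplitz/Fourier connection is concrete: the eigenvalues of the tridiagonal Toeplitz matrix tridiag(-1, 2, -1) are exact samples of its Fourier symbol f(θ) = 2 - 2cos(θ). A standard-example sketch (not drawn from the paper's nonsymmetric cases):

```python
import math

def symbol(theta):
    """Fourier symbol of the 1-D Laplacian stencil (-1, 2, -1)."""
    return 2.0 - 2.0 * math.cos(theta)

def apply_stencil(v):
    """Multiply a vector by the tridiagonal Toeplitz matrix tridiag(-1, 2, -1)."""
    n = len(v)
    return [-(v[j - 1] if j > 0 else 0.0) + 2.0 * v[j]
            - (v[j + 1] if j < n - 1 else 0.0) for j in range(n)]

# The eigenvectors are discrete sine modes, and each eigenvalue is the
# symbol sampled at theta_k = k*pi/(n+1):
n, k = 8, 3
theta = k * math.pi / (n + 1)
v = [math.sin(j * theta) for j in range(1, n + 1)]
av = apply_stencil(v)
lam = symbol(theta)
print(all(abs(av[j] - lam * v[j]) < 1e-9 for j in range(n)))
```

The symbol's extremes over (0, π) then give the condition number estimate that the Fourier analysis technique supplies without forming the matrix; the paper's point is that for nonsymmetric operators this eigenvalue picture must be replaced by ε-pseudospectra.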

  16. Homogenization techniques for the analysis of porous SMA

    NASA Astrophysics Data System (ADS)

    Sepe, V.; Auricchio, F.; Marfia, S.; Sacco, E.

    2016-05-01

    In this paper the mechanical response of porous Shape Memory Alloy (SMA) is modeled. The porous SMA is considered as a composite medium made of a dense SMA matrix with voids treated as inclusions. The overall response of this very special composite is deduced by performing a micromechanical and homogenization analysis. In particular, the incremental Mori-Tanaka averaging scheme is provided; then, the Transformation Field Analysis procedure, in its uniform and nonuniform approaches (UTFA and NUTFA, respectively), is presented. In particular, the extension of the NUTFA technique proposed by Sepe et al. (Int J Solids Struct 50:725-742, 2013) is presented to investigate the response of porous SMA characterized by closed and open porosity. A detailed comparison between the outcomes provided by the Mori-Tanaka, UTFA and proposed NUTFA procedures for porous SMA is presented through numerical examples for two- and three-dimensional problems. In particular, several values of porosity and different loading conditions, inducing the pseudoelastic effect in the SMA matrix, are investigated. The predictions assessed by the Mori-Tanaka, UTFA and NUTFA techniques are compared with the results obtained by nonlinear finite element analyses. A comparison with experimental data available in the literature is also presented.
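For an elastic matrix with spherical voids, the Mori-Tanaka scheme admits a closed form for the effective bulk modulus; the paper's incremental scheme for SMA inelasticity is more involved, so this is only the elastic-step sketch, with hypothetical matrix moduli:

```python
def mori_tanaka_bulk(km, gm, phi):
    """Mori-Tanaka effective bulk modulus of an elastic matrix (bulk km,
    shear gm) containing a volume fraction phi of spherical voids:
    K_eff = km * (1 - phi) / (1 + 3*phi*km / (4*gm))."""
    return km * (1.0 - phi) / (1.0 + 3.0 * phi * km / (4.0 * gm))

km, gm = 130.0, 50.0  # hypothetical dense-matrix moduli in GPa
for phi in (0.0, 0.1, 0.3):
    print(round(mori_tanaka_bulk(km, gm, phi), 1))  # stiffness drops with porosity
```

The formula follows from the Eshelby dilute solution for a void (zero-stiffness inclusion) averaged over the matrix strain, which is why it reduces to `km` at zero porosity.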

  17. Morphometric techniques for orientation analysis of karst in northern Florida

    SciTech Connect

    Jenkins, D.T.; Beck, B.F.

    1985-01-01

    Morphometric techniques for the analysis of karst landscape orientation data based on swallet catchment areas can be highly inadequate. The long axes of catchment areas may not coincide with structural control, especially in regions having very low relief. Better structural correlation was observed using multiple linear trend measurements of closed depressions rather than drainage basins. Trend analysis was performed on four areas, approximately 25 km² each, forming a sequence from the Suwannee River to the Cody Escarpment in northern Florida. This area is a karst plain, mantled by 12 to 25 meters of unconsolidated sands and clays. Structural control was examined by tabulating the azimuths of distinct linear trends as determined from depression shape based on 1:24,000 topographic maps. The topography was characterized by 1872 individual swallet catchment areas or 1457 closed depressions. The common geomorphic technique of analyzing orientation data in 10° increments beginning at 0° may yield incorrect peak width and placement. To correctly detect all significant orientation peaks, all possible combinations of peak width and placement must be tested. Fifty-five different plots were reviewed and tested for each area.
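The point about bin placement can be demonstrated directly: a cluster of azimuths straddling a bin boundary is split by one placement and captured whole by another, so only scanning all placements finds the true peak. A sketch with hypothetical trend azimuths:

```python
def peak_bin(azimuths, width=10, offset=0):
    """Histogram azimuths (degrees, 0-180) into `width`-degree bins whose
    boundaries start at `offset`; return (count, start) of the fullest bin."""
    counts = {}
    for a in azimuths:
        start = (int((a - offset) // width) * width + offset) % 180
        counts[start] = counts.get(start, 0) + 1
    best = max(counts, key=counts.get)
    return counts[best], best

# Hypothetical trend azimuths with a cluster straddling 50 degrees:
data = [46, 47, 48, 49, 51, 52, 53, 90, 120, 150]

print(peak_bin(data, offset=0))  # (4, 40): fixed 0-based bins split the peak
best = max((peak_bin(data, offset=o) for o in range(10)), key=lambda t: t[0])
print(best)                      # (7, 44): scanning every placement recovers it
```

Varying `width` as well as `offset` reproduces the full search over peak width and placement that the abstract calls for.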

  18. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    SciTech Connect

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  19. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    Increased competitiveness, technological change, and rising requirements for quality and service have forced a change in the design and application of maintenance, as well as in the way it is considered within managerial strategy. A service company must carry out numerous maintenance activities, and as a result the maintenance function as a whole has to be outsourced. Nevertheless, delegating this work to specialized personnel does not exempt the company from responsibility; rather, it creates the need to control each maintenance activity. To achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology is based on expert systems. By means of rules, the expert system uses the weighting technique SMART and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between the variables associated with the specific maintenance functions to obtain the maintenance state by section and the general maintenance state of the enterprise. The contributions of this paper are the development of a maintenance audit in a service enterprise, where maintenance is not generally considered a strategic subject, and the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical of new-product design, into the area of rule-based expert systems.
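
    The SMART weighting step described above can be sketched in a few lines. The maintenance decision functions, swing weights, and section ratings below are hypothetical illustrations, not values from the case study:

```python
def smart_scores(swing_weights, ratings):
    """Normalize swing weights and compute a weighted value score per alternative."""
    total = sum(swing_weights.values())
    weights = {k: w / total for k, w in swing_weights.items()}
    return {alt: sum(weights[f] * r[f] for f in weights)
            for alt, r in ratings.items()}

# Swing weights for three maintenance decision functions (100 = most important).
swing = {"preventive_planning": 100, "spare_parts": 60, "documentation": 40}
# Ratings of two audited sections on a 0-10 value scale.
ratings = {
    "section_A": {"preventive_planning": 8, "spare_parts": 5, "documentation": 6},
    "section_B": {"preventive_planning": 4, "spare_parts": 9, "documentation": 7},
}
scores = smart_scores(swing, ratings)
```

    Because the swing weights are normalized to sum to one, each section's score is a convex combination of its ratings, which is what lets the audit compare sections on a common scale.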

  20. Thermographic techniques applied to solar collector systems analysis

    SciTech Connect

    Eden, A.

    1980-02-01

    The use of thermography to analyze large solar collector array systems under dynamic operating conditions is discussed. The research at the Solar Energy Research Institute (SERI) in this area has focused on thermographic techniques and equipment to determine temperature distributions, flow patterns, and air blockages in solar collectors. The results of this extensive study, covering many sites and types of collectors, illustrate the capabilities of infrared (IR) analysis as a qualitative analysis tool and operation and maintenance procedure when applied to large arrays. Thermographic analysis of most collector systems qualitatively showed relative temperature distributions that indicated balanced flow patterns. In three significant cases, blocked or broken collector arrays, which previously had gone undetected, were discovered. Using this analysis, validation studies of large computer codes could examine collector arrays for flow patterns or blockages that could cause disagreement between actual and predicted performance. Initial operation and balancing of large systems could be accomplished without complicated sensor systems not needed for normal operations. Maintenance personnel could quickly check their systems without climbing onto the roof and without complicated sensor systems.

  1. Evaluation of energy system analysis techniques for identifying underground facilities

    SciTech Connect

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C.

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  2. Analysis of Road Traffic Accident Rate in the Slovak Republic and Possibilities of Its Reduction through Telematic Applications

    NASA Astrophysics Data System (ADS)

    Kalašová, Alica; Krchová, Zuzana

    Worldwide, the number of vehicles newly introduced each year continues to grow. On the road there are great numbers of drivers with low discipline, aggressive driving habits, and little regard for the basic principles of responsible traffic behavior. As a consequence, the number of traffic accidents is rising, and so are their repercussions.

  3. High-level power analysis and optimization techniques

    NASA Astrophysics Data System (ADS)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching
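
    The architecture-level estimation idea above rests on the standard dynamic-power relation P = alpha * C * Vdd^2 * f. The sketch below uses invented capacitances and switching activities purely to illustrate the effect the thesis studies: sharing one functional unit between operations with uncorrelated operands can raise the switching activity alpha seen at its inputs, so sharing does not automatically save power:

```python
def dynamic_power(resources, vdd, freq):
    """Sum alpha * C * Vdd^2 * f over (switching_activity, capacitance_F) pairs."""
    return sum(alpha * cap * vdd ** 2 * freq for alpha, cap in resources)

# Two dedicated adders with well-correlated, low-activity inputs versus one
# shared adder whose multiplexed inputs toggle more (all numbers invented):
unshared = dynamic_power([(0.10, 2e-12), (0.10, 2e-12)], vdd=3.3, freq=100e6)
shared = dynamic_power([(0.25, 2e-12)], vdd=3.3, freq=100e6)
```

    With these illustrative numbers the shared adder dissipates more than the two dedicated ones combined, which is why power-aware resource-sharing algorithms must estimate activity, not just count units.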

  4. Digital Instrumentation and Control Failure Events Derivation and Analysis by Frame-Based Technique

    SciTech Connect

    Hui-Wen Huang; Chunkuan Shih; Swu Yih; Yen-Chang Tzeng; Ming-Huei Chen

    2006-07-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure event derivation and analysis for a generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to support the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, the chapter 15 accident analysis of the generic SAR, and the reported NPP I and C software failure events. The case study of this research includes (1) the software CMF analysis for the major digital control systems; and (2) derivation of postulated ABWR digital I and C software failure events from actual non-ABWR digital I and C software failure events reported to the LER system of the USNRC or the IRS of the IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally.

  5. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    PubMed

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator," developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program for nanoparticle analysis, and at the same time compares it to more conventional nanoparticle analysis techniques. The techniques we concentrate on here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information about the particles than the manual technique. However, particle shapes very different from spherical proved problematic for the novel program as well. Compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the averaged data it provides from a very large number of particles. However, SAXS does not provide any data about the shape or appearance of the sample. PMID:27030469

  6. A real-time, dynamic early-warning model based on uncertainty analysis and risk assessment for sudden water pollution accidents.

    PubMed

    Hou, Dibo; Ge, Xiaofan; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2014-01-01

    A real-time, dynamic, early-warning model (EP-risk model) is proposed to cope with sudden water quality pollution accidents affecting downstream areas with raw-water intakes (denoted as EPs). The EP-risk model outputs the risk level of water pollution at the EP by calculating the likelihood of pollution and evaluating the impact of pollution. A generalized form of the EP-risk model for river pollution accidents based on Monte Carlo simulation, the analytic hierarchy process (AHP) method, and the risk matrix method is proposed. The likelihood of water pollution at the EP is calculated by the Monte Carlo method, which is used for uncertainty analysis of pollutants' transport in rivers. The impact of water pollution at the EP is evaluated by expert knowledge and the results of Monte Carlo simulation based on the analytic hierarchy process. The final risk level of water pollution at the EP is determined by the risk matrix method. The proposed method is illustrated with a case study of a phenol spill accident in China. PMID:24781332
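
    The likelihood and risk-matrix steps can be sketched as follows, assuming an instantaneous spill, a one-dimensional advection-dispersion transport model, and invented river and spill parameters; the paper's actual model, AHP impact weighting, and matrix thresholds are not reproduced here:

```python
import math, random

def concentration(mass, area, disp, u, x, t):
    """Instantaneous-release solution of the 1-D advection-dispersion equation."""
    return (mass / (area * math.sqrt(4 * math.pi * disp * t))
            * math.exp(-(x - u * t) ** 2 / (4 * disp * t)))

def exceedance_probability(n=20000, seed=1):
    """P(peak concentration at the intake exceeds the limit), by Monte Carlo."""
    random.seed(seed)
    x, mass, area, disp, limit = 5000.0, 500.0, 200.0, 30.0, 0.0013
    hits = 0
    for _ in range(n):
        u = random.gauss(0.5, 0.1)        # uncertain mean flow velocity, m/s
        if u <= 0.05:
            continue                      # plume effectively never arrives
        t_peak = x / u                    # arrival time of the concentration peak
        if concentration(mass, area, disp, u, x, t_peak) > limit:
            hits += 1
    return hits / n

def risk_level(likelihood, impact):
    """Risk-matrix lookup: likelihood binned to 3 levels, impact in {0, 1, 2}."""
    p_level = 0 if likelihood < 0.1 else (1 if likelihood < 0.5 else 2)
    matrix = [["low", "medium", "high"],
              ["medium", "high", "severe"],
              ["high", "severe", "severe"]]
    return matrix[p_level][impact]
```

    The Monte Carlo loop carries the velocity uncertainty through the transport solution, and the matrix then combines the resulting likelihood with an impact level supplied separately (in the paper, from expert knowledge and AHP).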

  7. Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis

    SciTech Connect

    Mueller, C.; Roglans-Ribas, J.; Folga, S.; Huttenga, A.; Jackson, R.; TenBrook, W.; Russell, J. |

    1994-02-01

    A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the DOE complex, (2) development and frequency estimation of the risk-dominant sequences of accidents, and (3) determination of the evolution of and final compositions of radiological or chemically hazardous source terms predicted to be released as a function of the storage inventory or treatment process throughput. The computational framework automates these elements to provide source term input for the second part of the analysis which includes (1) development or integration of existing site-specific demographics and meteorological data and calculation of attendant unit-risk factors and (2) assessment of the radiological or toxicological consequences of accident releases to the general public and to the occupational work force.

  8. Analysis of 129I in the soils of Fukushima Prefecture: preliminary reconstruction of 131I deposition related to the accident at Fukushima Daiichi Nuclear Power Plant (FDNPP).

    PubMed

    Muramatsu, Yasuyuki; Matsuzaki, Hiroyuki; Toyama, Chiaki; Ohno, Takeshi

    2015-01-01

    Iodine-131 is one of the most critical radionuclides to be monitored after release from reactor accidents due to the tendency for this nuclide to accumulate in the human thyroid gland. However, there are not enough data related to the reactor accident in Fukushima, Japan to provide regional information on the deposition of this short-lived nuclide (half-life = 8.02 d). In this study we have focused on the long-lived iodine isotope, 129I (half-life of 1.57 × 10^7 y), and analyzed it by accelerator mass spectrometry (AMS) for surface soil samples collected at various locations in Fukushima Prefecture. In order to obtain information on the 131I/129I ratio released from the accident, we have determined 129I concentrations in 82 soil samples in which 131I concentrations were previously determined. There was a strong correlation (R^2 = 0.84) between the two nuclides, suggesting that the 131I levels in soil samples following the accident can be estimated through the analysis of 129I. We have also examined the possible influence from 129mTe on 129I, and found no significant effect. In order to construct a deposition map of 131I, we determined the 129I concentrations (Bq/kg) in 388 soil samples collected from different locations in Fukushima Prefecture and the deposition densities (Bq/m^2) of 131I were reconstructed from the results. PMID:24930438
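
    The reconstruction step reduces to fitting a 131I/129I activity ratio from the paired samples and applying it where only 129I was measured. The sketch below uses synthetic numbers (an assumed ratio of 40 and 10% multiplicative noise), not the paper's measurements:

```python
import random

def fit_slope(x, y):
    """Least-squares slope through the origin (both activities vanish together)."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

random.seed(0)
i129 = [random.uniform(0.1, 5.0) for _ in range(82)]             # synthetic 129I levels
true_ratio = 40.0                                                # assumed 131I/129I ratio
i131 = [true_ratio * v * random.gauss(1.0, 0.1) for v in i129]   # noisy paired 131I data

ratio = fit_slope(i129, i131)                 # recovered 131I/129I activity ratio
reconstructed = [ratio * v for v in i129]     # 131I estimates where only 129I is measured
```

    A zero-intercept fit is used here because both nuclides originate from the same release; the paper's R^2 = 0.84 correlation is what justifies this kind of proportional reconstruction.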

  9. Damage identification techniques via modal curvature analysis: Overview and comparison

    NASA Astrophysics Data System (ADS)

    Dessi, Daniele; Camerlengo, Gabriele

    2015-02-01

    This paper aims to compare several damage identification methods based on the analysis of modal curvature and related quantities (natural frequencies and modal strain energy) by evaluating their performances on the same test case, a damaged Euler-Bernoulli beam. Damage is modelled as a localized and uniform reduction of stiffness so that closed-form expressions of the mode-shape curvatures can be analytically computed and data accuracy, which affects final results, can be controlled. The selected techniques belong to two categories: one includes several methods that need reference data for detecting structural modifications due to damage; the second group, including the modified Laplacian operator and the fractal dimension, avoids the knowledge of the undamaged behavior for issuing a damage diagnosis. To better explain the different performances of the methods, the mathematical formulation has been revised in some cases so as to fit into a common framework where the underlying hypotheses are clearly stated. Because the various damage indexes are calculated on 'exact' data, a sensitivity analysis has been carried out with respect to the number of points where curvature information is available, to the position of damage between adjacent points, and to the modes involved in the index computation. In this way, this analysis comparatively points out each method's capability of locating and estimating damage, along with some critical issues that arise even with noiseless data.
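
    A reference-based curvature index of the kind compared in the paper can be sketched with central differences on synthetic mode shapes. The localized Gaussian perturbation below stands in for the mode-shape change caused by a stiffness reduction; it is an illustrative assumption, not the paper's closed-form beam solution:

```python
import math

def curvature(phi, h):
    """Second derivative of a sampled mode shape by central differences."""
    return [(phi[i - 1] - 2 * phi[i] + phi[i + 1]) / h ** 2
            for i in range(1, len(phi) - 1)]

n, L = 51, 1.0
h = L / (n - 1)
xs = [i * h for i in range(n)]
phi_u = [math.sin(math.pi * x / L) for x in xs]      # first mode, pinned-pinned beam

def bump(x):                                         # small local shape change near x = 0.3
    return 0.002 * math.exp(-((x - 0.3) / 0.02) ** 2)

phi_d = [p + bump(x) for p, x in zip(phi_u, xs)]     # 'damaged' mode shape

# Damage index: absolute curvature difference, evaluated at interior grid points.
index = [abs(kd - ku) for kd, ku in zip(curvature(phi_d, h), curvature(phi_u, h))]
damage_location = xs[1 + index.index(max(index))]
```

    Because differentiation amplifies short-wavelength content, the curvature difference peaks sharply at the perturbation even though the displacement change is tiny; this same amplification is why such indexes are sensitive to measurement noise and grid spacing.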

  10. Methods and Techniques for miRNA Data Analysis.

    PubMed

    Cristiano, Francesca; Veltri, Pierangelo

    2016-01-01

    Genomic data analysis consists of techniques to analyze and extract information from genes. In particular, genome sequencing technologies make it possible to characterize genomic profiles and to identify biomarkers and mutations relevant to diagnosis and to the design of clinical therapies. Studies often concern the identification of genes related to inherited disorders, but recently mutations and phenotypes have also been considered in disease studies and drug design, as well as in biomarker identification for early detection. Gene mutations are studied by comparing fold changes across redundant numeric and string representations of the analyzed genes, starting from the macromolecules. This often involves studying thousands of repetitions of gene representations and signatures identified by the available biological instruments, which, starting from biological samples, generate arrays of data representing nucleotide sequences of known genes in an often poorly characterized order. High-performance platforms and optimized algorithms are required to manipulate the gigabytes of raw data generated by these instruments, such as NGS (Next-Generation Sequencing) and microarray platforms. Data analysis also requires several tools and databases that store gene targets, gene ontologies, and gene-disease associations. In this chapter we present an overview of available software platforms for genomic data analysis, as well as available databases with their query engines. PMID:26069024

  11. Frequency Analysis Techniques for Identification of Viral Genetic Data

    PubMed Central

    Trifonov, Vladimir; Rabadan, Raul

    2010-01-01

    Environmental metagenomic samples and samples obtained as an attempt to identify a pathogen associated with the emergence of a novel infectious disease are important sources of novel microorganisms. The low costs and high throughput of sequencing technologies are expected to allow for the genetic material in those samples to be sequenced and the genomes of the novel microorganisms to be identified by alignment to those in a database of known genomes. Yet, for various biological and technical reasons, such alignment might not always be possible. We investigate a frequency analysis technique which on one hand allows for the identification of genetic material without relying on alignment and on the other hand makes possible the discovery of nonoverlapping contigs from the same organism. The technique is based on obtaining signatures of the genetic data and defining a distance/similarity measure between signatures. More precisely, the signatures of the genetic data are the frequencies of k-mers occurring in them, with k being a natural number. We considered an entropy-based distance between signatures, similar to the Kullback-Leibler distance in information theory, and investigated its ability to categorize negative-sense single-stranded RNA (ssRNA) viral genetic data. Our conclusion is that in this viral context, the technique provides a viable way of discovering genetic relationships without relying on alignment. We envision that our approach will be applicable to other microbial genetic contexts, e.g., other types of viruses, and will be an important tool in the discovery of novel microorganisms. PMID:20824103
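
    The signature-and-distance idea described above can be sketched directly: k-mer frequency signatures compared with a symmetrized KL-style divergence. The toy sequences are invented, and the smoothing constant `eps` is one simple way (an assumption, not necessarily the paper's) to handle k-mers absent from one signature:

```python
import math
from collections import Counter

def signature(seq, k=3):
    """k-mer frequency signature of a sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: c / total for kmer, c in counts.items()}

def kl_distance(p, q, eps=1e-6):
    """Symmetrized Kullback-Leibler divergence over the union of observed k-mers."""
    keys = set(p) | set(q)
    def kl(a, b):
        return sum(a.get(x, eps) * math.log(a.get(x, eps) / b.get(x, eps))
                   for x in keys)
    return kl(p, q) + kl(q, p)

a = signature("ATGCGATTACAGGCTTAACGATGCGATTACA")
b = signature("ATGCGATTACAGGCTTAACGATGCGATTACC")  # one-base variant of the same sequence
c = signature("GGGGCCCCGGGGCCCCGGGGCCCCGGGGCCC")  # compositionally unrelated sequence
```

    No alignment is ever computed: two nonoverlapping contigs from the same genome can still score as close because they share the genome's characteristic k-mer usage.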

  12. Novel technique for coal pyrolysis and hydrogenation product analysis

    SciTech Connect

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general above that size uncertainties in rate-constant and thermodynamic data do not allow products of mixed hydrocarbon pyrolyses to be predicted a priori using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions, with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  13. Reliability analysis of laminated CMC components through shell subelement techniques

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Gyekenyesi, John P.

    1992-01-01

    An updated version of the integrated design program Composite Ceramics Analysis and Reliability Evaluation of Structures (C/CARES) was developed for the reliability evaluation of ceramic matrix composites (CMC) laminated shell components. The algorithm is now split into two modules: a finite-element data interface program and a reliability evaluation algorithm. More flexibility is achieved, allowing for easy implementation with various finite-element programs. The interface program creates a neutral data base which is then read by the reliability module. This neutral data base concept allows easy data transfer between different computer systems. The new interface program from the finite-element code Matrix Automated Reduction and Coupling (MARC) also includes the option of using hybrid laminates (a combination of plies of different materials or different layups) and allows for variations in temperature fields throughout the component. In the current version of C/CARES, a subelement technique was implemented, enabling stress gradients within an element to be taken into account. The noninteractive reliability function is now evaluated at each Gaussian integration point instead of using averaging techniques. As a result of the increased number of stress evaluation points, considerable improvements in the accuracy of reliability analyses were realized.
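
    The effect of evaluating the reliability function at individual stress points rather than from an element average can be sketched with a two-parameter Weibull weakest-link model. All numbers below are assumed for illustration and are not C/CARES parameters:

```python
import math

def reliability(stresses, volumes, sigma0, m):
    """R = exp(-sum((sigma / sigma0)^m * dV)), tensile stresses only (Weibull)."""
    risk = sum((s / sigma0) ** m * dv for s, dv in zip(stresses, volumes) if s > 0)
    return math.exp(-risk)

sigma0, m = 300.0, 10.0    # characteristic strength (MPa) and Weibull modulus, assumed

# One element under a steep stress gradient: a single element-average stress
# versus four interior stress evaluation points that resolve the local peak.
avg = reliability([220.0], [1.0], sigma0, m)
gauss4 = reliability([150.0, 200.0, 250.0, 280.0], [0.25] * 4, sigma0, m)
```

    Because the stress enters with a high exponent m, the local peak dominates the integrated risk, so resolving the gradient at several evaluation points yields a lower (and more conservative) predicted reliability than the element average, which is the motivation for the subelement treatment.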

  14. Transit Spectroscopy: new data analysis techniques and interpretation

    NASA Astrophysics Data System (ADS)

    Tinetti, Giovanna; Waldmann, Ingo P.; Morello, Giuseppe; Tessenyi, Marcell; Varley, Ryan; Barton, Emma; Yurchenko, Sergey; Tennyson, Jonathan; Hollis, Morgan

    2014-11-01

    Planetary science beyond the boundaries of our Solar System is today in its infancy. Until a couple of decades ago, the detailed investigation of planetary properties was restricted to objects orbiting inside the Kuiper Belt. Today, we cannot ignore that the number of known planets has increased by two orders of magnitude, nor that many of these planets bear little resemblance to the objects present in our own Solar System. A key observable for planets is the chemical composition and state of their atmosphere. To date, two methods can be used to sound exoplanetary atmospheres: transit and eclipse spectroscopy, and direct imaging spectroscopy. Although the field of exoplanet spectroscopy has been very successful in past years, a few serious hurdles need to be overcome to progress in this area: in particular, instrument systematics are often difficult to disentangle from the signal, and data are sparse and often not recorded simultaneously, causing degeneracy of interpretation. We present here new data analysis and interpretation techniques developed by the “ExoLights” team at UCL to address the above-mentioned issues. Said techniques include statistical tools, non-parametric machine-learning algorithms, optimized radiative transfer models, and spectroscopic line-lists. These new tools have been successfully applied to existing data recorded with space and ground instruments, shedding new light on our knowledge and understanding of these alien worlds.

  15. An objective isobaric/isentropic technique for upper air analysis

    NASA Technical Reports Server (NTRS)

    Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.

    1981-01-01

    An objective meteorological analysis technique is presented whereby both horizontal and vertical upper air analyses are performed. The process used to interpolate grid-point values from the upper-air station data is the same for grid points on an isobaric surface as on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure-height values are interpolated from data that lie on the isentropic surface passing through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The objective products of version A generally show fair correspondence with the subjective analyses and with the station data, and depict the structure of the upper fronts, tropopauses, and jet streams reasonably well. The version B objective products correspond more closely to the subjective analyses, and show the same strong gradients across the upper front with only minor smoothing.
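
    One simple form an anisotropic weighting scheme can take is inverse-distance weighting with the cross-axis separation stretched, so stations along a preferred direction influence the grid point more than equally distant stations across it. The weight form, stretch factor, and station values below are illustrative assumptions, not the paper's scheme:

```python
import math

def anisotropic_idw(grid_pt, stations, stretch=3.0, power=2.0):
    """Inverse-distance weighting with cross-axis separation stretched by `stretch`."""
    gx, gy = grid_pt
    num = den = 0.0
    for (x, y), value in stations:
        d = math.hypot(x - gx, stretch * (y - gy)) + 1e-9   # avoid division by zero
        w = d ** -power
        num += w * value
        den += w
    return num / den

stations = [((1.0, 0.0), 10.0),   # neighbor along the preferred axis
            ((0.0, 1.0), 20.0)]   # neighbor across it, same true separation
value = anisotropic_idw((0.0, 0.0), stations)
```

    Although both stations are one unit away, the interpolated value lands much nearer the along-axis station's value, which is the qualitative behavior an anisotropic scheme is designed to produce near fronts and jets.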

  16. One-Dimensional Analysis Techniques for Pulsed Blowing Distribution

    NASA Astrophysics Data System (ADS)

    Chambers, Frank

    2005-11-01

    Pulsed blowing offers reductions in bleed air requirements for aircraft flow control. Efficient pulsed blowing systems require careful design to minimize bleed air use while distributing blowing to multiple locations. Pulsed blowing systems start with a steady flow supply and process it to generate a pulsatile flow. The fluid-acoustic dynamics of the system play an important role in overall effectiveness. One-dimensional analysis techniques that in the past have been applied to ventilation systems and internal combustion engines have been adapted to pulsed blowing. Pressure wave superposition and reflection are used with the governing equations of continuity, momentum and energy to determine particle velocities and pressures through the flow field. Simulations have been performed to find changes in the amplitude and wave shape as pulses are transmitted through a simple pulsed blowing system. A general-purpose code is being developed to simulate wave transmission and allow the determination of blowing system dynamic parameters.

  17. Application of Wavelet Unfolding Technique in Neutron Spectroscopic Analysis

    NASA Astrophysics Data System (ADS)

    Hartman, Jessica; Barzilov, Alexander

    Nonproliferation of nuclear materials is important in the nuclear power industry and fuel cycle facilities. It requires technologies capable of measuring and assessing the radiation signatures of fission events. Neutrons produced in spontaneous or induced fission reactions are mainly fast. The neutron energy information allows characterization of nuclear materials and neutron sources. It can also be applied in remote sensing and source search tasks. The plastic scintillator EJ-299-33A was studied as a fast neutron detector. The detector response to a polyenergetic flux was unfolded using the multiple linear regression method. It yields the intensities of the neutron flux at particular energies, enabling spectroscopic analysis. The wavelet technique was evaluated for the unfolding of the neutron spectrum using the scintillator's response functions between 1 MeV and 14 MeV computed with the MCNPX code. This paper presents the computational results of the wavelet-based spectrum unfolding applied to a scintillator detector with neutron/photon pulse shape discrimination properties.
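
    The regression-unfolding step amounts to solving y = R x in the least-squares sense, where the columns of R are the detector's per-energy response functions and x the flux intensities. The toy 4-channel, 2-energy response matrix below stands in for MCNPX-computed responses and uses only the normal equations:

```python
def solve2(a, b):
    """Solve a 2x2 linear system a @ x = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - b[1] * a[0][1]) / det,
            (a[0][0] * b[1] - a[1][0] * b[0]) / det]

# Columns of R: simulated detector responses (4 channels) to 2 neutron energies.
R = [[0.50, 0.05], [0.30, 0.15], [0.15, 0.40], [0.05, 0.40]]
true_flux = [2.0, 5.0]
y = [row[0] * true_flux[0] + row[1] * true_flux[1] for row in R]  # noiseless spectrum

# Normal equations (R^T R) x = R^T y give the least-squares flux intensities.
RtR = [[sum(r[i] * r[j] for r in R) for j in range(2)] for i in range(2)]
Rty = [sum(r[i] * yi for r, yi in zip(R, y)) for i in range(2)]
flux = solve2(RtR, Rty)
```

    With noiseless data and linearly independent response columns the fluxes are recovered exactly; with measured spectra, noise and near-collinear responses are what motivate the wavelet treatment evaluated in the paper.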

  18. Radial Velocity Data Analysis with Compressed Sensing Techniques

    NASA Astrophysics Data System (ADS)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched for at once, and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has a similar appearance, but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analyses, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  19. Analysis techniques for background rejection at the Majorana Demonstrator

    SciTech Connect

    Cuestra, Clara; Rielage, Keith Robert; Elliott, Steven Ray; Xu, Wenqin; Goett, John Jerome III

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  20. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    NASA Astrophysics Data System (ADS)

    Cuesta, C.; Abgrall, N.; Arnquist, I. J.; Avignone, F. T.; Baldenegro-Barrera, C. X.; Barabash, A. S.; Bertrand, F. E.; Bradley, A. W.; Brudanin, V.; Busch, M.; Buuck, M.; Byram, D.; Caldwell, A. S.; Chan, Y.-D.; Christofferson, C. D.; Detwiler, J. A.; Efremenko, Yu.; Ejiri, H.; Elliott, S. R.; Galindo-Uribarri, A.; Gilliss, T.; Giovanetti, G. K.; Goett, J.; Green, M. P.; Gruszko, J.; Guinn, I. S.; Guiseppe, V. E.; Henning, R.; Hoppe, E. W.; Howard, S.; Howe, M. A.; Jasinski, B. R.; Keeter, K. J.; Kidd, M. F.; Konovalov, S. I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J.; MacMullin, J.; Martin, R. D.; Meijer, S. J.; Mertens, S.; Orrell, J. L.; O'Shaughnessy, C.; Poon, A. W. P.; Radford, D. C.; Rager, J.; Rielage, K.; Robertson, R. G. H.; Romero-Romero, E.; Shanks, B.; Shirchenko, M.; Snyder, N.; Suriano, A. M.; Tedeschi, D.; Trimble, J. E.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Wiseman, C.; Xu, W.; Yakushev, E.; Yu, C.-H.; Yumatov, V.; Zhitnikov, I.

    2015-08-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.